As a professional photographer, I’ve been blown away by how the iPhone 11 Pro has been able to replace my DSLR on photo shoots. But Apple just introduced the iPhone 12 lineup, which includes the iPhone 12 Pro Max, and the camera tech is even better this time around. The iPhone 12 Pro Max’s camera updates especially, both in hardware and software, already have me buzzing about going shooting with this thing. Here’s why I’m so excited.
More zoom with the 2.5x telephoto lens
I love the telephoto zoom lens on the iPhone 11 Pro, but at only 2x, it doesn’t always provide quite the reach I want, and I often find myself digitally zooming in further to get the exact composition. The iPhone 12 Pro Max takes that further, to 2.5x, which might not seem like a huge upgrade, but I think it will be noticeable for many shots.
Would I have liked 5x or 10x? Sure, I love the bigger zooms on phones like the Galaxy S20 Ultra, but because those lenses are so long, I don’t use them quite as often as the 2x on the iPhone. Maybe 3x would have been a good compromise, but I still think the 12 Pro Max’s upgrade will make a big difference to many of my photos.
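For readers who think in traditional focal lengths rather than zoom multipliers, the conversion is straightforward, assuming the iPhone’s 1x wide camera is roughly a 26mm full-frame equivalent (Apple’s published figure). The arithmetic below is my own, not Apple’s:

```python
# Convert zoom multipliers to 35mm-equivalent focal lengths,
# assuming the 1x wide camera is a 26mm equivalent (Apple's spec).
WIDE_EQUIVALENT_MM = 26

def equivalent_focal_length(zoom_factor):
    """35mm-equivalent focal length for a given zoom multiplier."""
    return WIDE_EQUIVALENT_MM * zoom_factor

for zoom in (2.0, 2.5, 3.0):
    print(f"{zoom}x is about {equivalent_focal_length(zoom):.0f}mm equivalent")
```

So the jump from 2x to 2.5x is roughly the difference between a 52mm and a 65mm lens: modest on paper, but enough to tighten a portrait composition without cropping.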
Bigger image sensor
Apple already manages to squeeze incredible image quality out of tiny phone camera sensors, and its great software allows for the awesome night mode shots we’ve already seen. But a 47% larger sensor captures more light, allowing for brighter shots with less noise and better dynamic range. It’s why my professional camera (a Canon 5D Mk IV) uses a much bigger full-frame sensor.
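To put that 47% figure in photographic terms: the light a sensor captures at a given exposure scales with its area, so a 47% larger sensor gathers about 1.47x the light, a bit over half a stop. This back-of-the-envelope calculation is my own, not Apple’s:

```python
import math

# Light captured scales with sensor area, all else (aperture,
# shutter speed) being equal.
area_gain = 1.47  # Apple's stated 47% increase in sensor area

light_gain = area_gain  # 1.47x the photons reach the sensor per shot

# Express that advantage in photographic stops (log base 2).
stops = math.log2(light_gain)
print(f"{light_gain:.2f}x more light, about {stops:.2f} stops")
```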
I’m very keen to see what difference this larger sensor makes, not just for my nighttime photos, but for capturing fine details in landscapes, or up close when taking macro images of flowers. A bigger sensor paired with Apple’s image processing software is likely to be a potent combination.
Improved, faster lens for better night mode
It’s not just the sensor that can capture more light — the lens itself can let in more light than before thanks to its wider, f/1.6 aperture. That number basically means the hole that light passes through is bigger than before, allowing more light through in the same amount of time. Together with the larger sensor, Apple reckons the 12 Pro Max delivers an 87% improvement in low-light imagery over the iPhone 11, which was already one of the best phones for low-light photography.
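The f-number math behind that claim is simple: the light a lens passes scales with the inverse square of its f-number. Assuming the iPhone 11 Pro’s wide lens was f/1.8 (Apple’s published spec for that model), the comparison works out like this:

```python
# Light transmitted by a lens scales with 1 / f_number**2.
# Compare the 12 Pro Max's f/1.6 wide lens to the iPhone 11 Pro's
# f/1.8 (per Apple's published specs).
old_f, new_f = 1.8, 1.6

gain = (old_f / new_f) ** 2
print(f"f/{new_f} lets in about {gain:.2f}x the light of f/{old_f}")
```

That’s roughly 27% more light from the lens alone; multiply it by the 47% larger sensor (1.27 x 1.47 is about 1.86) and you land very close to Apple’s 87% figure.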
The redesigned lens isn’t just about letting more light in, though. Apple also explained in its launch presentation that it’s improved the optical clarity of the lens, reducing the amount of image distortion, particularly at the edges of the image on the widest lens. All of which means better-looking, more professional images. Lovely stuff.
Apple ProRaw for more editing control
Many of the best images I’ve taken with the iPhone have been shot in raw, using third-party apps. Raw images don’t bake in processing decisions like white balance or sharpening, allowing for greater control when editing in mobile apps like Adobe Lightroom Mobile. However, the downside of shooting raw in third-party apps is that you don’t get to take advantage of the image processing Apple uses in its own camera app. Apple’s Deep Fusion processing, for example, is only something you get when shooting with the iPhone’s native camera.
To appeal more to pros, Apple has introduced Apple ProRaw in its camera app, which takes advantage of many of its image processing capabilities, but doesn’t permanently bake in data like white balance, allowing you to still make those changes in post production. Apple says it’s the best of both worlds, and on paper, I’m tempted to agree, but I’ll have to reserve my final judgment until I can not only shoot images in this new format, but also edit the images as well.
It’s worth noting that Google has done much the same already with its Pixel phones, with what it calls “computational raw,” which CNET senior editor Stephen Shankland calls “tremendous.” How the two compare remains to be seen.
HDR video and improved stabilization
It’s not just stills that have seen an improvement. The phone now offers HDR video recording with Dolby Vision at up to 60fps, which Apple says is a first on any device. In theory, this should help rein in bright highlights and lift dark shadows, just as HDR does when you take still images.
The optical image stabilization has also been improved, by moving the image sensor to counter movements and vibrations, rather than moving the heavier lens, as was the case before. How much difference this makes remains to be seen until I can take the phone out for a proper test, but with better image quality and better stabilization, I’m really excited about the sorts of videos I’ll be able to produce with this phone.