This story is part of CNET's collection of news, tips and advice around Apple's most popular product.
With the iPhone 14 Pro's new cameras, it's a good time to be a serious photographer trying to get the most out of a camera you can tuck in a pocket. Apple's newest smartphone, which packs a 48-megapixel sensor, delivers significant improvements in image quality.
I'm one of those people, shooting professionally some of the time and as a hobbyist the rest. I've been putting my iPhone 14 Pro through its paces, and after a week of shooting and pixel peeping, I'm impressed with the technology improvements. This phone's cameras are good for photo enthusiasts, not just for ordinary use.
Three highlights in the iPhone 14 Pro and Pro Max stand out: the main camera's 48-megapixel resolution; better image quality on the main and ultrawide angle cameras so photos look more natural; and the improved low-light performance on all three of the rear cameras. Mainstream folks should appreciate them, as my colleague Patrick Holland observes in his iPhone 14 Pro review and camera testing, but serious shooters can really benefit.
The ability to shoot better photos is one of the more obvious ways you can see advancements in the latest iPhone. You might not notice processor speeds or display quality improving from one year to the next, but camera quality shows progress more visibly. And competitively, with Samsung offering powerful 10x zoom lenses and Google pioneering computational photography, Apple has to work hard to keep iPhone fans loyal.
Fortunately, Apple has raised its game too. I've scrutinized hundreds of photos to compare my iPhone 14 Pro with the iPhone 13 Pro. Here's what I've learned.
The iPhone 14 Pro's 48-megapixel camera is great
My favorite improvement to the iPhone 14 Pro is the 48-megapixel sensor on the main camera, the one that gets the most use. I love diving into the details of each photo.
For most folks, the iPhone 14 Pro models will shoot at 12 megapixels, combining four pixels on the image sensor into one through a process called pixel binning. Because Apple increased the sensor size, image quality improves compared with 12-megapixel shots on earlier phones.
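For the curious, that four-pixels-into-one combination can be sketched in a few lines of Python. This is a toy model with made-up sensor values, not Apple's actual pipeline, which runs on the sensor hardware:

```python
# Toy illustration of 2x2 pixel binning: four neighboring sensor
# values are combined into one output pixel. (Hypothetical values;
# real binning happens on the sensor, before demosaicing.)

def bin_2x2(sensor):
    """Combine each 2x2 block of pixel values into a single value."""
    binned = []
    for r in range(0, len(sensor), 2):
        row = []
        for c in range(0, len(sensor[0]), 2):
            block = (sensor[r][c] + sensor[r][c + 1] +
                     sensor[r + 1][c] + sensor[r + 1][c + 1])
            row.append(block)  # summing boosts signal relative to noise
        binned.append(row)
    return binned

# A 4x4 "sensor" becomes a 2x2 image: a quarter of the pixel count.
sensor = [[1, 2, 3, 4],
          [5, 6, 7, 8],
          [9, 10, 11, 12],
          [13, 14, 15, 16]]
print(bin_2x2(sensor))  # [[14, 22], [46, 54]]
```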
But the more adventurous can shoot with all 48 megapixels. That quadruples pixel count and triples file sizes but gives you the flexibility to crop or rotate your photos without losing detail and resolution.
If you like viewing or printing your photos in large sizes, having 48 megapixels is great. At 240 pixels per inch, a common setting for high-quality prints, you can print 48-megapixel photos at a 25.2x33.6 inch size instead of 12.6x16.8 inches for 12 megapixels.
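That print-size math is just pixel dimensions divided by pixels per inch. A quick sketch, assuming the usual 4:3 dimensions of roughly 8064x6048 pixels at 48 megapixels and 4032x3024 at 12 (actual dimensions can vary slightly):

```python
# Maximum print size = pixel dimensions / pixels per inch.

def print_size(width_px, height_px, ppi=240):
    """Maximum print dimensions in inches at a given pixels-per-inch."""
    return (round(width_px / ppi, 1), round(height_px / ppi, 1))

print(print_size(8064, 6048))  # 48 MP -> (33.6, 25.2) inches
print(print_size(4032, 3024))  # 12 MP -> (16.8, 12.6) inches
```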
To take 48-megapixel shots, you must use Apple's ProRaw format, an option enabled through the camera app's format settings. Many serious photographers already prefer it for its advantages in editing: better flexibility with color, exposure and sharpening. ProRaw is a computational raw format, meaning it combines multiple frames into one photo and performs other tricks to squeeze as much image quality as possible out of a smartphone's relatively small sensor.
Low-light shooting is better on the iPhone 14 Pro
The bigger sensor on the iPhone 14 Pro's main camera improves shooting at nighttime or when conditions are dim. I compared a lot of low-light photos, many of them taken with Apple's night mode and some of them mounted on a tripod to eliminate problems from shaky hands.
The 14 Pro's main camera takes appreciably better shots, preserving more shadow detail and color than the 13 Pro. The dynamic range is also better, capturing a broader range between bright and dark. It's not what you'd get out of a full-frame SLR or mirrorless camera from the likes of Sony, Nikon or Canon, but it's impressive.
The comparison above shows the same nighttime shot, deliberately overexposed to reveal shortcomings in darker parts of the scene. The iPhone 13 Pro photo at left suffers from more noise, less detail, and worse color than the iPhone 14 Pro shot at right. Both were shot at 12-megapixel resolution with Apple's night mode.
The ultrawide camera gets a similar improvement from the 13 Pro to the 14 Pro, though its performance isn't as good as the main camera's. Nighttime shots peering into my house show less noise where it's dark and better detail and color everywhere.
If you edit your photos, that translates to more flexibility. You can boost shadows and ease blown-out highlights without introducing as many artifacts like noise speckles or posterization, where there's not enough data for smooth tonal gradations.
And the better dynamic range helps when it's bright, too, for example with bright skies that look more natural.
In very, very dark situations, I noticed the iPhone 14 Pro cranking up the image sensor ISO, a sensitivity setting. Lux, maker of the Halide camera app for iPhones, reports a top ISO of 12,768 for the iPhone 14 Pro's main camera compared with 7,616 on the 13 Pro.
The telephoto camera captures more detail
For several years, Apple has used an AI-based image analysis technique called Deep Fusion to preserve details and color in dim and dark lighting. In the latest iPhone 14 generation, Apple's Photonic Engine technology runs Deep Fusion earlier in the image processing pipeline to preserve texture and color better.
It works on all the cameras, but I appreciate it most on the telephoto lens that otherwise doesn't appear to have been changed from the 13 Pro to the 14 Pro.
In one shot of a houseplant I took in the evening, I could clearly see fine detail on the 14 Pro's shot that was absent with the 13 Pro. You can compare the two above.
For another photo of a dark oil painting I shot when it was dim, the iPhone 13 Pro chose to use its main camera and upscale the photo digitally, with predictably mushy results. The iPhone 14 Pro used its telephoto camera and captured vastly more detail, aided perhaps by the Photonic Engine and by improved image stabilization.
In dim conditions, though, I find the telephoto's autofocus to be just as unreliable on the 14 Pro as on the 13 Pro.
Apple didn't sacrifice image quality for 48 megapixels
Notably, Apple quadrupled the main camera's resolution while striking a good balance between pixel count and picture quality.
Increasing pixel count can require decreasing pixel size to fit them all on a sensor. The problem is that smaller pixels are worse when it comes to color, noise, and capturing a broad dynamic range between bright and dark areas of an image.
When using pixel binning to produce 12-megapixel images, the iPhone 14 Pro's pixels are effectively 65% larger than on the iPhone 13 Pro, and image quality improves accordingly. But while shooting at 48 megapixels, even though the pixels are 59% smaller than on the 13 Pro, they're still big enough to produce photos with marvelous detail and good color.
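Those percentages compare light-gathering area, which scales with the square of the pixel pitch. Here's the arithmetic, assuming the commonly reported pitches of 1.9 microns for the 13 Pro's pixels and 2.44 microns binned or 1.22 microns unbinned for the 14 Pro:

```python
# Pixel "size" comparisons are by light-gathering area (pitch squared).
# Pitches in micrometers are assumptions based on commonly reported specs.
PITCH_13_PRO = 1.9       # iPhone 13 Pro main camera
PITCH_14_BINNED = 2.44   # iPhone 14 Pro, four pixels binned into one
PITCH_14_SINGLE = 1.22   # iPhone 14 Pro, individual 48 MP pixel

def area_change_pct(new_pitch, old_pitch):
    """Percent change in pixel area going from old_pitch to new_pitch."""
    return round((new_pitch ** 2 / old_pitch ** 2 - 1) * 100)

print(area_change_pct(PITCH_14_BINNED, PITCH_13_PRO))  # 65, i.e. 65% larger
print(area_change_pct(PITCH_14_SINGLE, PITCH_13_PRO))  # -59, i.e. 59% smaller
```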
The small pixels would be a problem when conditions are dim. But when shooting in night mode or using a flash, the iPhone 14 Pro sidesteps the problem by shooting only 12-megapixel photos.
The 2x telephoto camera is a cool trick
A clever trick with the 48-megapixel sensor is using just the central quarter of the image to take 12-megapixel shots with a narrower field of view. Apple shows this option as a 2x camera in the camera app.
It's a good idea because 2x zoom is often nicer than the more dramatic 3x telephoto camera for midrange subjects. It works with video too.
Optics nerds will rightly point out that the lens properties haven't changed, which means you get some wide-angle issues like a deeper depth of field that makes it harder to isolate portrait subjects from their backgrounds. Whatever. It's still a useful option, and the image quality is good enough when it's not dim.
The main camera has a usefully wider view
Apple broadened the main camera's field of view from an equivalent of 26mm focal length to a wider angle 24mm. Especially given the option to shoot at 48 megapixels, I think that's justified.
Many of us shoot indoors, where it's impractical to back up far enough to fit everybody in a group shot, so the wider field of view is welcome. You can always shoot with the ultrawide camera, but its worse image quality is really apparent when it's dim.
Shooting in 48 megapixels is slower
One downside to the high-resolution photos: It typically takes more than a second to take a 48-megapixel shot. After you tap the shutter button, it snaps a photo and the phone churns away for a moment before the shutter button becomes available again.
In contrast, taking the same scene at 12 megapixels is much snappier. If you're shooting fast-moving subjects, stick with the lower resolution.
I'd like an on-screen setting to switch between 12 and 48 megapixels so I don't have to dig into the camera app's format settings to change it. Lux's Halide app, updated for the iPhone 14 Pro, includes a toggle to switch between 12 and 48 megapixels. Even without a switch, I'd like to know which resolution I'm using so I don't bungle important shots.
But I understand Apple's preference for a simpler interface, uncluttered with buttons photographers might regret accidentally pressing. I'll mostly shoot at 48 megapixels.
Apple dials back the oversharpening
One of my longstanding complaints about iPhone photos is that Apple sharpens images too much, applying the algorithm that emphasizes contrasting edges too aggressively. The result is a crispy look that's artificial and distracting to my eye. Indeed, one reason I shoot ProRaw is so I can choose a lower level of sharpening.
I'm happy to report that Apple has eased back. Adobe Lightroom, the software I use to edit and catalog photos, shows sharpening set to 40 instead of the 50 Apple has used for years. I often dial it back further still, to 20 or 30, to produce a more natural, less digitally processed look.
Your preferences may vary, of course. Adobe points out that people like more sharpening, contrast and color saturation when they're looking at photos on smaller screens. Even if Apple is optimizing more for phone viewing, I still prefer sharpening that looks more natural to me.
Those 48-megapixel photos take up more space
If you're pondering how much storage to buy with a new iPhone, factor in that 48-megapixel images take up roughly three times as much space.
Apple says the 12-megapixel shots are about 25MB and 48-megapixel shots are about 75MB. That varies depending on whether you're shooting simple or complex scenes. The biggest sizes I found were, for one scene busy with lots of leaves, 43MB for 12 megapixels and 125MB for 48 megapixels. Both shots were in ProRaw and framed the same with a tripod.
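If you're sizing up storage tiers, here's a rough back-of-envelope count using the approximate file sizes above (and ignoring the space the OS and your apps consume):

```python
# Back-of-envelope storage planning using the approximate ProRaw file
# sizes quoted above: about 25 MB at 12 MP and 75 MB at 48 MP.

def shots_per_storage(storage_gb, mb_per_shot):
    """Rough count of photos that fit, using decimal GB (1 GB = 1000 MB)."""
    return (storage_gb * 1000) // mb_per_shot

for tier in (128, 256, 512):
    print(tier, "GB:",
          shots_per_storage(tier, 75), "shots at 48 MP,",
          shots_per_storage(tier, 25), "shots at 12 MP")
```

On a 128GB phone, for instance, that works out to roughly 1,700 48-megapixel ProRaw shots before anything else takes up space.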
With four times the pixels, editing is slower, too. Even on my fairly fast M1 Max MacBook Pro, it takes a few seconds to render the 48-megapixel ProRaw images.
And if you're shooting ProRes video, another high-end format from Apple, factor in even more space. One 18-minute video I just shot in ProRes gobbled up 27GB of storage space.
The iPhone 14 Pro's camera bump is huge
You can't get a big sensor into a phone without using a big lens, and the price you pay for the iPhone 14 Pro's better image quality is a chunkier phone. I carry a DSLR around to hikes, birthday parties and conferences, so you won't be surprised to hear I'm fine with Apple's choice.
Ryan Jones has tracked iPhone camera thickness over the years, and the 14 Pro has the biggest optics package yet. It protrudes 4.18mm beyond the rest of the 7.85mm thick iPhone 14 Pro body. In comparison, the iPhone 13 Pro's body was about the same at 7.65mm thick, but the cameras protruded 3.6mm.
Apple styles its three cameras as distinct protruding barrels, like the lenses on traditional cameras. It's a pain cleaning the pocket lint out from between the barrels, but filling in the space between them would add more weight to an already hefty phone. And the protruding cylinders convey a message of photographic seriousness in the same way hulking telephoto lenses on traditional cameras do.
Just try not to let the camera size go to your head.