If you were to ask people what they want in a new smartphone, they would probably say they want it to be fast, have a long-lasting battery, be nice and durable, and have a really awesome camera.
It just so happens that’s what you get with the iPhone 11 Pro. It’s also what you get with the iPhone 11, which costs $300 to $400 less than the Pro. The Pro is certainly an upgrade, but it doesn’t do a lot to justify its “Pro” moniker or very “Pro” price.
Still, the iPhone 11 Pro is a tremendous iPhone, with some noticeable (and not-so-noticeable) improvements over last year’s iPhone XS. As good as it is at what it does, it’s hard not to think Apple could have done more to justify the big price gap over the iPhone 11 and the “Pro” name that comes with it.
Note: This review refers to the iPhone 11 Pro as a single entity, though we tested both the iPhone 11 Pro and iPhone 11 Pro Max. The only difference between the two phones is that the Max is larger and has slightly longer battery life. It’s best to think of the iPhone 11 Pro as a single product that comes in two sizes.
It’s all about that camera
With each new iPhone, the camera gets better. It’s often the thing people notice and care about more than anything else. That’s true this year more than most—while the iPhone 11 Pro has other improvements over previous iPhones, that camera grabs people’s attention.
Apple’s high-end iPhones have had a wide and telephoto camera duo on the rear for a couple of years now. The iPhone 11 Pro adds a third, ultra wide, camera. It is, in a word, fun.
Landscape photographers will enjoy it, and you can capture taller panorama photos, but I think even average, everyday users will find themselves using the ultra wide quite often. You can fit more people in a shot without backing up, or capture that big statue or sculpture without having to stand so far away that people walk in front of you. The distorted perspective of a wide lens makes subjects look larger, which can create a real sense of scale. If a telephoto lens makes things intimate, a really wide-angle lens makes them expansive.
But the iPhone 11, the “non-Pro” model, has this same camera. It’s the telephoto camera that distinguishes Pro from non-Pro, and honestly, it’s just not that big a deal. I found it far more useful to zoom out than to zoom in. The telephoto camera is better now, with a wider f/2.0 aperture that lets in a lot more light than the f/2.4 telephoto camera in the iPhone X and XS. You’ll get better shots in poor light and a nicer natural bokeh.
The telephoto lens allowed the iPhone XS to do something the iPhone XR could not: shoot portrait mode photos of any subject, not just people. Now that both new models have an ultra wide lens, they both gain that capability. It’s nice to be able to take portrait mode photos with the standard wide-angle lens (which you couldn’t on the iPhone X or XS), but it’s just one more way the Pro fails to distinguish itself from the standard model.
Better sensors and a much more powerful A13 Bionic processor combine to produce much better photos than the iPhone XS, which was already one of the best cameras on a smartphone. Detail and dynamic range are improved, and color accuracy is really on point—this phone produces some of the most true-to-life colors of any smartphone I’ve seen, while even the best Android phones sometimes get a little aggressive about making colors “pop” and sharpening things up with a post-processing filter.
The selfie camera is now 12 megapixels instead of 7, with a field of view 15 degrees wider (85 degrees instead of the 70 on previous iPhones). Both changes make a significant difference. You’ll get clearer, sharper shots in more conditions, and group selfies are easier than ever. I appreciate that it’s not so wide as to be unnaturally distorting. It’s not the front-facing equivalent of the ultra wide camera on the back.
This wider front camera is supposed to enable a wider field of view for Face ID, too. While Face ID is lightning fast on the iPhone 11 Pro, the expanded field of view is hardly noticeable. You still have to do that awkward lean if it’s flat on your desk.
When you hold the phone upright, it automatically crops down to the old 70-degree view; you can make it wider again with a tap, and rotating to landscape automatically switches to the wider angle (likewise, you can narrow it with a tap). The front camera can record 4K 60fps video now, and even slow-motion video (which Apple insists on calling “slofies,” and which will probably have its 15 minutes of fame and then rarely be used again).
Finally, there’s Night mode. When this feature debuted on the Google Pixel and spread to other Android phones, iPhone users were understandably upset that they didn’t have it. Now Apple has its own version of Night mode, and it’s done in a very Apple way.
When it’s dark enough to warrant its use, Night mode automatically engages and asks you to hold your phone still for one to three seconds while the screen brightens, as if developing a photo. The resulting photos are often full of grain and noise, but the same shots without Night mode are even noisier, and often so dark you can’t see anything at all. Night mode shots are much brighter and more colorful, but not unnaturally so, as we’ve seen from many Android phones—it doesn’t turn night into day, it just captures a shot that looks like what your eye might see at night.
Get ready for social media to be bombarded by night mode shots, because it really is a fantastic feature that makes impossible shots possible.
[See how Apple’s Night mode compares to the top Android phones.]
It’s also entirely automatic, instead of a separate mode. And while you can disable it, you unfortunately can’t force it on. That feels like an oversight; I’ve already run into several situations where I think Night mode would have helped but it was just bright enough not to engage. Night mode doesn’t work on the ultra wide camera for some unknown reason, and it doesn’t work on any previous iPhone—I think Apple could certainly solve both shortcomings, if it wanted to.
As good as the iPhone 11’s camera is, it’s about to get better. An upcoming “Deep Fusion” technology will take the computational photography capabilities of the camera to a whole new level. Apple promises some of the sharpest, most accurate shots we’ve ever seen, a dramatic improvement over the current photo processing, which is already quite impressive. We don’t know exactly when this camera upgrade is coming, only that it is scheduled for this fall.
The best smartphone for video gets even better
The iPhone XS had plenty of competition for still photo quality, but its overall video quality was top of the heap. Since then, the best Android phones have perhaps taken the crown, but the improvements in the iPhone 11 should be enough to steal it back.
You can shoot up to 4K resolution at 60 frames per second on the rear cameras while still benefiting from extended dynamic range and image stabilization—features that were only available up to 4K at 30fps on last year’s models. Add in the ability to smoothly zoom from ultra wide to telephoto and you’ve got a powerfully capable video device.
You can see slight shifts when transitioning between the ultra wide, wide, and telephoto lenses, and each one has slightly different quality characteristics—they are different sensors with different lenses, after all. But Apple has done a very impressive job of matching color and exposure, creating the smoothest camera-switching transitions I’ve ever seen.
Stealing a page from some of the latest Android phones, Apple has a new Audio Zoom feature. When you zoom in past 1x while recording video, background noise will be diminished and the audio will focus on your subject. It’s a really noticeable effect, especially when standing in a noisy environment like next to a street or a fountain, but it’s not so aggressive as to sound completely unnatural.
A better camera interface, too
The camera interface has been given some really thoughtful improvements.
Tap and hold the shutter button and you’ll start recording a video rather than taking a series of photos in a burst. Swipe in one direction to lock video recording on; swipe the other way to take that burst shot. I’ve needed to take a spontaneous video far more often than burst photos, and I think most users will appreciate this change.
The “dial” interface for smoothly zooming in and out is a much easier way to get precisely the right shot than pinch-to-zoom, too.
Now, features like changing the aspect ratio, choosing filters, and setting a timer are accessed in a features bar that appears when you swipe up on the camera modes and disappears when you swipe down. It’s a good place for this stuff, and leaves plenty of room for future expansion without cluttering the interface.
It would be really nice to change video resolution and frame rate in the camera interface instead of the Settings app, but Apple still hasn’t gotten the memo on that.
When you’re shooting with the telephoto or wide-angle lens, the darkened bars on the sides of the main viewfinder show what would be captured by the next-widest lens. It’s a nice way to quickly assess whether you should take a wider shot.
Frustratingly, all of these improvements, save the wider-angle preview, could easily be brought to prior generation iPhones. They should have been part of iOS 13, not exclusive to the iPhone 11.
The fastest phone money can buy
Apple’s A-series processors are second to none. The A12 Bionic in last year’s iPhone XR and XS was the overall fastest mobile CPU on any smartphone, and had nearly the fastest GPU. This year, Apple says it has made significant improvements in the A13.
The new A13 Bionic uses TSMC’s second-generation 7nm process, which improves energy efficiency and allows for higher clock speeds. The results are impressive. According to Apple, pretty much every part of the A13 is 20 percent faster than before: the CPU (both the high-power and high-efficiency cores), the GPU, and the Neural Engine. In addition, there are new machine learning accelerators in the CPU—separate from the Neural Engine—that perform matrix multiplication operations six times faster.
Power efficiency is improved, too. Apple says the Neural Engine uses 15 percent less power, the big CPU cores use 30 percent less, and the little high-efficiency CPU cores and the GPU each use 40 percent less. There’s a catch, though: Apple says these power improvements apply “for those applications and tasks that don’t need more performance than the A12.” In other words, the GPU uses 40 percent less power when running at the same speed as the A12’s GPU—when it clocks up to run 20 percent faster, that power savings is reduced or lost.
No matter how you slice it, this is a crazy-fast and very efficient mobile processor. Benchmarks aren’t the be-all and end-all of performance measurement, but they’re a good way to run the same exact tasks in the same exact way on different hardware. So let’s take a look at a few.
In the new Geekbench 5 test, the single-core performance of the A13 is about 20 percent higher than the A12’s, which was in turn 20 percent faster than the A11’s. Where multi-core performance took a 16 percent leap from the A11 to the A12, this year it jumps 30 percent in the A13. Compute performance keeps climbing by about 40 percent year over year.
No other phone is even close to these numbers.
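Those generational gains compound, which is easy to miss when each is quoted in isolation. As a quick back-of-the-envelope check, here is the arithmetic using the percentages cited above, with the A11 as a 1.0 baseline:

```python
# Compounding the year-over-year Geekbench gains cited in this review.
# Baseline (1.0) is the A11; each factor is one generational improvement.

single_core_a12 = 1.0 * 1.20               # A12: ~20% faster than A11
single_core_a13 = single_core_a12 * 1.20   # A13: ~20% faster than A12
print(f"A13 single-core vs A11: {single_core_a13:.2f}x")  # ~1.44x

multi_core_a12 = 1.0 * 1.16                # A12: 16% leap over A11
multi_core_a13 = multi_core_a12 * 1.30     # A13: 30% jump over A12
print(f"A13 multi-core vs A11: {multi_core_a13:.2f}x")    # ~1.51x
```

In other words, in just two generations, single-core performance is up roughly 44 percent and multi-core performance roughly 51 percent.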
3D graphics performance takes a big leap in the iPhone 11 (perfect for those Apple Arcade games). In the 3DMark Sling Shot tests, the A12 didn’t improve performance over the A11—we theorized that it might be hitting a memory bandwidth bottleneck. Whatever the reason, the A13 is now much faster in this strenuous test—we’re talking 50 to 60 percent!
In the older Ice Storm Unlimited test, which is a better representation of simpler 3D games, the A12 was 17 percent faster than the A11, and the A13 is almost 30 percent faster than the A12.