What’s in a name? When Apple introduced its 2019 iPhone models, the addition of the word ‘Pro’ made us frown. On paper, the iPhone 11 Pro is a logical evolution of its predecessors, the iPhone X and iPhone XS. We’ve been using the iPhone 11 Pro for almost three months now, and it’s time to take stock: is the iPhone 11 Pro indeed a phone for professionals?
At this point, we’ll assume you’re familiar with the form factor Apple introduced with the iPhone X and iPhone XS, as it’s been reused for the iPhone 11 Pro. The same goes for the iPhone XS Max, which has been replaced by the iPhone 11 Pro Max. The main selling point of this line, especially compared to the 2018 iPhone XR and this year’s iPhone 11, has always been the magnificent OLED screen. Let’s put all these phones in an overview for easy comparison:
| | iPhone X | iPhone XR | iPhone XS |
| --- | --- | --- | --- |
| Screen | 5.8” OLED | 6.1” LCD | 5.8” OLED |
| SoC | Apple A11 Bionic | Apple A12 Bionic | Apple A12 Bionic |
| Memory | 3 GB | 3 GB | 4 GB |
| Storage | 64 / 256 GB | 64 / 128 / 256 GB | 64 / 256 / 512 GB |
| Rear cameras | 12MP f/1.8, 12MP f/2.4 tele | 12MP f/1.8 | 12MP f/1.8, 12MP f/2.4 tele |
| Front camera | 7MP f/2.2 | 7MP f/2.2 | 7MP f/2.2 |
| Dolby Digital Plus | No | No | No |
| | iPhone XS Max | iPhone 11 | iPhone 11 Pro | iPhone 11 Pro Max |
| --- | --- | --- | --- | --- |
| Screen | 6.5” OLED | 6.1” LCD | 5.8” OLED | 6.5” OLED |
| SoC | Apple A12 Bionic | Apple A13 Bionic | Apple A13 Bionic | Apple A13 Bionic |
| Memory | 4 GB | 4 GB | 4 GB | 4 GB |
| Storage | 64 / 256 / 512 GB | 64 / 128 / 256 GB | 64 / 256 / 512 GB | 64 / 256 / 512 GB |
| Rear cameras | 12MP f/1.8, 12MP f/2.4 tele | 12MP f/1.8, 12MP f/2.4 ultrawide | 12MP f/1.8, 12MP f/2.0 tele, 12MP f/2.4 ultrawide | 12MP f/1.8, 12MP f/2.0 tele, 12MP f/2.4 ultrawide |
| Front camera | 7MP f/2.2 | 12MP f/2.2 | 12MP f/2.2 | 12MP f/2.2 |
| Dolby Digital Plus | No | Yes | Yes | Yes |
If we look closely at these specifications, we don’t see a massive jump from the XS to the 11 Pro. There’s a newer-generation processor, the A13 Bionic; a third rear camera with ultrawide angle capabilities; and support for Dolby Atmos/Dolby Digital Plus. Do these improvements deserve the ‘Pro’ moniker?
The specification table only tells us half the story, because there actually are some big improvements under the hood. Most of them involve the camera and the processor, or rather, machine learning. The A13 Bionic is a serious improvement over the A12 Bionic – not in general CPU processing power, as we only see a modest performance increase here, but definitely on the GPU side (improvements of up to 40% over the previous generation), in power efficiency, and in a much faster version of what Apple calls the Neural Engine.
The Neural Engine is a set of ‘machine learning accelerators’ that can calculate specific instructions up to six times faster than the CPU can. These instructions are quite useful when working with artificial intelligence or computational photography. To give you some real-world examples: the iPhone 11 Pro is massively faster in object recognition in photos, which comes in handy when you’re looking for certain photos in your Camera Roll. It’s massively faster when calculating depth information when you shoot in Portrait Mode, allowing for new features (such as adjusting the bokeh afterwards) and new creative modes. It’s massively faster when using augmented reality, allowing for precise positioning and life-like modelling in games and when looking at 3D models of products. It even allows for great computational photography tricks like Night Mode and Deep Fusion – more on that later.
Are all these improvements noticeable from a user perspective? They are. When opening apps and swiping through information, the interface feels just as snappy and responsive as on the iPhone XS. But you’ll definitely notice a difference when shooting photos, or when opening resource-intensive apps and games: you rarely have to wait for the phone to finish a task, something that noticeably sets it apart from the XS.
Night Mode and Deep Fusion
If you look up survey results on what consumers actually use their smartphones for, photography is usually in the top three. It’s no surprise, then, that Apple’s biggest visible improvement to the iPhone 11 Pro is the addition of a third camera, enabling ultrawide angle photography. This is not a new development: plenty of competitors already feature an ultrawide camera in their phones, such as the Samsung Galaxy S10 series or the Huawei P30/Mate 30 series.
Even with the distorted fisheye effect that comes with ultrawide lenses, it’s much easier to capture landscapes, or to make sure everyone is included in the picture when shooting in smaller rooms. The ultrawide camera is mainly a creative tool. However, the sensor coupled to this ultrawide lens is decidedly underwhelming: images are grainy, and the f/2.4 aperture doesn’t let in much light, especially compared to the other two rear cameras. Shooting ultrawide photos in less-than-ideal circumstances feels like being thrown back five years in terms of image quality.
But that’s just the visible part of Apple’s camera improvements. Much more interesting are Night Mode and Deep Fusion. Night Mode is the easiest to sell to consumers: thanks to computational photography, the iPhone 11 Pro doesn’t take a single photo when it’s dark, but rather a series of photos at different exposures. These are then combined into one surprisingly sharp and bright photo that keeps the best parts of each shot. The results are stunning. Whether you’re in a dimly lit restaurant or walking around on a dark street, Night Mode delivers usable photos – especially if you manage to keep your phone as steady as possible during shooting. Handheld shooting generally requires you to keep your phone steady for 2 to 3 seconds; with a tripod, you could theoretically extend this to 30 seconds. The best thing is that Night Mode is selected automatically by the Camera app, but it can also be disabled or manually adjusted.
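The multi-exposure idea behind Night Mode can be sketched in a few lines. The toy merge below is our own simplification in plain NumPy, not Apple’s actual pipeline: it normalises each frame by its exposure time and blends the frames with weights that favour well-exposed pixels. A real implementation adds frame alignment, denoising, and tone mapping on top.

```python
import numpy as np

def merge_exposures(frames, exposure_times):
    """Blend a bracketed burst (H x W arrays with values in 0..1) into one
    image, weighting well-exposed pixels more heavily."""
    acc = np.zeros_like(frames[0], dtype=np.float64)
    weight_sum = np.zeros_like(acc)
    for frame, t in zip(frames, exposure_times):
        # Pixels near mid-grey get the highest weight; crushed shadows
        # and blown highlights contribute less.
        w = 1.0 - np.abs(frame - 0.5) * 2.0 + 1e-6
        # Normalise brightness by exposure time before blending.
        acc += w * (frame / t)
        weight_sum += w
    return acc / weight_sum

# Three simulated exposures of the same dim, flat-grey scene:
rng = np.random.default_rng(0)
scene = np.full((4, 4), 0.2)
times = [1.0, 2.0, 4.0]
frames = [np.clip(scene * t + rng.normal(0, 0.05, scene.shape), 0, 1)
          for t in times]
merged = merge_exposures(frames, times)
```

Because the noisy frames average out, `merged` lands much closer to the true scene brightness than any single short exposure would.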
Deep Fusion is harder to explain. Introduced in iOS 13.2, the idea is that when you press the shutter, the Neural Engine actually stitches together nine photos to create a sharper, higher-resolution photo. Intriguingly, eight of these photos are shot before you’ve even pressed the shutter – the ninth, the one you think you’re taking, is actually a photo with a longer exposure. At first glance, there’s not much difference between photos shot with and without Deep Fusion, but once you start zooming in on highly detailed elements (like animal fur), you’ll definitely notice a quality difference. Deep Fusion happens invisibly to the user and can only be enabled or disabled with a workaround, but the main takeaway is that the Neural Engine does its best to deliver stunning photos without the user noticing any of the ‘magic’.
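The ‘keep the best parts of nine frames’ idea can be illustrated with a similarly simplified sketch. The code below is again our own toy example, nothing like Apple’s Neural Engine pipeline: for each pixel it keeps the value from whichever frame in the burst is locally sharpest, using a crude gradient magnitude as the sharpness measure.

```python
import numpy as np

def local_sharpness(frame):
    # Cheap sharpness proxy: magnitude of horizontal + vertical gradients.
    gx = np.abs(np.diff(frame, axis=1, prepend=frame[:, :1]))
    gy = np.abs(np.diff(frame, axis=0, prepend=frame[:1, :]))
    return gx + gy

def fuse_sharpest(frames):
    """Per pixel, pick the value from the locally sharpest frame."""
    stack = np.stack(frames)                             # (N, H, W)
    sharpness = np.stack([local_sharpness(f) for f in frames])
    best = np.argmax(sharpness, axis=0)                  # per-pixel winner
    h, w = frames[0].shape
    return stack[best, np.arange(h)[:, None], np.arange(w)[None, :]]
```

Feed it a blurry frame and a detailed one of the same scene, and the fused result keeps the detailed frame’s pixels wherever they carry more texture.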