When you use the built-in Camera app, the iPhone 7 Plus doesn’t behave as though it has two cameras and two lenses. Instead, it presents a single virtual camera that combines the input from both physical cameras. You can test this yourself by covering one of the lenses with a finger and watching what happens.
Tests by experts in the field suggest that iOS makes these decisions for you, prioritizing good lighting and minimal blur over the highest available resolution. The goal is to give you a photo you like, so the device would rather produce a good-looking digitally zoomed shot than an objectively bad one.
In practice, the difference is not that striking; you will only notice it on a 4K monitor or if you enlarge the image. In other words, when it judges the trade-off worthwhile, the device prefers to upscale clean pixels rather than capture noisy ones.
This also explains why Apple made the mostly unfiltered sensor data available to photo-editing apps and camera developers. Professionals will never accept digital zoom as a good option, and third-party apps can expose whichever raw data they choose.
We learned more during Apple’s September 7 keynote address, though the discussion wasn’t as in-depth as many photography enthusiasts would have liked. There is still plenty to explore in combining multiple exposures, multiple lenses, or both with computational photography to produce synthesized images and video.
All in all, it seems Apple will have to keep making changes to photo quality in the future.