You are not alone if you have been perplexed by the photos your iPhone 14 Pro captures. YouTuber Marques Brownlee (MKBHD) claims to have figured out what’s wrong.
The latest round of Brownlee’s annual smartphone camera awards showed Apple’s flagship stuck in the middle of the pack, as is often the case. Still, you could be forgiven for expecting more from the brand-new 48-megapixel camera.
So, what’s going on here?
When it comes to smartphone cameras, software matters more than anything else.
In a new YouTube video analysing the iPhone 14 Pro’s camera performance, Brownlee points to Google’s Pixel range of smartphones as an example of where Apple needs to improve.
You should watch the whole thing to get the full picture, but Brownlee notes that Google traditionally used the same camera sensor across most of its Pixel devices, relying on software to make photographs look fantastic. Despite a few hiccups in video performance, the combination of a familiar sensor and progressively better software produced some genuinely stunning still images.
When Google switched to a new 50-megapixel sensor, though, the results were unexpected. The software was overcompensating, producing phoney, overprocessed pictures. The same issue now plagues Apple.
For many years, every iPhone used the same 12-megapixel camera sensor, with Apple’s own software layered on top to iron out any problems. Like Google, Apple was able to take fantastic pictures while steadily refining that software.
The iPhone 14 Pro, however, changed that. Apple now uses a 48-megapixel sensor, a significant upgrade over previous models. But Brownlee thinks Apple’s software is doing too much: it still applies the same heavy processing it needed for the 12-megapixel sensor, even though the new sensor no longer requires it.
And the end result? Photos that look fake: artificial and unnatural.
Where do we go from here? The good news is that this is a software problem, and Apple can probably fix it in an update. Apple simply needs to dial back its processing and let the new sensor do more of the heavy lifting.
Will the iPhone 14 Pro get such a fix? Time will tell. Will a software update improve the camera on Apple’s best iPhone, or will we have to wait for the iPhone 15 Pro?