If you’ve been looking at the photos your iPhone 14 Pro is taking and wondering what is going on with them, you aren’t alone. Something’s amiss, and YouTuber Marques Brownlee (MKBHD) thinks he has the answer.
Brownlee recently ran his annual smartphone camera awards to see which device people think is taking the best photos and, as is often the case, Apple’s flagship found itself floundering in the middle of the pack. But given the new 48-megapixel camera, you’d be forgiven for expecting better.
So what’s going on?
It’s all about the software
In a new YouTube video looking at the iPhone 14 Pro’s camera performance, Brownlee points to the Google Pixel lineup as one example of where Apple needs to do some work.
You should definitely watch the full video for the details, but Brownlee points out that Google historically relied on the same camera sensor across most of its Pixel devices — using software to then tweak photos to look pretty great. The combination of the same sensor and gradually improving software made for some awesome photos — even if the video performance sometimes left a little to be desired.
But Google found that when it changed the sensor for a new 50-megapixel one, things didn’t look right. The software was doing too much work, creating an image that appeared artificial and overworked. And Apple now has the same problem.
For years, Apple’s iPhones all used the same 12-megapixel camera sensor with Apple layering its own software on top to work out any kinks. Just like Google, that allowed Apple to iterate and refine that software, taking great photos along the way.
But things changed with the iPhone 14 Pro. Now, Apple’s using a much higher resolution sensor — a 48-megapixel one, no less. But Brownlee believes that Apple’s software is going overboard, processing images just as aggressively as it did with that 12-megapixel sensor when in reality it doesn’t need to anymore.
The result? An artificial-looking photo. One that appears over-processed and just…off.
So what happens next? It’s all down to Apple and, in all likelihood, this will all get fixed in software. Apple just needs to dial things back a bit to let that new sensor do the work, not its software.
Whether that’ll come to the iPhone 14 Pro or not, we’ll have to wait and see. Will Apple’s best iPhone get a camera software update or will we all have to buy a new iPhone 15 Pro to see the fruits of Apple’s labor?