
iPhone 14’s camera may lose to Pixel 7 in photo quality, iPhone 13’s shaky camera performance shows

Written by admin

So I recently got an iPhone 13, which I’ve been using for a few weeks now, and I can’t help but notice that Apple’s phone is… lagging behind the competition in photography. Sometimes… far behind.

As a Pixel 6 Pro user who likes to keep up to date with the latest smartphone camera technology by reading and watching too many articles and videos, I’m not impressed by the iPhone 13’s camera. And that’s unfortunate for three main reasons:

  • iPhone is the most popular “camera” in the world, so it would be nice if it were as good as possible
  • iPhone used to take the most natural (if not the best) photos of any phone on the market
  • iPhone makes the best videos of any phone on the market, which means Apple is already halfway there (to providing the best camera on a phone)

On the first point, the popularity of the iPhone as a camera comes with the smartphone revolution itself, but also because Apple makes only four iPhone models a year, with only two camera systems shared between them. And of course, the iPhone simply sells better than any other Android flagship in the world, and even better than many budget phones. There is not much to discuss about video recording: Apple has dominated this part of the camera experience for years now. You can go back and see our 2011 iPhone 4S vs Galaxy SII comparison, then come back and check out our iPhone 13 vs Galaxy S22 camera comparison, and you will see that the gap has always been there, and still is…

And when it comes to “keeping it natural,” iPhones used to be known for having the “most realistic” smartphone camera, with natural colors, balanced sharpness, and very high dynamic range (HDR). However, it turns out that may not be the case anymore. Or at least not compared to the competition.

So let’s talk about the iPhone’s problems in photography and see if there’s any chance iPhone 14 will bring Apple back to its glory days in this area of user experience, which is now more important to customers than ever.

iPhone lags behind the competition in photography: Apple’s old and new camera problems

The photos of the iPhone 13 lack detail compared to rivals

Starting with something very obvious, yet worth mentioning: I can’t avoid discussing the relative lack of detail in the photos of the iPhone 13 compared to the Pixel 6 Pro, the phone I used for this comparison.

Apple’s iPhone 13 tries hard to overcompensate for the lack of megapixels, and it shows in the over-sharpened branches and leaves (if you zoom in), as well as in the towel shot (which you’ll find further down the page).

On the other hand, the Pixel 6 Pro uses a 50MP sensor for its primary camera, which is there to collect all the detail it can. The output file is still a 12MP image, technically similar to the iPhone 13’s, but with added detail.

Google’s pixel-binned images, combined with the company’s clever software improvements, make for a much more detailed picture. The important thing to note here is that we’re not talking about artificial sharpening, as seen on the iPhone, but real detail, reminiscent of a DSLR shot. The Pixel has that. The iPhone doesn’t.
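For readers who aren’t familiar with pixel binning, here’s a minimal toy sketch of the idea in Python/NumPy (my own illustration, not Google’s actual pipeline): several neighboring sensor pixels are averaged into one output pixel, which is how a roughly 50MP readout can become a cleaner, roughly 12MP file.

```python
import numpy as np

def bin_2x2(raw: np.ndarray) -> np.ndarray:
    """Average each 2x2 block of sensor pixels into one output pixel.

    Toy illustration of pixel binning: a high-resolution readout is
    reduced to a quarter of the pixel count, trading resolution for
    better per-pixel light gathering and lower noise.
    """
    h, w = raw.shape
    # Crop to even dimensions so the frame splits cleanly into 2x2 blocks.
    raw = raw[: h - h % 2, : w - w % 2]
    return raw.reshape(raw.shape[0] // 2, 2, raw.shape[1] // 2, 2).mean(axis=(1, 3))

# Simulated noisy 8x8 "sensor" readout around a flat gray value.
rng = np.random.default_rng(0)
sensor = rng.normal(loc=100.0, scale=10.0, size=(8, 8))
binned = bin_2x2(sensor)

print(sensor.shape, "->", binned.shape)  # (8, 8) -> (4, 4)
# Averaging four pixels roughly halves the noise (standard deviation).
print(round(sensor.std(), 1), "->", round(binned.std(), 1))
```

In a real phone the binning happens on the sensor itself and is combined with demosaicing and multi-frame processing, but the resolution-for-noise trade-off is the same.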

iPhone 13 often has problems capturing accurate colors and white balance

The other area the iPhone 13 struggles with is colors, white balance, and skin tones, but let’s take it step by step…

If you’re anything like me, you’d remember the “glory days” of the iPhone camera, when Apple’s device captured the most accurate and natural photos. However, Cupertino had to give in to the demand for “Instagram-ready photos,” and it shows.

Ever since the iPhone 11, and especially the iPhone 12 and 13 series, Apple has been pushing the saturation slider a little too far for my taste, almost to the point where photos taken with a Galaxy could look more realistic than those taken on the iPhone. That’s quite ironic, because five years ago it was the other way around. For the record, the iPhone XS was probably the last iPhone to have natural-looking images as a core priority.

But that’s not all…

iPhone 13 photos can make things and people look yellow or blue

Aside from the over-saturation, there’s a noticeable yellowish tint to Apple’s standard image processing that may appeal to some. You could argue that it makes certain images more vivid, but it also makes them undeniably less accurate for the scene, and that’s not very “pro”.

But things quickly get really bizarre when you discover that the iPhone 13 can also exaggerate blue highlights, making walls, buildings, and even people’s faces appear… blue. It happens when the scene includes a blue light source or a clear blue sky, and it appears to be part of Apple’s Smart HDR 4 feature, which ironically aims to make your photos more balanced…

And finally, on the subject of people and faces, I can’t help but mention the iPhone’s tendency to make people look a bit red/orange (that is, when they don’t look blue). It’s noticeable on any skin tone, but especially in photos that feature darker-skinned people, who can turn orange (lighter skin tones may look red, as if the person has been sunburned).

For the record, I think this makes some selfies more appealing, which could be Apple’s idea behind it. However, I’d still prefer accurate results and the freedom to decide for myself whether to edit the photos, especially those from the rear cameras.

The iPhone 13’s dynamic range is far from the best on the market

The HDR, or high dynamic range, of the iPhone 13 is on the weaker side, which means it’s not all that high. The iPhone is behind phones like the Google Pixel 6 and Galaxy S22, and appears to be two generations behind the leader in this field, the Vivo X80 Pro, which you can see in the last photo in the gallery (courtesy of Ben Sin).

I remember the days when the iPhone XS had the best HDR on the market across the board: selfies, rear camera photos, portraits, and videos. Apple used to be in the lead in this area, but Android phones have moved fast with hardware and software upgrades, and now Cupertino’s dynamic range for photos seems mediocre by comparison.

Interestingly, the iPhone is still unrivaled when it comes to HDR when shooting video, showing that the problem isn’t with the A-series chips that power the iPhone’s camera. They are powerful enough.

iPhone 13 falls short on ultra-wide-angle cameras and zoom too, but iPhone 14 Pro could bring Apple back into the top 3

The truth is that even if I choose not to focus on the more minor imperfections of the iPhone 13, zooming out reveals some fundamental problems with Apple’s images…

As it stands, phones like the Pixel 6, Galaxy S22 Ultra, and Vivo X80 Pro take better photos than the iPhone 13, which, as discussed, can mess up white balance, skin tones, and dynamic range. The difference in the primary lens can be seen across the board, but the gap between the iPhone and the Vivo is huge when it comes to ultra-wide-angle shots. For the record, Samsung and Google’s UWA cameras don’t impress either.

Optical zoom is also not present on the iPhone 13 and iPhone 13 mini and certainly will not come to iPhone 14 and iPhone 14 Max. That doesn’t excuse Apple’s poor performance in the zoom department, though. Google and the Pixel 6 have proven that taking usable photos with 2-5x zoom without a special telephoto lens is possible. Check out the examples above, which will tell you exactly how much Apple needs that 48MP sensor. Speaking of which…

iPhone 14 Pro with new 48MP camera is Apple’s chance to make iPhone camera great again

But there is light at the end of the tunnel for Apple, and it’s called iPhone 14 Pro…

As you may know, it is now almost certain that the new pro iPhone will have a brand new 48MP sensor that can collect more light and most likely make use of pixel binning, which is the same trick that allows phones like the Pixel 6 and Galaxy S22 Ultra to take more detailed photos in mixed conditions and low light.

More light also means an opportunity for improved HDR and colors, especially in mixed and low light, and more resolution means Apple will finally have a chance to compete with Google’s SuperRes Zoom, which is currently wiping the floor with the iPhone 13.
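To make that zoom point concrete, here’s a rough sketch (plain NumPy, my own illustration, not Google’s SuperRes Zoom or Apple’s pipeline) of why sensor resolution matters for lens-free zoom: a 2x “zoom” is essentially a center crop, and a 48MP-class frame still leaves about 12MP of real pixels after that crop, while a 12MP frame drops to about 3MP and has to be upscaled.

```python
import numpy as np

def center_crop_zoom(frame: np.ndarray, zoom: float) -> np.ndarray:
    """Simulate digital zoom by keeping only the central 1/zoom of each axis."""
    h, w = frame.shape[:2]
    ch, cw = int(h / zoom), int(w / zoom)
    top, left = (h - ch) // 2, (w - cw) // 2
    return frame[top : top + ch, left : left + cw]

# Hypothetical frame sizes (rows x cols) approximating 48MP and 12MP sensors.
frame_48mp = np.zeros((6000, 8000), dtype=np.uint16)
frame_12mp = np.zeros((3000, 4000), dtype=np.uint16)

for name, frame in [("48MP", frame_48mp), ("12MP", frame_12mp)]:
    cropped = center_crop_zoom(frame, zoom=2.0)
    print(f"{name} frame at 2x crop -> {cropped.size / 1e6:.1f}MP of real pixels")
# 48MP frame at 2x crop -> 12.0MP of real pixels
# 12MP frame at 2x crop -> 3.0MP of real pixels
```

Multi-frame techniques like SuperRes Zoom then squeeze extra detail out of those remaining pixels, which is why starting with more of them helps so much.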

That said, many of the image quality challenges I talked about aren’t just hardware-related. In other words, the 48MP sensor on the iPhone 14 Pro and iPhone 14 Pro Max won’t magically fix the iPhone’s camera imperfections if Apple doesn’t also step up its image processing, which lags far behind the competition.

Not to mention that the iPhone 14 and iPhone 14 Max are expected to stick with 12MP shooters, meaning those on a budget won’t enjoy the best iPhone camera experience. The gap between the vanilla and Pro iPhones in 2022 will be the biggest ever, thanks to the iPhone 14 Pro’s exclusive A16 Bionic chip, a new front design and, of course, a 48MP camera.

So right now, as someone who likes smartphone photography, I wish iPhone 14 Pro could give me the photo capabilities of the Pixel 6 Pro and keep the video performance of the iPhone. That would be almost ideal. OK, and Vivo’s ultra-wide-angle camera, if Apple is generous.

Take a hint, Tim!
