If there’s one smartphone camera feature we leave in 2025, it should be this
Maybe AI zoom won't go away, but it needs to change.
Ryan Haines / Android Authority
We’re officially a few days into 2026, and most people are busy making (or maybe already breaking) their New Year’s resolutions. I, on the other hand, am preoccupied with a different type of resolution: camera resolution. Alright, that was a terrible way to jump into my topic, but I’ll stand by it because it’s true.
What I really mean is that the things we can do with modern smartphone cameras are downright remarkable. We can take telephoto-zoomed macro shots, insert ourselves into images in-camera, and even reimagine the sky if we’re stuck on a gray, cloudy day. However, one thing seems to get worse with every attempt to improve it, and I think we need to step back.
It’s time to leave AI-powered camera zoom in 2025, and here’s why.
I’m all for long-range camera zoom, but…

Tushar Mehta / Android Authority
Listen — I think it’s great that smartphone cameras can now zoom in up to 100x at the press of a button, or rather the pinch of some fingers. I thought it was impressive when Samsung introduced it, and it was the whole reason my dad bought (and kept) the Samsung Galaxy S21 Ultra right up until its battery couldn’t last a day. I was also surprised when OnePlus decided to push the limit to 120x zoom on the OnePlus Open, but its partnership with Hasselblad delivered on that promise.
There was, of course, a healthy dose of digital processing at play — I wouldn’t expect a 1/2.0-inch 64MP telephoto sensor to tackle that kind of range by itself — but it felt like we’d found the sweet spot. Details at maximum range were decent, yet believable given the hardware they were taken on. Now, though, I feel like we’ve taken that sweet spot and kept pushing until it’s unbelievable — and not in a good way.
I want reality, not AI-generated details.
For example, OnePlus’s new flagship, the OnePlus 15, offers 120x zoom, just like its predecessor, the OnePlus 13. The new phone, however, has a much smaller telephoto sensor handling the heavy lifting: it dropped from a 1/1.95-inch sensor to a 1/2.76-inch one, cutting the megapixel count while bumping the optical zoom from 3x to 3.5x. So, what had to change for the zoom to (hopefully) stay the same? OnePlus had to rely on its new DetailMax Engine for processing, along with a substantial dose of AI.
The result, annoyingly, is a downgrade in zoom quality as seen below:
Essentially, what I want you to look at above is the straight lines. On the left, they’re pretty good, maybe a little oversharpened, but good. You can easily make out the panels surrounding the top of the tower, yet it doesn’t look like it was ripped from an old Tomb Raider game.
On the right, though, I’d draw your eye to the lines on the side of the Washington Monument. For those who haven’t been, the obelisk is composed of even, straight-edged blocks of granite and marble — the critical aspect being the straight edges. Here, the blocks on the lighter face of the structure appear to have wavy lines, taking on almost the character of tree bark, which is nothing like what’s actually there. And yes, I’m comparing an older 60x shot to a newer 30x shot and preferring the old one — that’s where we are.
But, if we switch to Google’s Pixel 10 Pro — a newer member of the 100x zoom club — we get a slightly different set of results.
Given simple shapes, Google’s Tensor G5 chip knows precisely what to do. It sharpens the lettering and framework of Baltimore’s Domino Sugars sign and keeps the moon looking like, well, the moon. When you add organic life, like people or birds, though, it comes to a grinding halt. Rather than incorrectly generating either living thing, Google processes the surrounding area, cleaning up the gold cross, the flagpoles, and the grasses in their respective pictures.
While you could probably ignore the little birds, given their overall lack of detail, the people are much more difficult. They look especially artificial against an otherwise sharp scene, almost as if they were rendered merely to establish a sense of scale. If they weren’t there, though, the finished result would look pretty good, just like the shots to the right.
And this, in my opinion, is where things need to change.
If we’re keeping Pro Res Zoom (and more), here’s what needs to change

Stephen Headrick / Android Authority
Alright, I know I said that AI-powered camera zoom needs to be left in 2025, but the truth is that it won’t. There’s no way the likes of Google or OnePlus will turn their backs on this approach to camera zoom that they’re still fine-tuning. After all, processing has always been the name of the game for Google’s Pixel camera. But, if that’s the case, then I think a few things need to change on both sides.
For companies like OnePlus, which have decided to downgrade their camera sensors in the hope that AI will fill in the gaps, step one is to, well, undo that. AI still requires as much data as possible to create a polished final image, which means more megapixels and, ideally, larger sensors. Honestly, I think that if OnePlus had kept the same telephoto sensor it had on the OnePlus 13 and paired it with the DetailMax Engine, it would have rendered a Washington Monument that looked like its intended marble, rather than the wood it ultimately turned out to be.
Whether or not that actually happens, the next thing that everyone should do — from Google to OnePlus to Samsung to anyone else — is figure out when to use AI and when to leave it on the shelf. For Google, I see that as focusing on people rather than scenery, which is the opposite of what it does currently.
If AI zoom is going to stick around, it has to be applied evenly.
What I mean is that Google’s Pro Res Zoom is designed for landscapes, architecture, and manufactured shapes. As such, when it detects a person or an animal in a zoomed-in snap, it processes around them rather than risking an error. This makes them stand out all the more, as I detailed above. So, by switching the focus, Google would have to sacrifice a little detail in the inorganic, but it would make the organic look better (or less out of place) in the process.
Eventually, when its processing models have matured to the point where they can handle people and animals just as well as buildings and trees, I wouldn’t mind if Google switched back. In my mind, it would be like playing conservatively with the Nano Banana generator because you know that Nano Banana Pro is right around the corner — take it easy now for a bigger reward later.