The Google Pixel 6 is causing quite a stir, and for good reason: both the standard model and the Pixel 6 Pro are the best phones Google has ever built. Powered by a Google-designed Tensor chip and with new camera hardware, as well as impressive photo editing features, the Pixel 6 is a phone you can’t ignore.
In fact, the Pixel 6 Pro is already selling out, leading people to ask where to buy a Pixel 6. And we hope Apple is paying attention.
- These are the best iPhones right now
- iPhone 14 release date, specs, design and rumors
- Plus: Apple says it expects shipping delays to continue this holiday season
Even given the strengths of the Pixel 6, the iPhone 13 Pro Max still can’t be shaken from its position as the best phone overall. And the rest of the iPhone 13 line also impresses. Still, the Pixel 6 shows that Google is serious about challenging Apple’s place at the forefront of the smartphone pecking order. Apple should be ready to show that it is just as serious to keep the iPhone ahead of rival devices.
While Apple certainly has its own ideas for what to do with future iPhones, the Pixel 6 offers a lot in the inspiration department. These are the Pixel features that Apple should adapt for the iPhone 14.
Adaptive update rates for all iPhones
iPhone 13 Pro and iPhone 13 Pro Max users can enjoy the benefits of displays that can adjust their refresh rate up to 120Hz when on-screen activity would benefit from the faster speed, then dial it back down to save battery life. But that’s a benefit iPhone 13 and iPhone 13 mini owners can only dream of.
Google didn’t make the same distinction with its two new phones. While the Pixel 6 doesn’t match the 120Hz adaptive rate that the Pixel 6 Pro is capable of, it can still scale up to 90Hz. Even that modest bump makes a big difference, delivering noticeably smoother scrolling.
The good news is that Apple is likely to address this disparity with the iPhone 14, adding adaptive refresh rates to its less expensive models if the supply chain allows. And Apple had better, because increasingly, the standard iPhone is one of the few flagships still stuck at 60Hz.
A better camera bump
Camera bumps are becoming more prominent as phone makers have to squeeze in more optics to provide the photographic features that users demand. That has led to some pretty sizable camera arrays taking up space on the back of current flagship devices, something the iPhone 13 is certainly guilty of.
Google, however, leaned into the camera bump with the Pixel 6. Instead of confining it to one corner, the company stretched it into a bar that spans the width of the phone. It certainly adds to the Pixel 6’s thickness, but it gives the phone a distinctive look that helps it stand out from other devices. Even better, the Pixel 6 is one of the few phones that can lie flat on its back without any kind of wobble.
Early rumors about the iPhone 14 suggest that Apple is likely sticking with its current phone design rather than spicing things up. There’s nothing especially wrong with that approach, and while we don’t see the iPhone 14 copying the look of the Pixel 6, it wouldn’t hurt Apple to find a different approach to house all the cameras it plans to include in future iPhones.
The return of a fingerprint reader
The Pixel 6’s in-display fingerprint sensor can’t take any credit for the favorable reviews Google’s phone has received; in fact, some of my colleagues argue that it’s one of the worst things about the Pixel 6 due to its slow response time and frequent finickiness. But at least the Pixel 6 has a fingerprint sensor. That’s not something the iPhone 13 can claim.
Yes, Apple’s Face ID is still a fine method of unlocking devices and securing mobile payments. But that’s only when you’re not wearing a mask, and masking could be something we keep doing for a long time, at least when we’re indoors. It’s up to Apple to build a phone that reflects that reality, whether that means placing a fingerprint reader under the screen or inside the iPhone 14’s power button.
Better phone features
One area where the Tensor chip that powers the Pixel 6 really shines is the device’s Phone app. Leveraging the phone’s on-board intelligence, you can now get wait times when dialing a toll-free customer service number, and the assistant can transcribe automated phone trees, making it easier to press the right number when placing a customer service call.
Apple doesn’t offer any of that with the iPhone, though it probably could, given the amount of effort the company puts into the neural engine of its in-house smartphone chipsets. The iOS 15 update introduced a number of Siri enhancements, led by on-device speech processing and better context between requests. We imagine Apple could find a way to better incorporate its own assistant into the Phone app.
Magic Eraser on the iPhone
It seems that fall phone launches from Apple and Google generally result in the two companies announcing dueling camera features that we’d like the other to adopt. For Apple and the iPhone 13, it was Cinematic mode, which offers all sorts of focusing tricks when shooting video with one of Apple’s new phones. For the Pixel 6, it’s Magic Eraser.
Magic Eraser uses the Pixel 6’s intelligence to identify people or objects in the background of a photo that may be cluttering things up, giving you the opportunity to remove them with a tap. It’s not the only camera feature that relies on Google’s Tensor chip to make your photos look a little better, but it’s a lot of fun to use and is the best example of the Pixel 6’s enhanced computational photography.
Magic Eraser is very easy to use and the results have been impressive so far. It’s the kind of thing that we’d love for Apple to put its own spin on, especially since Cupertino has made the most of computational photography in its own right.
Better skin tones for everyone in photos
When I tested the Pixel 6, I compared the photos from its camera to those from the iPhone 13. Frankly, I thought the Apple phone did a better job with my skin tone, but then again, I’m a middle-aged white male. Historically, cameras have done a very good job of representing people who look like me, while being less accurate in the way they represent people of color.
The Pixel 6 aims to do something about that through its Real Tone feature. Google worked with color and photography experts to fine-tune its camera algorithms and take better, more representative photos of everyone. It’s the right thing to do, and I’d like to see Apple make a similar effort with its future camera phones.
- Plus: The best camera phones