Google’s Pixel 6 camera smartens up snapshots with AI tools



Google’s latest flagship phones have an impressive set of automated, AI-powered tools to help make your photos look better, with smart blurs, object removal, and better exposure across skin tones. While we’ll have to test them out to see if they work as advertised, they could be useful for everyone from pixel peepers to casual snapshot takers.

The new cameras themselves are pretty impressive to start with. The main rear camera, shared by the Pixel 6 and Pixel 6 Pro, is a 50-megapixel beast with decent-sized pixel wells and an f/1.85 aperture (no, it doesn’t capture as much light as an f/1.8 lens on a DSLR, but it’s still good). The ultrawide, also shared, is 12 megapixels at f/2.2 on a smaller sensor, so don’t expect mind-blowing image quality. The 6 Pro adds a 48-megapixel telephoto with less low-light capability but a 4x equivalent zoom. They’re all stabilized and have laser-assisted autofocus.
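If you’re wondering why an f/1.85 phone lens doesn’t gather DSLR-grade light, the usual back-of-the-envelope comparison multiplies the f-number by the sensor’s crop factor. Here’s a minimal sketch of that math; the crop factor is an assumption on our part (a 1/1.31-inch-class sensor is roughly 4.4x smaller in diagonal than full frame), not a figure Google has published.

```python
# Rough "equivalent aperture" comparison across sensor sizes.
# ASSUMPTION: crop factor ~4.4 for a 1/1.31"-class phone sensor;
# Google hasn't published exact optics figures.
def equivalent_aperture(f_number: float, crop_factor: float) -> float:
    """Full-frame-equivalent f-number for total light gathered."""
    return f_number * crop_factor

eq = equivalent_aperture(1.85, 4.4)
print(f"f/1.85 on the phone sensor gathers light like ~f/{eq:.1f} on full frame")
```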

Basically, if you want the best quality in any situation, stick to the main camera; if you’re confident in your light, go ahead and fire up the ultrawide or zoom. It sounds like all the new camera features work on all the cameras, but generally speaking, the better the shot to start with, the better the final result.

The simplest tool to use is probably “face deblur.” How many times have you gotten the perfect shot, but it’s not quite sharp? The Pixel camera now automatically captures multiple exposures with every shot (it’s part of the ordinary process of taking a picture), and it combines the main shot from one camera with a clear shot of the face captured by another. To use it, you just tap a shot in your gallery that isn’t quite sharp, and if a “face deblur” option appears: boom.
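Google hasn’t published the pipeline, but the core merge step can be sketched in a few lines: find the face in the sharper frame, then blend that patch back into the main exposure. This is a toy illustration, not Google’s method; the filenames are hypothetical, the two frames are assumed to be already aligned, and a real system handles alignment and occlusion far more carefully.

```python
import cv2
import numpy as np

# Hypothetical inputs: a well-exposed but blurry main frame and a
# sharper frame of the same scene from a second camera, pre-aligned.
main_frame = cv2.imread("main_blurry.jpg")
sharp_frame = cv2.imread("second_cam_sharp.jpg")

# Classical face detector, standing in for a modern learned one.
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
faces = detector.detectMultiScale(
    cv2.cvtColor(sharp_frame, cv2.COLOR_BGR2GRAY))

if len(faces):
    x, y, w, h = faces[0]
    patch = sharp_frame[y:y + h, x:x + w]
    mask = np.full(patch.shape[:2], 255, np.uint8)
    center = (x + w // 2, y + h // 2)
    # Poisson blending hides the seam between the two exposures.
    merged = cv2.seamlessClone(patch, main_frame, mask,
                               center, cv2.NORMAL_CLONE)
    cv2.imwrite("face_deblurred.jpg", merged)
```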

[Image: comparison of two shots, a blurry one and one where the face is sharpened. Image Credits: Google]

OK, it’s definitely kind of weird to have only the face sharp in a blurry photo, as you can see in the sample, but look: do you want the picture or not? Thought so.

Also in the blur department are two new “motion modes.” One is an “action pan” that helps you capture a moving subject, like a passing car, sharply while blurring the background “creatively.” That means it applies a directed zoom blur instead of the handheld blur it would normally have, so it looks a little ‘shoppy, if you will, but it’s a fun option. The other is a long-exposure helper that adds blur to moving subjects while keeping the background clear. Helpful for capturing things like headlight streaks without a tripod. Both live in their own motion mode area in the camera app.
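The “creative” blur boils down to masking the subject and smearing everything else along the motion direction. A hedged sketch of that idea follows; the subject mask is assumed to come from a segmentation model (here it’s just a premade 0/255 image), and the filenames are hypothetical.

```python
import cv2
import numpy as np

# Hypothetical inputs: the photo plus a 0/255 mask of the subject.
img = cv2.imread("pan_shot.jpg")
subject_mask = cv2.imread("subject_mask.png", cv2.IMREAD_GRAYSCALE)

# Horizontal motion-blur kernel, standing in for a motion-aware blur.
k = 31  # blur length in pixels
kernel = np.zeros((k, k), np.float32)
kernel[k // 2, :] = 1.0 / k
background = cv2.filter2D(img, -1, kernel)

# Composite: sharp subject over blurred background.
alpha = cv2.merge([subject_mask] * 3).astype(np.float32) / 255.0
out = (img * alpha + background * (1.0 - alpha)).astype(np.uint8)
cv2.imwrite("action_pan.jpg", out)
```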

[Image: a beach scene before and after “magic eraser,” with background people removed. Image Credits: Google]

“Magic Eraser” is the most obviously “AI” thing here. If you take a picture and it’s great, except someone just walked into the background or there’s a car parked in the scenic vista, it’ll help you zap those pesky real-world objects so you can forget they ever existed. Tap the tool and it’ll automatically highlight things you might want to remove, like distant people, cars, and, according to the example Google provided, even unsightly logs and other random features. Driftwood on the beach, though… really? Fortunately, you can pick which ones to throw in the memory hole (no pressure), or circle unrecognized objects and it will do its best to dispose of them.
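Under the hood this is segmentation plus inpainting: mark the offending pixels, then synthesize plausible background in their place. Here’s a minimal stand-in using OpenCV’s classical Telea inpainting, not Google’s learned model, with hypothetical filenames:

```python
import cv2

# Hypothetical inputs: the photo and a 0/255 mask of what to erase
# (e.g. a stranger the user circled).
photo = cv2.imread("beach.jpg")
erase_mask = cv2.imread("erase_mask.png", cv2.IMREAD_GRAYSCALE)

# Classical diffusion-based inpainting; learned models produce far
# more convincing texture, but the interface is the same idea.
cleaned = cv2.inpaint(photo, erase_mask, 5, cv2.INPAINT_TELEA)
cv2.imwrite("beach_cleaned.jpg", cleaned)
```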

“Speech Enhancement” isn’t for images, obviously, but when you’re in front-facing camera mode you can opt to have the device tone down ambient noise and focus on your voice. Basically Krisp by Google. If it works anywhere near as well, you’ll probably want to use it all the time.
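Google’s feature is almost certainly a learned model, but the underlying idea of suppressing steady background noise can be illustrated with a toy spectral gate: estimate the noise floor per frequency, then attenuate anything near it. A sketch, assuming a mono WAV file with a hypothetical name:

```python
import numpy as np
from scipy.io import wavfile
from scipy.signal import stft, istft

# Hypothetical input: mono audio pulled from a front-camera video.
rate, audio = wavfile.read("selfie_audio.wav")
f, t, spec = stft(audio.astype(np.float32), fs=rate, nperseg=1024)

# Estimate a per-frequency noise floor, then gate energy near it.
noise_floor = np.percentile(np.abs(spec), 20, axis=1, keepdims=True)
gain = np.clip((np.abs(spec) - 2 * noise_floor) / (np.abs(spec) + 1e-9),
               0.0, 1.0)
_, cleaned = istft(spec * gain, fs=rate, nperseg=1024)

wavfile.write("enhanced.wav", rate, cleaned.astype(np.int16))
```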

“Real Tone” is an interesting but potentially fraught feature that we’ll be looking into in more detail soon. Here’s how Google describes it: “We worked with a diverse set of expert image makers and photographers to tune our AWB [auto white balance], AE [auto exposure], and stray light algorithms to ensure that Google’s camera and imagery products work for everyone, of every skin tone.”

[Image: a family with dark skin sitting on the beach]

They look great, sure… but they’re models.
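For a rough sense of what the “AWB” in that quote refers to, here’s a bare-bones gray-world auto white balance, the kind of naive baseline that tuning efforts like Real Tone go far beyond; it simply scales each channel so the average color comes out neutral (filename hypothetical).

```python
import cv2
import numpy as np

# Hypothetical input portrait.
img = cv2.imread("portrait.jpg").astype(np.float32)

# Gray-world assumption: the scene's average color should be neutral.
avg = img.reshape(-1, 3).mean(axis=0)      # per-channel mean (BGR)
gains = avg.mean() / avg                   # push the means toward gray
balanced = np.clip(img * gains, 0, 255).astype(np.uint8)
cv2.imwrite("awb_grayworld.jpg", balanced)
```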

Basically, Google wanted to make sure that its “smart” camera’s core features don’t work better or look better on certain skin tones than others. This has happened many, many times before, and it’s an insult and an embarrassment when billion-dollar companies blow it over and over. Hopefully Real Tone works, but even if it does, there’s the fundamental question of whether it amounts to lightening or darkening someone’s skin in the photo, a sensitive matter for many people. “This feature cannot be turned off nor disabled,” Google says, so they must be confident. We’ll be testing this and talking with developers and photographers about the feature, so look for a deeper dive into this interesting but complex corner of the field.

It’s not entirely clear how many of these features will be available beyond the Pixel line of phones or when, but we’ll let you know what we find out.

