Cameras struggle with dark skin. Here’s how new smartphones stack up.

Anthony Sturgis, a Bay Area local, and his girlfriend Michelle Nell at Union Square in San Francisco. (Washington Post photo by Chris Velazco)

Erika Young flew from Florida to San Francisco with her mother for two reasons: to meet up with a family member in the military and to see Michelle Obama speak. But when Young and her mom happened upon a massive Christmas tree in the city’s Union Square plaza, they had to stop and get some pictures with it.

Google’s Pixel 7 Pro has a 50-megapixel main camera and improved zoom features, but it can also salvage old, blurry photos. (Washington Post photo by Chris Velazco)

That’s when the back-and-forth began. For a few minutes, Young and her mother, Jwana Luckey, snapped pictures with different smartphones to see which produced the best results.

Today, the phones in our pockets can produce images with the kind of fidelity that can rival – and sometimes beat! – dedicated cameras. But even now, Young told me, people of color still struggle to feel fully represented in the photos and selfies they take. And that’s in part because our smartphones don’t always know how to handle Black and Brown faces.

“I think [clarity] is what a lot of people go for – they see a picture and they say ‘Hey, this looks clear,'” she said. “But does this look completely like me? Is it grasping my skin complexion? Is it grasping the way that my hair naturally looks?”

Of the many companies trying to make money by selling smartphones, Google – which according to research firm Canalys accounts for only a small percentage of phones shipped in the United States – has been the most open about making its cameras more inclusive. Starting in 2021, Google’s new Pixel smartphones have shipped with under-the-hood “Real Tone” camera modifications the company claims will help them take better, more satisfying pictures of subjects of color.

To test those claims, I took pictures of people visiting one of San Francisco’s holiday hotspots with Google’s $899 Pixel 7 Pro, and compared the results with photos from Samsung’s $1,199 Galaxy S22 Ultra and Apple’s $1,099 iPhone 14 Pro Max.

It didn’t take long before one thing became clear: companies such as Google haven’t completely solved the problem. (Not yet, anyway.) The proof is in the photos, and how some of their subjects felt about them.

But before that, you’ll need to understand how smartphone cameras do what they do.

– – –

Your phone’s camera makes decisions for you

– – –

Photos where the subjects are lit from behind can be especially tricky, and each phone handled the situation differently. (Washington Post photo by Chris Velazco)

Back in the old, purely manual days of film photography, getting a half-decent photo took some work. Apart from making sure you had the appropriate film, you might have to adjust the aperture of your lens and dial in how long you wanted the shutter to stay open before ever clicking a button. And then you had to get the film developed.

Your phone can do all that in the blink of an eye. And because your phone has massively more computing power than your old point-and-shoot, it can also automatically tweak and process those images faster than you can notice. Meanwhile, more sophisticated smartphone cameras can pull off even more clever tricks, like capturing multiple exposures of the same scene and cobbling together the best bits of each.

In other words, you’re not alone when you tap the shutter button on your screen – you have a second-in-command in software form.

This approach to producing images, called computational photography, is one of the reasons you might have noticed your phone’s photos sometimes look brighter and more colorful than the real world. The problem, according to Google, is that some key technologies – like camera sensors and processing algorithms – that help define the way a subject of color looks in a photo were mostly built and tuned using images of people with light skin.
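For the technically curious, here is a rough sense of what “cobbling together the best bits” of several exposures can look like: a minimal, generic exposure-fusion sketch in Python. To be clear, this is not how the Pixel, iPhone or Galaxy actually do it – those pipelines are proprietary and far more sophisticated – and the weighting scheme and file names here are purely illustrative.

import numpy as np
from PIL import Image

def load_frames(paths):
    # Load bracketed exposures of the same scene as float arrays in [0, 1].
    return [np.asarray(Image.open(p).convert("RGB"), dtype=np.float32) / 255.0 for p in paths]

def exposure_weight(frame, sigma=0.2):
    # Score every pixel by how close it sits to mid-gray: well-exposed pixels
    # count for more, blown-out highlights and crushed shadows for less.
    return np.exp(-((frame - 0.5) ** 2) / (2 * sigma ** 2)).mean(axis=2, keepdims=True)

def fuse(frames):
    # Blend the frames pixel by pixel using normalized exposure weights,
    # keeping the best-exposed parts of each shot.
    weights = np.stack([exposure_weight(f) for f in frames])
    weights /= weights.sum(axis=0) + 1e-8
    fused = sum(w * f for w, f in zip(weights, frames))
    return np.clip(fused * 255.0, 0, 255).astype(np.uint8)

# Hypothetical bracketed shots of the same scene; the file names are placeholders.
frames = load_frames(["under.jpg", "normal.jpg", "over.jpg"])
Image.fromarray(fuse(frames)).save("fused.jpg")

The point is simply that software, not just the lens and the sensor, decides which pixels make the final image – which is why the choices baked into that software matter so much for skin tones.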

To try to fix this, Google says it has made the piles of pictures that inform the way these technologies work and interact more diverse.

“Over the past year, we’ve added more than 10,000 images to the data sets used to tune Pixel’s camera,” Shenaz Zack, director of product management at Google, said when the Pixel 7 was unveiled earlier this year. “Through that work, we’ve tuned exposure and brightness to better represent darker skin tones in lowlight situations.”

Beyond that, Google says its Pixel phones have also been tuned to better detect faces of people with darker skin, and to adjust an image’s white balance to more accurately render their skin.
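Google hasn’t said exactly how Real Tone adjusts white balance around faces, so the sketch below shows only the textbook version of the idea: a simple “gray world” correction in Python that scales each color channel so their averages line up, nudging a color cast back toward neutral. The real pipelines – and Google’s face-aware tuning in particular – are far more involved, and the file name here is just a placeholder.

import numpy as np
from PIL import Image

def gray_world_balance(img):
    # Classic "gray world" white balance: scale each color channel so its
    # average matches the overall average, pulling a color cast toward neutral.
    rgb = np.asarray(img.convert("RGB"), dtype=np.float32)
    channel_means = rgb.reshape(-1, 3).mean(axis=0)
    gains = channel_means.mean() / (channel_means + 1e-8)
    balanced = np.clip(rgb * gains, 0, 255).astype(np.uint8)
    return Image.fromarray(balanced)

# "portrait.jpg" is a placeholder file name; any RGB photo works.
gray_world_balance(Image.open("portrait.jpg")).save("balanced.jpg")

A face-aware version would presumably weight the detected face more heavily than the rest of the frame, but that part of Google’s recipe isn’t public.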

The company is open about the fact that this is all still a work in progress. But we found a handful of people willing to let a total stranger take photos of them in the conditions Zack described, to help us see whether the difference Google claims actually shows up.

– – –

Accuracy vs. appeal

– – –

To get a first feel for what these cameras could do, I wanted to shoot one photo that featured people bathed in the tree’s warm light, and a more technically tricky image in which the subjects were lit from behind. That’s when I met Anthony Sturgis – a Bay Area local and employee of a 3D printing company – and Michelle Nell, his long-distance girlfriend visiting San Francisco from South Africa.

Both preferred the Pixel’s results in the close-up shot, which Sturgis found especially surprising – he’s a Samsung guy, after all. But when it came to the photo with a well-lit tree behind them, Sturgis and Nell said they preferred the iPhone’s results.

-Our take: In the first set of photos, the Samsung phone let the tree’s yellow light overpower Sturgis and Nell’s natural skin tone, but the Pixel and the iPhone were pretty close. Meanwhile, the Samsung phone lightened the second photo a bit too dramatically. The Pixel photo most accurately depicted what I saw with my eyes, but the iPhone retained some of the warmth of their skin tones even while brightening their faces.

– – –

Overcast days, tricky colors

– – –

Lance Hopson and Denise Santoyo at Union Square in San Francisco. (Washington Post photo by Chris Velazco)

Dreary days aren’t just depressing – they can also wreak havoc on photos. So how did the phones handle the overcast conditions?

Every year before Christmas, Denise Santoyo and Lance Hopson drive up to San Francisco for a bit of shopping and people-watching – that’s where I caught the two of them posing for selfies.

Google’s Pixel 7 Pro seemed to struggle here – the image it produced doesn’t show off Luckey’s face as clearly as the others. (Washington Post photo by Chris Velazco)

“The Samsung photo and the Google Pixel photo to a lesser extent appear to be overexposed, resulting in excessive brightness,” Hopson said. “The iPhone photo seems to be the most accurate representation of us. Our skin tones are much closer to what we perceive them to be in this photo as compared to the other two. The colors are crisp/warm and come across as being very lifelike.”

-Our take: The Samsung photo looks a little purple compared to the others, and the phone automatically smoothed out some detail in Hopson and Santoyo’s faces. Between the iPhone and the Pixel, picking the “better” one comes down to preference, though the warmth of Santoyo’s skin definitely comes through more in the iPhone’s photo.

Lance and Denise’s skin tones look noticeably different in each photo. (Washington Post photo by Chris Velazco)

– – –

Challenges in low light

– – –

And what about Young and her mom? For a real challenge, I took a photo at night with their backs to a bright, multistory Macy’s display.

Young said that while she liked the crispness of the photo taken with the iPhone, she preferred the more “natural” colors that came out of the Samsung phone. “I would definitely say our skin tone was shown better using the Samsung camera,” she added.

As for the Pixel, both of them found the resulting photo somewhat “washed out,” though Luckey also offered a more blunt description: she said it made her look “ashy.”

-Our take: The iPhone did a merely okay job of highlighting the contours of Luckey’s face, while the Galaxy’s more contrasty look worked quite well. Sadly, the Pixel really seemed to struggle, producing a photo in which it’s hard to make out Luckey’s face at all.

– – –

Google did not immediately respond to a request for comment on how these images are processed. Samsung and Apple weren’t immediately reachable for comment.

Bottom line, it’s not a surprise that people’s photo preferences were all over the place. Your sense of the way you like to look on camera is deeply personal, rooted in your relationship with yourself and your history. Those tastes are hard-won in a way, and may not align with what Google, Apple or Samsung think is the best way to make you look like you.

Erika Young and Jwana Luckey at Union Square in San Francisco. (Washington Post photo by Chris Velazco)

It’s all about taste, same as ever. And that means people probably aren’t going to stop editing their photos any time soon. But that doesn’t mean Google’s work is going unnoticed.

“Just to have anyone step out and be like, ‘Hey, we want to make sure you’re represented in a good light and we want to make sure that we capture everything’ – it’s definitely important,” Young said.
