74 percent of Americans didn’t know Facebook collected their interests to target ads until Pew asked them about it

FILE PHOTO: Facebook CEO Mark Zuckerberg appears on stage during a town hall at Facebook’s headquarters in Menlo Park, California, U.S., September 27, 2015. REUTERS/Stephen Lam/File Photo

Facebook keeps a running list of things it has learned about you for advertisers. At this point, the list isn’t incredibly hard to find: go to your account settings, click on “ads,” and the list will appear, ready for you to peruse or modify as you see fit.

These lists have been public for a while. Ahead of the 2016 election, The Washington Post compiled a list of 98 categories that Facebook might use to build a portrait of you for advertisers. Based on what you tell Facebook, the company might zero in on your interest in dogs, for instance. Maybe it guesses that you’ve recently been shopping for a home or a mattress or a car. For some people, Facebook takes a guess at their political beliefs and “racial affinity.” Those data points then allow advertisers to target ads at the people most likely to click on them. It’s a core part of Facebook’s business.

But it has been trickier to know exactly how the public feels about this information. When the Pew Research Center set out to examine that question, it found that 74 percent of Americans didn’t even know the list existed until the survey instructed them on how to view it.

Nearly 9 in 10 Americans (88 percent) found that Facebook had generated some material for them on the ad preferences page, and 6 in 10 had 10 or more interests listed for them. Unsurprisingly, heavy or longtime users of Facebook were more likely to have more interests on their ad preferences pages. Overall, 59 percent said those interests were accurate, reflecting who they are in real life. (By contrast, 27 percent said the interests were not very or not at all accurate.)

Once they had a chance to view this list, a slim majority – 51 percent – were not comfortable with Facebook collecting this information about them, according to the report, which was released Wednesday.

“We consistently find that there’s a paradox at the center of generalized privacy research,” said Lee Rainie, director of Internet and technology research at Pew. “Americans, being Americans, say that it matters, but they behave in a way that doesn’t indicate that it matters.”

The survey was conducted in 2018, several months after Facebook suspended Cambridge Analytica for improperly collecting data from Facebook users, a revelation that caused a major crisis of trust for the platform. The news was the catalyst for congressional hearings and a campaign encouraging users to quit Facebook. The company also announced that it would provide more information to users about how ads work on Facebook.

The Pew data suggest that, even as Facebook becomes an increasing subject of concern for Americans, “no matter how much effort people make in disclosing what’s available and make it clear that users can make choices, everybody isn’t picking up on that,” Rainie said.

In a statement, Facebook said it believed “Pew’s findings underscore the importance of transparency and control across the entire ad industry, and the need for more consumer education around the controls we place at people’s fingertips.” Facebook added that it planned to host more in-person events on privacy this year and continue to make its ad settings “easier to use.”

The survey found a relationship between the accuracy of the information Facebook collects and how Americans feel about it. More than three-quarters (78 percent) of those who thought Facebook’s listings for them were not very or not at all accurate said they were uncomfortable with the list, while just under half (48 percent) of those who thought the list was accurate felt the same way.

“There are a lot of people who say, ‘I don’t want to be misunderstood,’ ” Rainie said. The data support a longtime observation that everyone has their own line when it comes to privacy concerns. There’s a constant, fluid trade-off between privacy and sharing.

“The trade-off between ‘do I share my information or not’ (is) very specific to the encounter – what’s being offered, how safe is my data, is the thing that I’m seeking inherently valuable … everybody’s got their own measures on that,” Rainie said.

Here are a couple of other findings from the study:

– About half (51 percent) of Americans are assigned a political label by Facebook. In the survey results, those labels were fairly evenly distributed among those deemed conservative, liberal and moderate. Of those with a label, 73 percent said it was very or somewhat accurate. Just over a quarter, 27 percent, said it wasn’t accurate.

– About 2 in 10 (21 percent) were assigned a “multicultural affinity” group. Facebook’s algorithm guessed that 43 percent of those had an interest in African-American culture, 43 percent had an interest in Hispanic culture, and 10 percent had an interest in Asian-American culture. Of those assigned one of these affinity groups, 60 percent said they had a “very” or “somewhat” strong affinity for that group (57 percent said they identified as a member of it). By contrast, 37 percent of those assigned a group said they did not have a strong affinity with it, and 39 percent said they did not consider themselves a member of that group.

The Pew Research Center poll was conducted Sept. 4 to Oct. 1, 2018, among a nationally representative sample of 964 U.S. adults who have a Facebook account. The margin of error for the full sample is plus or minus 3.4 percentage points.
