How 11 people try to stop fake news in the world’s largest election


One of the operations most vital to Facebook at this moment is a world away from its Menlo Park, California, headquarters, and in more ways than one. Instead of the sprawling roof gardens and upscale cafes packed with Silicon Valley’s latest health fads, this cramped Mumbai office has worn carpets and fading walls lined with exposed electrical ducts. This is Boom Live, one of seven tiny fact-checking firms at the heart of Facebook’s efforts to rebuild some of its credibility during India’s elections.

Voters line up to cast their votes outside a polling station during the first phase of general election in Alipurduar district in the eastern state of West Bengal, India, April 11, 2019. REUTERS/Rupak De Chowdhuri

The world’s largest democracy is a key proving ground for the battered social network. Based on early turnout figures, more than 60 percent of India’s 900 million eligible voters are expected to cast ballots between now and May 19, as the center-left Congress Party tries to seize power from the right-wing Bharatiya Janata Party. As in other elections around the world, paid hacks and party zealots are churning out propaganda on Facebook and the company’s WhatsApp messenger, as well as on Twitter, YouTube, TikTok, and other ubiquitous communication channels. Together with Facebook’s automated filters, Boom’s 11 fact-checkers and its similar-size fellow contractors form the front line of the social network’s defense against this sludge.

“In a country largely driven by local and community news, we knew it was critical to have fact-checking partners who could review content across regions and languages,” Ajit Mohan, Facebook’s managing director and vice president in India, wrote in a recent company blog post.

Facebook’s third-party fact-checkers in India analyze news in 10 of India’s 23 official languages, more languages than its fact-checkers cover in any other country, according to a spokesperson.

“Fact-checking is part of a broader strategy to fight false news that includes extensive work to remove fake accounts; cut off incentives to the financially-motivated actors that spread misinformation; promote news literacy; and give more context about the posts they see,” the company said in a statement.

Facebook has said that fighting misinformation is a top priority, and that it hands such critical responsibilities over to contractors to help it keep a better-informed watch around the world at all hours. Contractors also work for much less than the typical Facebook employee, can appear more objective than the company’s own employees, and can make for easier scapegoats if needed.

A visit to Boom’s offices makes clear that the scale of Facebook’s response in India so far isn’t enough. The small team appears capable and hardworking almost to a fault, but given the scale of the problem, they might as well be sifting grains of sand from a toxic beach.

“What can 11 people do,” Boom Deputy Editor Karen Rebelo says, “when hundreds of millions of first-time smartphone-internet users avidly share every suspect video and fake tidbit that comes their way?” Her team has been working for Facebook since a regional election last summer, and work related to the present election escalated earlier this year.

According to Facebook, the fact-checkers are just one element of its 18-month campaign to safeguard India’s elections, which has included opening a version of its 2018 U.S. election “war room” in Delhi, making political advertising more transparent, and deleting hundreds of local pages and accounts spreading election-related misinformation. “We are absolutely committed to maintaining the integrity of the elections in India and will continue to work with local organizations, government groups and experts to make that happen,” Mohan wrote in the blog post.

Rebelo’s team and its counterparts have some advantages, including access to internal Facebook software that alerts the teams to posts it deems suspiciously popular. The fact-checkers also scroll through lists of complaints they’ve received from users about questionable forwarded WhatsApp messages. The Boom team spends much of its time working to verify or debunk a punch list of vitriolic Facebook posts and WhatsApp forwards that Rebelo compiles each morning based on these tools, her own scouring, and tips. Cries of “Yeh dekha hai kya?” (“Have you seen this?”) pierce the otherwise still air a couple of times each hour, when someone chances upon something especially outrageous.

Boom’s fact-checkers often find their cause Sisyphean. Hours after they built the case needed to persuade Facebook to take down Postcard News, a right-wing page known for its steady stream of allegedly bogus stories, a fan page popped up to resume sharing Postcard videos with the creators’ millions of followers. It’s still up.

And as in the U.S. and elsewhere, bad actors have proven capable of persuasively editing real videos to fool a mass audience. The violent frenzy that led to the lynching of at least two dozen innocent men in India last year began with, of all things, a public safety announcement. In the spot, two actors on a motorcycle grab a child off the street, and then a voiceover explains that this was just a dramatization and warns parents not to leave small children unattended. Someone edited out the end of the video and added more alarmist voiceover in a dozen Indian languages to different versions, suggesting to viewers on Facebook and WhatsApp that child-stealing gangs posed a present danger to the country’s youth. “It set off a nationwide hysteria,” Rebelo says.

Things will only get worse, says DD Mishra, a senior director at researcher Gartner Inc. By 2022, the majority of individuals in mature economies will consume more false information than true information, according to Gartner’s research. “In the near future, AI-driven creation of ‘counterfeit reality’ or fake content will outpace AI’s ability to detect it, fomenting digital distrust,” says Mishra.

Such a trend would make true armies of human fact-checkers all the more important, given that companies like Facebook have been relying in large part on software to filter misinformation. Already, Boom’s small team faces unrelenting waves of the stuff each day. One recent surge began when Congress Party leader Rahul Gandhi announced plans to stand for election from two distinct constituencies. A viral photo from a Gandhi speech mislabeled flags from a center-right regional party as the national flag of Pakistan, implying that sympathizers with India’s bitter rival supported the opposition leader. More subtly, fact-checks showed that a widely circulated image of star Congress campaigner Priyanka Gandhi had been altered to add a prominent cross, suggesting she didn’t practice India’s majority religion.

So much has changed in India since the last election in April 2014, when Facebook had just acquired WhatsApp and had about 100 million users in-country. Today, Facebook and WhatsApp are each estimated to have more than 300 million Indian users. Boom Live got its start in 2016, before the terms “fake news” and “deepfakes” had entered the average person’s lexicon. But cheap smartphones and data have led to an unprecedented explosion of Indian internet use since then, says Govindraj Ethiraj, the fact-checking company’s founder and a former TV anchor. With that change have come the usual drawbacks.

Ethiraj says Boom is breaking even but has hit its staffing limit for now. (The company declined to comment on the value of its Facebook contract, citing a nondisclosure agreement.) To shore up revenue, Boom also sells fact-checking training classes and contributes material to a weekly TV show about viral fakes. Working to combat online misinformation today, Ethiraj says, is “like battling a many-headed Hydra while swimming in a tsunami of slime.” He hired Rebelo, a former business journalist who says she wanted to do something “more real,” to run the team two years ago.

The fact-checkers themselves have struggled to cope with the deluge of hatred they encounter online, and share war stories that would sound familiar to the content moderators for Facebook and other services who’ve reported symptoms akin to post-traumatic stress disorder. “I feel the craziness every single day,” says Mohammed Kudrati, 22, who joined Boom’s fact-checking team in January, shortly after receiving his postgraduate diploma in data analytics from the University of Mumbai.

Rebelo fell violently ill after watching a disturbing video of a girl being sexually abused by an older man. When Boom alerted the police, it turned out the man, the girl’s stepfather, had already been arrested. Now Boom staffers append trigger warnings to graphic videos or images before sending them to colleagues for review.

Rebelo says she meditates to “stay sane,” and orders her team to unplug between 11 p.m. and 6 a.m. “Do things you love, cultivate a hobby, play a sport, move, spend time with friends and family, read books, get more sleep,” the 32-year-old deputy editor tells her fact-checkers, most in their 20s. For the time being, that’s about all they can do: that, and the work, cleaning those grains of sand.
