Engadget has been testing and reviewing consumer tech since 2004. Our stories may include affiliate links; if you buy something through a link, we may earn a commission. Read more about how we evaluate products.

Facebook's misinformation and violence problems are worse in India

Facebook says its corrective efforts are a work in progress.

MANJUNATH KIRAN/AFP via Getty Images

Facebook whistleblower Frances Haugen's disclosures suggest the company's problems with extremism are particularly dire in some regions. Documents Haugen provided to The New York Times, The Wall Street Journal and other outlets suggest Facebook is aware it fostered severe misinformation and violence in India. The social network apparently didn't have nearly enough resources to deal with the spread of harmful material in the populous country, and didn't respond with enough action when tensions flared.

A case study from early 2021 indicated that much of the harmful content from groups like Rashtriya Swayamsevak Sangh and Bajrang Dal wasn't flagged on Facebook or WhatsApp because the company lacked the technical capability to spot content written in Bengali and Hindi. At the same time, Facebook reportedly declined to mark the RSS for removal due to "political sensitivities," and Bajrang Dal (linked to Prime Minister Modi's party) hadn't been touched despite an internal Facebook call to take down its material. The company also maintained a whitelist of politicians exempt from fact-checking.

Facebook was struggling to fight hate speech as recently as five months ago, according to the internal data. And like an earlier test in the US, the research showed just how quickly Facebook's recommendation engine suggested toxic content. A dummy account following Facebook's recommendations for three weeks was subjected to a "near constant barrage" of divisive nationalism, misinformation and violence.


As with earlier scoops, Facebook said the documents didn't tell the whole story. Spokesman Andy Stone argued the data was incomplete and didn't account for third-party fact checkers used heavily outside the US. He added that Facebook had invested heavily in hate speech detection technology in languages like Bengali and Hindi, and that the company was continuing to improve that tech.

The company followed this by posting a lengthier defense of its practices. It argued that it had an "industry-leading process" for reviewing and prioritizing countries with a high risk of violence every six months. It noted that teams considered long-term issues and history alongside current events and dependence on its apps. The company added it was engaging with local communities, improving technology and continuously "refining" policies.

The response didn't directly address some of the concerns, however. India is Facebook's largest individual market, with 340 million people using its services, yet 87 percent of Facebook's misinformation budget is focused on the US. Even with third-party fact checkers at work, that suggests India isn't getting a proportionate amount of attention. Facebook also didn't follow up on worries that it was tiptoeing around certain people and groups beyond a previous statement that it enforced its policies without regard for position or association. In other words, it's not clear Facebook's problems with misinformation and violence will improve in the near future.

Update 10/25: This story was modified after publishing to give more background on the origin and nature of the internal Facebook documents.