Facebook executive says tech firms need stronger regulation

Photograph: Chris Delmas/AFP/Getty Images

The tech industry “needs regulation” because it should not be left to make the rules on issues including harmful online content on its own, a Facebook executive has said.

Monika Bickert, Facebook’s vice-president of content policy, believes that “government regulation can establish standards all companies should meet”.

Her comments come as tech companies and some of their staunchest critics head to parliament this week to talk about new rules to deal with harmful content online.

Among those who will testify before MPs and peers is Frances Haugen, a former project manager at Facebook who leaked tens of thousands of internal documents.



The documents include allegations that the social media juggernaut knew its products were damaging teenagers’ mental health and were instigating ethnic violence in countries such as Ethiopia.

They also allege that Facebook employees repeatedly flagged concerns before and after the 2020 US presidential election, as Donald Trump sought to overturn Joe Biden’s victory with false claims of fraud. According to the New York Times, a company data scientist told co-workers a week after the election that 10% of all US views of political content were of posts that falsely claimed the vote was fraudulent.

Writing in the Sunday Telegraph, Bickert stated: “While there will no doubt be differing views, we should all agree on one thing: the tech industry needs regulation.

“At Facebook we’ve advocated for democratic governments to set new rules for the internet on areas like harmful content, privacy, data, and elections, because we believe that businesses like ours should not be making these decisions on our own.

“The UK is one of the countries leading the way with wide-ranging proposals on everything from hate speech to child safety and, while we won’t agree with all the details, we’re pleased the online safety bill is moving forward.”

The culture secretary, Nadine Dorries, has said that online hate has “poisoned public life” and that the government has been spurred to re-examine its upcoming online safety bill in the light of the death of the MP Sir David Amess in his constituency.

Dorries has said that a crackdown on online abuse might not have prevented Amess’s death, but that it had highlighted the threats faced by people in the public eye.

Calls have been made for social media companies to hand over data more quickly and to remove hateful content themselves, and for the bill to force platforms to stop amplifying such content via their algorithms.


Bickert wrote in the newspaper that “once parliament passes the online safety bill, Ofcom will ensure all tech companies are held to account”.

She suggested that “companies should also be judged on how their rules are enforced”.

Facebook has been publishing figures on how it deals with harmful content, including how much of it is seen and taken down, for the past three years. The firm is also independently audited.

Bickert wrote: “I spent more than a decade as a criminal prosecutor in the US before joining Facebook, and for the past nine years I’ve helped our company develop its rules on what is and isn’t allowed on our platforms.

“These policies seek to protect people from harm while also protecting freedom of expression.

“Our team includes former prosecutors, law enforcement officers, counter-terrorism specialists, teachers and child safety advocates, and we work with hundreds of independent experts around the world to help us get the balance right.

“While people often disagree about exactly where to draw the line, government regulation can establish standards all companies should meet.”

She said that Facebook has a commercial incentive to remove harmful content from its sites because “people don’t want to see it when they use our apps and advertisers don’t want their ads next to it”.

The prevalence of hate speech on Facebook has been found to be about five views per 10,000 views of content, a figure that has fallen as detection has improved.

Bickert stated: “Of the hate speech we removed, we found 97% before anyone reported it to us – up from just 23% a few years ago. While we have further to go, the enforcement reports show that we are making progress.”

Earlier this week, a report found that an international pressure group spreading false and conspiratorial claims about Covid-19 more than doubled its average number of Facebook interactions in the first six months of 2021.

Pages owned by the World Doctors Alliance, a group of current and former medical professionals and academics from seven countries, received 617,000 interactions in June 2021, up from 255,000 in January, the Institute for Strategic Dialogue found.

The World Doctors Alliance includes prominent members who have falsely claimed that Covid-19 is a hoax and vaccines cause widespread harm.