Facebook says it is taking down more material about ISIS, al-Qaeda

FILE PHOTO: The Facebook application is seen on a phone screen August 3, 2017. REUTERS/Thomas White/File Photo

By David Ingram

SAN FRANCISCO (Reuters) - Facebook Inc said on Monday that it removed or put a warning label on 1.9 million pieces of extremist content related to ISIS or al-Qaeda in the first three months of the year, about double the amount from the previous quarter.

Facebook, the world's largest social media network, also published its internal definition of "terrorism" for the first time, as part of an effort to be more open about internal company operations.

The European Union has been putting pressure on Facebook and its tech industry competitors to remove extremist content more rapidly or face legislation forcing them to do so, and the sector has increased efforts to demonstrate progress.

Of the 1.9 million pieces of extremist content, the "vast majority" was removed and a small portion received a warning label because it was shared for informational or counter-extremist purposes, Facebook said in a post on a corporate blog (https://newsroom.fb.com/news/2018/04/keeping-terrorists-off-facebook).

Facebook uses automated software such as image matching to detect some extremist material. The median time required for takedowns was less than one minute in the first quarter of the year, the company said.

Facebook, which bans terrorists from its network, has not previously said what its definition encompasses.

The company said it defines terrorism as: "Any non-governmental organization that engages in premeditated acts of violence against persons or property to intimidate a civilian population, government, or international organization in order to achieve a political, religious, or ideological aim."

The definition is "agnostic to ideology," the company said, including such varied groups as religious extremists, white supremacists and militant environmentalists.

(Reporting by David Ingram; Editing by Leslie Adler)