Twitter has 'more analysis to do' after algorithm shows possible racial bias

Jon Fingas
·Associate Editor
·1-min read

Twitter is learning first-hand about the challenges of eliminating racial bias in algorithms. The social network’s Liz Kelley said the company had “more analysis” to do after cryptographic engineer Tony Arcieri conducted an experiment suggesting Twitter’s photo-cropping algorithm was biased in which faces it prioritized. When he attached photos of Barack Obama and Mitch McConnell to tweets, Twitter’s preview seemed to exclusively highlight McConnell’s face — Obama only popped up when Arcieri inverted the photos’ colors, making skin color a non-issue.

Others tried reversing the order of the photos and names to no avail. A higher-contrast smile did work, Intertheory’s Kim Sherrell found. Scientist Matt Blaze, meanwhile, noticed that the prioritization seemed to vary depending on which official Twitter app was used — TweetDeck was more neutral, for instance.

Kelley said that Twitter had checked for bias before deploying the current algorithm, but “didn’t find evidence” at the time. She added that Twitter would open-source its algorithm studies so that others could “review and replicate” them.

There’s no guarantee that Twitter can correct this. However, the experiment does show the very real dangers of algorithmic bias regardless of intent. It could shove people out of the limelight, even if they’re central to a social media post or linked news article. You might have to wait a long while before issues like this are exceptionally rare.