Instagram has looked deep into my soul – and I really don’t like what it has found there

Photograph: bombuscreative/Getty Images

You are worried about what social media giants know about you, because of course you are. Data is gold; algorithms, also gold. If you’re not paying for the product, you are the product. One day soon, all this will be robots. The connection between those last three statements isn’t logical: they just all sit in the same bag marked “the future”. So when I discovered the pocket of Instagram where you can find out what it thinks you’re interested in (on the app, you’ll find it under Settings > Security > Access data > Ads), I obviously felt it my duty as a netizen to see what dark insights it had into my private soul.

Here goes: jewellery; luxury goods; electronic music; love; emotions; fashion design; crafts. I mean: no offence, Kraftwerk (and loved ones), but I could not name seven things I am less interested in. Maybe oxbow lakes.

Detailed research has gone into the racist and sexist underpinnings of a lot of predictive software. In software that was used by the US criminal justice system, baked-in racial bias led to faulty predictions of the likelihood of black prisoners reoffending. Image recognition software places women in the kitchen holding mops, men on the sports field doing … I don’t know, whatever men do there. This eventually filtered down to our government’s public health messaging; it produced a Covid “stay at home” poster in which women do literally everything and men sit on sofas, which was swiftly withdrawn. It is possible that was dreamt up without any algorithms. Sexism did exist before the internet, I have to keep reminding myself.


I cannot, of course, say for certain that Insta has sexist algorithms. The fact that it has misidentified so many non-interests, yet totally failed to spot the many, many hours I spend on there watching videos of American pit bulls in hats, could be just an oversight. But it does set me thinking about what bias does to tailored marketing: it snarls it up. You can put the full force of your mighty tech brain into mining every imaginable piece of data, only to waste it on a wrong assumption. You’d think some wunderkind somewhere would wake up.