Swifties stepped in to bury explicit Taylor Swift AI images as tech companies struggled to respond

Good morning, Broadsheet readers! The Supreme Court will hear arguments about abortion pill mifepristone in late March, MacKenzie Scott sells a quarter of her Amazon shares, and Taylor Swift AI deepfakes are a warning sign. Have a thoughtful Tuesday.

- Scary sign. Last week, AI-generated sexually explicit images of Taylor Swift went viral. One was reportedly seen as many as 45 million times before X banned most searches for "Taylor Swift" on the platform over the weekend.

The ordeal was a scary sign of the technological risks that lie ahead for women—and how ill-prepared tech companies are to address them.

Deepfake porn, in which AI generates nonconsensual nude or pornographic photos and videos of real people, is a growing problem that undermines the consent and bodily autonomy of its victims. Victims aren't just celebrities; high school boys created and circulated nonconsensual imagery of female classmates in October. Recent advances in consumer AI platforms have made the longstanding problem even more prevalent.

According to the news outlet 404 Media, 4chan users made nonconsensual celebrity nude images, including the ones of Swift, using the Microsoft-owned AI text-to-image platform Designer. While Microsoft's terms of service prohibit nonconsensual nude images, users relied on loopholes, including one that allowed those images to be generated when celebrities' names were slightly misspelled. Microsoft has since eliminated the loophole, 404 Media reported.

Microsoft CEO Satya Nadella told NBC in an interview on Friday that it is the company's responsibility to "move fast" on fixing the "alarming" images and the issues they raise. The subject even reached the White House, where press secretary Karine Jean-Pierre said in a briefing on Friday that social media companies must enforce their own rules to prevent the spread of deepfakes. The Biden administration "know[s] that lax enforcement disproportionately impacts women...and girls, who are the overwhelming targets of online harassment," she said.

The solutions companies have presented so far, however, are far from perfect. X, which took 19 hours to suspend the accounts that posted the images, eventually blocked searches for Swift's name. That kind of fix penalizes the victim, suppressing their genuine work and profile as platforms scramble to quell the spread of deepfakes.

Swift's massive fan base helped bury the AI images. Swifties reported them en masse and overwhelmed search terms with normal photos of the pop star, resources that the average victim—whether a lesser-known celebrity or a normal, non-famous person—doesn't have.

That this happened to one of the world's most powerful women is terrifying; where does that leave the rest of us? At the same time, the awful Swift images finally got leaders in business and government—those with the power to prevent this from happening to other women—to pay attention.

Emma Hinchliffe
emma.hinchliffe@fortune.com
@_emmahinchliffe

The Broadsheet is Fortune's newsletter for and about the world's most powerful women. Today's edition was curated by Joseph Abrams. Subscribe here.
