Instagram Recommends Sexual Videos to Accounts for 13-Year-Olds, Tests Show

Instagram regularly recommends sexual videos to accounts for teenagers that appear interested in racy content, and does so within minutes of when they first log in, according to tests by The Wall Street Journal and an academic researcher.

The tests, run over seven months ending in June, show that the social-media service has continued pushing adult-oriented content to minors after parent Meta Platforms said in January that it was giving teens a more age-appropriate experience by restricting what it calls sensitive content including sexually suggestive material.

Separate testing by the Journal and Laura Edelson, a computer-science professor at Northeastern University, used similar methodology, involving setting up new accounts with ages listed as 13. The accounts then watched Instagram’s curated video stream, known as Reels.

Instagram served a mix of videos that, from the start, included moderately racy content such as women dancing seductively or posing in positions that emphasized their breasts. When the accounts skipped past other clips but watched those racy videos to completion, Reels recommended edgier content.

Adult sex-content creators began appearing in the feeds in as little as three minutes. After less than 20 minutes watching Reels, the test accounts’ feeds were dominated by promotions for such creators, some offering to send nude photos to users who engaged with their posts.

Similar tests on the short-video products of Snapchat and TikTok didn’t produce the same sexualized content for underage users.

“All three platforms also say that there are differences in what content will be recommended to teens,” Edelson said. “But even the adult experience on TikTok appears to have much less explicit content than the teen experience on Reels.”

Meta dismissed the test findings as unrepresentative of teens’ overall experience.

“This was an artificial experiment that doesn’t match the reality of how teens use Instagram,” spokesman Andy Stone said.

Stone said the company’s efforts to prevent its systems from recommending inappropriate content to minors are ongoing. “As part of our long-running work on youth issues, we established an effort to further reduce the volume of sensitive content teens might see on Instagram, and have meaningfully reduced these numbers in the past few months,” he said.

The bulk of the testing by Edelson and the Journal occurred from January through April. An additional test by the Journal in June found continuing problems. Within a half-hour of its creation, a new 13-year-old test account that watched only Instagram-recommended videos featuring women began being served video after video about anal sex.

Self-examination

Internal tests and analysis by Meta employees have identified similar problems, according to current and former staffers and documents viewed by the Journal that are part of previously undisclosed company research into young teens’ harmful experiences on Instagram. In 2021, company safety staff ran tests similar to those of Edelson and the Journal, and came up with comparable results, former employees said.

A separate 2022 internal analysis reviewed by the Journal found that Meta has long known that Instagram shows more pornography, gore and hate speech to young users than to adults. Teens on Instagram reported exposure to bullying, violence and unwanted nudity at rates exceeding older users in company-run surveys, and the company’s statistics confirmed that the platform was disproportionately likely to serve children content that violated platform rules.

Meta safety staff ran tests in 2021 similar to those of the Journal and came up with comparable results, former employees said. - Michaela Vatcheva for The Wall Street Journal

Teens saw three times as many prohibited posts containing nudity, 1.7 times as much violence and 4.1 times as much bullying content as users above the age of 30, according to the 2022 internal analysis. Meta’s automated efforts to prevent such content from being served to teens were often too weak to be effective, the document said.

The most effective way to prevent inappropriate content from being served to underage users would be to build an entirely separate recommendation system for teens, the document suggested. Meta hasn’t pursued that proposal.

Some senior executives, including Instagram head Adam Mosseri, have expressed concern about the disproportionate display of adult content and other violative material to children, according to company staffers.

Instagram head Adam Mosseri appearing before a Senate subcommittee on protecting children online in 2021. - Stefani Reynolds/Bloomberg News

The Reels algorithms work, in part, by detecting users’ interests based on what videos they linger on longer than others and recommending similar content.
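
In rough outline, that kind of watch-time signal can be sketched in a few lines of code. The example below is hypothetical and is not Meta's implementation; the topic labels, function names and scoring rule are assumptions made purely to illustrate how lingering on certain clips can steer what gets recommended next.

    from collections import defaultdict

    def build_interest_profile(watch_log):
        # Aggregate watch time per topic from (topic, seconds_watched) events.
        # Hypothetical sketch: a production system would rely on learned
        # embeddings and many more signals, not hand-labeled topics.
        profile = defaultdict(float)
        for topic, seconds in watch_log:
            profile[topic] += seconds
        total = sum(profile.values()) or 1.0
        return {topic: t / total for topic, t in profile.items()}

    def rank_candidates(candidates, profile):
        # Order candidate videos by how closely their topic matches the profile.
        return sorted(candidates, key=lambda v: profile.get(v["topic"], 0.0), reverse=True)

    # An account that lingers on "dance" clips is shown more of them.
    watch_log = [("comedy", 2), ("dance", 30), ("stunts", 1), ("dance", 25)]
    profile = build_interest_profile(watch_log)
    candidates = [{"id": 1, "topic": "comedy"},
                  {"id": 2, "topic": "dance"},
                  {"id": 3, "topic": "cars"}]
    print(rank_candidates(candidates, profile))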

Meta has policies governing such recommendation systems. Under its guidelines, sexually suggestive content isn’t supposed to be recommended to users of any age unless it is from accounts they have specifically chosen to follow.

According to Meta’s January announcement, teens under 16 aren’t supposed to be shown sexually explicit content at all.

In more than a dozen manual tests run by the Journal and Northeastern’s Edelson, the minor accounts didn’t follow anyone or search for anything, to avoid activity that might influence Instagram’s content recommendations.

As soon as they were established, the accounts started watching Reels. Initial clips featured traditional comedy, cars or stunts, as well as footage of people sustaining injuries. The test accounts scrolled past such videos.

They watched Instagram-recommended videos that featured imagery such as a woman dancing or posing in a manner that could be construed as sexually suggestive. They didn’t like, save or click on any videos.

After a few short sessions, Instagram largely stopped recommending the comedy and stunt videos and fed the test accounts a steady stream of videos in which women pantomimed sex acts, graphically described their anatomy or caressed themselves to music with provocative lyrics.

In one clip that Instagram recommended to a test account identified as 13 years old, an adult performer promised to send a picture of her “chest bags” via direct message to anyone who commented on her video. Another flashed her genitalia at the camera.

Additional sessions on subsequent days produced the same mix of sexual content, alongside ads from major brands.

How TikTok and Snapchat fare

The algorithms behind TikTok and Snapchat’s “Spotlight” video feature operate on principles similar to those of Reels.

They all are supposed to work by filtering through billions of video posts and culling those that flunk certain quality tests, are predicted to contain content deemed “non-recommendable,” or don’t make sense given a user’s language, age or geography. The systems then make personalized recommendations from among the remaining videos based on predicted likelihood that a user will watch.
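
That two-stage pipeline, culling ineligible videos and then ranking the remainder by predicted watch likelihood, can be illustrated with a hypothetical sketch. None of the field names, thresholds or the predict_watch_probability callback come from the platforms; they are stand-ins for illustration only.

    def recommend(videos, user, predict_watch_probability, top_k=10):
        # Stage 1: cull videos that flunk quality checks, are flagged as
        # non-recommendable, or don't fit the user's language or age.
        # (Field names and the 0.5 threshold are assumptions for this sketch.)
        eligible = [
            v for v in videos
            if v["quality_score"] >= 0.5
            and not v["non_recommendable"]
            and v["language"] == user["language"]
            and v["min_age"] <= user["age"]
        ]
        # Stage 2: rank the survivors by predicted likelihood the user watches.
        ranked = sorted(eligible, key=lambda v: predict_watch_probability(user, v),
                        reverse=True)
        return ranked[:top_k]

In a design like this, applying stricter first-stage filters to accounts registered as minors shrinks the pool of recommendable videos, the trade-off the TikTok engineer describes below in accepting more false positives.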

Despite their systems’ similar mechanics, neither TikTok nor Snapchat served freshly created teen accounts the sex-heavy video feeds that Meta did, tests by the Journal and Edelson found.

On TikTok, new test accounts with adult ages that watched racy videos to completion began receiving more of that content. But new teen test accounts that behaved identically virtually never saw such material—even when a test minor account actively searched for, followed and liked videos of adult sex-content creators.

A TikTok engineer credited the differentiation both to stricter content standards for underage users and to a higher tolerance for false positives when restricting recommendations.

TikTok didn’t recommend the sex-heavy video feeds to freshly created teen accounts that Meta did, tests by the Journal found. - Ringo Chiu/Zuma Press

“We err on the side of caution,” a TikTok spokesman said, adding that teens have access to a more limited pool of content.

In Edelson’s tests on Instagram, both the adult and teen test accounts continued to see videos from adult content creators at similar rates, suggesting far less differentiation. In some instances, Instagram recommended that teen accounts watch videos that the platform had already labeled as “disturbing.”

Meta said that, by definition, content labeled “disturbing” and hidden behind a warning screen shouldn’t be shown to children. Stone attributed the result to an error, and said Meta has developed more than 50 tools and resources to help support teens and their parents on its platforms.

Internal debate

Meta officials have debated how to address the amount of age-inappropriate content shown to children. In a meeting late last year, top safety staffers discussed whether it would be sufficient to reduce the frequency of minors’ seeing prohibited content to the level that adults see.

Mosseri, the Instagram head, said at the meeting that reaching parity wouldn’t be adequate given Meta’s public statements regarding teen safety on its platforms, according to people familiar with his comments. That meeting set the stage for Meta’s January announcement that it would automatically default teens into its most restrictive content settings to refrain from recommending sexually suggestive or otherwise problematic content.

The success of that effort remains to be seen.

Senior Meta attorneys, including general counsel Jennifer Newstead, also began in the second half of last year making the case to the company’s top leadership that child-safety issues on Facebook and Instagram posed underappreciated legal and regulatory risks, according to current and former Meta child-safety staffers.

The staffers said that Meta recently has expanded its investments in child-safety work following years of tight budgets, intensifying scrutiny of its efforts by government officials and recommendations from its legal department.

Write to Jeff Horwitz at jeff.horwitz@wsj.com
