Teens on Instagram are regularly recommended sexual and explicit videos, new report finds

A new Wall Street Journal investigation shows that Instagram's recommendation algorithm disproportionately serves explicit content to minors.

Jun 20, 2024 - 18:15
[Image: A person sitting in a dark room on their phone, with a pattern of the Instagram logo reflected in mirrors below them.]

Young Instagram users are more easily recommended sexually explicit and harmful videos than the platform lets on, according to a new report.

The child safety findings are based on two experiments conducted by the Wall Street Journal and Northeastern University computer science professor Laura Edelson. Over a period of seven months, the publication set up new accounts registered as minors, which then scrolled through Instagram's video Reels feed, skipping over "normal" content and lingering on more "racy" adult videos. After only 20 minutes of scrolling, the accounts were flooded with promotions for "adult sex-content creators" and offers of nude photos.

Instagram accounts registered to minors are automatically assigned the platform's strictest content control settings.

The Journal's tests replicate those conducted by former company safety staffers in 2021, which found that the site's universal recommendation system limited the effectiveness of child safety measures. Internal documents from 2022 show that Meta knew its algorithm was recommending "more pornography, gore, and hate speech to young users than to adults," the Wall Street Journal reports.

"This was an artificial experiment that doesn’t match the reality of how teens use Instagram," Meta spokesperson Andy Stone told the publication. "As part of our long-running work on youth issues, we established an effort to further reduce the volume of sensitive content teens might see on Instagram, and have meaningfully reduced these numbers in the past few months."

Similar tests run on video-oriented platforms like TikTok and Snapchat did not yield the same recommendation results.

The new findings follow a November report that found Instagram's Reels algorithm was recommending sexually explicit videos to adult users who followed only child accounts.

A February investigation, also by the Wall Street Journal, revealed that Meta staffers had warned the company about the continued presence of exploitative parents and other adult account holders on Instagram who were finding ways to profit from images of children online. The report noted the rise of "Momfluencers" engaging in sexual banter with followers and selling subscriptions to view suggestive content of their children, such as dancing or modeling in bikinis.

Advocates and regulatory bodies have trained their sights on social media's role in online child exploitation. Meta itself has been sued multiple times over its alleged role in child exploitation, including a December lawsuit that accused the company of creating a "marketplace for predators." Following the creation of its child safety task force in 2023, Meta launched a series of new safety tools, including anti-harassment controls and the "strictest" content control settings currently available.

Meanwhile, Meta competitor X recently overhauled its adult content policy, allowing users to post "produced and distributed adult nudity or sexual behavior, provided it's properly labeled and not prominently displayed." The platform has stated that account holders under the age of 18 will be blocked from seeing such content, as long as it's labeled with a content warning. But X does not outline any consequences for accounts that post unlabeled adult content.
