Instagram becomes hub for paedophiles to buy, commission underage sexual content: Report
Instagram has become a gateway to underage-sex content, with the platform being used to connect buyers and sellers and to promote a wide variety of such content, investigations by The Wall Street Journal (WSJ) and researchers at Stanford University and the University of Massachusetts Amherst have found.
The popular social media site owned by Meta Platforms is used to commission and purchase underage-sex content, said a report by The Wall Street Journal.
The report said that Instagram doesn't merely host these activities, but its algorithms also promote them.
"Instagram connects paedophiles and guides them to content sellers via recommendation systems that excel at linking those who share niche interests," the Journal and the academic researchers found.
Researchers found that Instagram enabled people to search explicit hashtags and connected them to accounts that used the terms to advertise child-sex material for sale, the WSJ report said.
Such accounts often claim to be run by the children themselves and use overtly sexual handles.
Instagram accounts offering to sell illicit sex material generally don't publish it openly, instead posting "menus" of content, the report said.
Certain accounts invite buyers to commission specific acts.
Some menus include prices for videos of children harming themselves and "imagery of the minor performing sexual acts with animals," researchers at the Stanford Internet Observatory found.
Even meet-ups with children can be bought for the right price, the report said.
The promotion of underage-sex content violates rules established by Meta as well as federal law.
Meta has acknowledged problems within its enforcement operations and said it has set up an internal task force to address the issues raised. "Child exploitation is a horrific crime," the company said, adding, "We're continuously investigating ways to actively defend against this behaviour."
Meta said it has in the past two years taken down 27 paedophile networks and is planning more removals. In response to queries from the WSJ, the platform said it had blocked thousands of hashtags that sexualise children, some with millions of posts, and restricted its systems from recommending that users search for terms known to be associated with sex abuse.
Alex Stamos, the head of the Stanford Internet Observatory and Meta's chief security officer until 2018, told the WSJ that getting even obvious abuse under control would likely take a sustained effort.
"That a team of three academics with limited access could find such a huge network should set off alarms at Meta," he said, noting that the company has far more effective tools to map its paedophile network than outsiders do. "I hope the company reinvests in human investigators," he added.
Using different methods, the Stanford and UMass researchers were each able to quickly identify large-scale communities promoting criminal sex abuse.
Test accounts set up by researchers that viewed a single account in the network were immediately hit with "suggested for you" recommendations of purported child-sex-content sellers and buyers, as well as accounts linking to off-platform content trading sites.
"Following just a handful of these recommendations was enough to flood a test account with content that sexualizes children," the report said.
"Instagram is an on-ramp to places on the internet where there's more explicit child sexual abuse," Brian Levine, director of the UMass Rescue Lab, which researches online child victimization and builds forensic tools to combat it, told the WSJ.
Instagram, estimated to have more than 1.3 billion users, is especially popular with teens. The Stanford researchers found some similar sexually exploitative activity on other, smaller social platforms, but said the problem on Instagram is particularly severe.