Instagram Algorithms Connect Vast Network of Pedophiles Seeking Child Pornography, According to Researchers

Instagram’s recommendation algorithms have enabled a “vast” network of pedophiles seeking illegal underage sexual content and activity, according to a Wall Street Journal exposé.

In a 2,800-word article published Wednesday, the Journal said it conducted an investigation into child pornography on Instagram in collaboration with researchers at Stanford University and the University of Massachusetts Amherst.

“Pedophiles have long used the internet, but unlike the forums and file-transfer services that cater to people who have interest in illicit content, Instagram doesn’t merely host these activities. Its algorithms promote them,” the Journal reported. “Instagram connects pedophiles and guides them to content sellers via recommendation systems that excel at linking those who share niche interests.”

A Meta spokesperson said the company is “continuously exploring ways to actively defend against this behavior, and we set up an internal task force to investigate these claims and immediately address them.” Meta acknowledged that it had in some cases received reports of child sexual abuse and failed to act on them, citing a software error, since fixed, that prevented the reports from being processed. In addition, “we provided updated guidance to our content reviewers to more easily identify and remove predatory accounts,” the Meta rep said.

“Child exploitation is a horrific crime. We work aggressively to fight it on and off our platforms, and to support law enforcement in its efforts to arrest and prosecute the criminals behind it,” the rep said in a statement. “Predators constantly change their tactics in their pursuit to harm children, and that’s why we have strict policies and technology to prevent them from finding or interacting with teens on our apps, and hire specialist teams who focus on understanding their evolving behaviors so we can eliminate abusive networks.”

Between 2020 and 2022, according to Meta, its policy enforcement teams “dismantled 27 abusive networks,” and in January 2023 the company disabled more than 490,000 accounts for violating child-safety policies. As of the fourth quarter of 2022, Meta’s technology had removed more than 34 million pieces of child sexual exploitation content from Facebook and Instagram, more than 98% of it detected before users reported it.

According to the Journal’s report, “Technical and legal hurdles make determining the full scale of the [pedophile] network hard for anyone outside Meta to measure precisely.” The article cited the Stanford Internet Observatory research team, which identified 405 sellers of what it deemed “self-generated” child-sex material (accounts purportedly run by children themselves) by searching hashtags associated with underage sex. The WSJ story also cited data compiled via the network-mapping software Maltego showing that 112 of those accounts collectively had 22,000 unique followers.
