Thursday, June 8, 2023

Instagram Algorithms Connect ‘Vast’ Network of Pedophiles Seeking Child Pornography, According to Researchers - Variety

Instagram’s recommendation algorithms have enabled a “vast” network of pedophiles seeking illegal underage sexual content and activity, according to a Wall Street Journal exposé.

In a 2,800-word article published Wednesday, the Journal said it conducted an investigation into child pornography on Meta-owned Instagram in collaboration with researchers at Stanford and the University of Massachusetts Amherst.

“Pedophiles have long used the internet, but unlike the forums and file-transfer services that cater to people who have interest in illicit content, Instagram doesn’t merely host these activities. Its algorithms promote them,” the Journal reported. “Instagram connects pedophiles and guides them to content sellers via recommendation systems that excel at linking those who share niche interests.”

A Meta spokesperson said the company is “continuously exploring ways to actively defend against this behavior, and we set up an internal task force to investigate these claims and immediately address them.” Meta acknowledged that in some cases it had received reports of child sexual abuse and failed to act on them, citing a software error that prevented the reports from being processed (an error Meta said has since been fixed). In addition, “we provided updated guidance to our content reviewers to more easily identify and remove predatory accounts,” the Meta rep said.

“Child exploitation is a horrific crime. We work aggressively to fight it on and off our platforms, and to support law enforcement in its efforts to arrest and prosecute the criminals behind it,” the rep said in a statement. “Predators constantly change their tactics in their pursuit to harm children, and that’s why we have strict policies and technology to prevent them from finding or interacting with teens on our apps and hire specialist teams who focus on understanding their evolving behaviors so we can eliminate abusive networks.”

Between 2020 and 2022, according to Meta, its policy enforcement teams “dismantled 27 abusive networks,” and in January 2023 the company disabled more than 490,000 accounts for violating child-safety policies. In the fourth quarter of 2022, Meta’s technology removed more than 34 million pieces of child sexual exploitation content from Facebook and Instagram, more than 98% of which was detected before it was reported by users, the company said.

According to the Journal’s report, “Technical and legal hurdles make determining the full scale of the [pedophile] network [on Instagram] hard for anyone outside Meta to measure precisely.” The article cited the Stanford Internet Observatory research team’s identification of 405 sellers of what the researchers deemed “self-generated” child-sex material (accounts purportedly run by children themselves) using hashtags associated with underage sex. The WSJ story also cited data compiled via network mapping software Maltego that found 112 of those accounts collectively had 22,000 unique followers.

The Journal’s report noted that Instagram accounts that offer to sell illicit sex material “generally don’t publish it openly” and that such accounts often link to “off-platform content trading sites.”

Researchers found that Instagram enabled people to search “explicit hashtags such as #pedowhore and #preteensex” and then connected them to accounts that used the terms to advertise child-sex material for sale, according to the report. Per the Journal, test accounts set up by researchers that viewed a single such account “were immediately hit with ‘suggested for you’ recommendations of purported child-sex-content sellers and buyers, as well as accounts linking to off-platform content trading sites. Following just a handful of these recommendations was enough to flood a test account with content that sexualizes children.”

In addition, certain Instagram accounts “invite buyers to commission specific acts,” with some “menus” listing prices for videos of children harming themselves or “imagery of the minor performing sexual acts with animals,” according to the Journal report, citing the findings of Stanford Internet Observatory researchers.

“At the right price, children are available for in-person ‘meet ups,’” the Journal reported.

Among other internet platforms, Snapchat and TikTok do not appear to facilitate networks of pedophiles seeking child-abuse content in the way Instagram has, according to the Journal report. On Twitter, the Stanford Internet Observatory team identified 128 accounts offering to sell child-sex-abuse content; according to the researchers, Twitter didn’t recommend such accounts to the same degree as Instagram did, and it removed them “far more quickly,” the Journal reported. (An emailed request for comment to Twitter’s press account resulted in an autoreply with a poop emoji.)
