
Blaze News investigates: Is social media really a 'breeding ground for predators' — or are we worrying too much?

Shocking accusations have been made surrounding Meta’s content directed at minors.

It’s an open question whether social media, which has created the most expansive public square in history, has been a good thing for humanity. On the one hand, it has allowed companies and organizations an easy way to get their messages out, and it has also made it easier than ever for like-minded people to connect and create social communities that span the globe.

These benefits, however, have been tempered by serious negative effects. Scams, hustles, screen addiction, partisan censorship, and other evils have at various times infested the medium. One problem in particular has potentially gotten worse as a result of social media: child exploitation.

Over the last two years, a number of state governments opened investigations into social media companies in response to widespread reports of inappropriate content and even sexual propositions being pushed on children’s accounts.


One of the first of these investigations was conducted by the New Mexico attorney general’s office in late 2023 and sought to determine how Meta’s platforms (Facebook and Instagram) fed content to kids.

Using test profiles that purported to be children (both teens and preteens), the state office found that the accounts were exposed to pornography and targeted by online predators.

According to reporting by The Verge, an account purporting to be a 13-year-old girl managed to attract 6,700 followers on Instagram, most of them adult males. About three or four times per week, the account would receive messages that included “pictures and videos of genitalia, including exposed penises.”

The outlet stated that the researchers behind the “dummy” account reported many of the predator accounts to Meta, but the company determined they had not violated its community standards.

As a result of this investigation, the New Mexico attorney general sued Meta, saying “platforms Facebook and Instagram are a breeding ground for predators who target children for human trafficking, the distribution of sexual images, grooming, and solicitation.”

“Teens and preteens can easily register for unrestricted accounts because of a lack of age verification. When they do, Meta directs harmful and inappropriate material at them,” the lawsuit also stated.

A Meta spokesperson responded by suggesting, contrary to the evidence, that the company’s technology and tools were sufficient to find and expose predators on the platform.

“We use sophisticated technology, hire child safety experts, report content to the National Center for Missing and Exploited Children, and share information and tools with other companies and law enforcement, including state attorneys general, to help root out predators,” Meta said.

Just days later, Katherine Blunt of the Wall Street Journal released a report on her own months-long investigation into similar activity on Meta’s platforms. She shared her findings on Marketplace Tech and noted how quickly her test accounts were shown explicit content.

Blunt said the speed at which Instagram’s algorithm began recommending sex content related to children as well as adults was “quite surprising.”

She also noted that the sexually explicit content was monetized.

Groups with obviously disturbing names like “Little girls” and “Beautiful boys” were reportedly recommended to Blunt’s dummy accounts by Facebook’s algorithms. The researchers even found a group labeled “Incest.” When they reported the group to Meta, the social media giant allegedly said it did not violate community standards.

According to Blunt, Meta claimed that it organized an internal task force to focus on these problems and manually remove “problematic” accounts at scale. As it did in response to the New Mexico investigation, the company again claimed it had been using technology to address the problem, such as gauging how often a user views specific groups or engages with child accounts. Meta also said it had been removing hashtags related to pedophilia.

Meta’s struggles to control predators, however, have been complicated by an entirely different problem: parents who are profiting off sexual content featuring their own children.

Troublingly, according to a February 2024 report from the Wall Street Journal, Meta has knowingly allowed parents to profit from this despicable practice.

According to the report, certain “parent-managed minor accounts” allegedly sold material to audiences of adult men, including photos of the children in revealing attire, exclusive chat sessions, and even the children’s used clothing, such as leotards and cheerleading outfits.

The report went on to claim that Meta staff were aware that these parents engaged in sexual conversations about their own children and sometimes even made the children respond to sexual messages sent by subscribers.

The New York Times conducted its own investigation into the parent-run accounts, noting that they can earn up to $3,000 for a single post. The report said that branded posts on Instagram get a boost from the platform’s algorithm, which only amplifies the accounts to more predators.

The reports were so damning to the social media giant that Democratic Senator Maggie Hassan sent letters to TikTok, X, and Meta demanding to know whether the platforms monetize young girls’ accounts and whether they were aware of children circumventing the age requirements. Meta responded to the New York Times about the allegations, saying the company “prevent[s] accounts exhibiting potentially suspicious behavior from using our monetization tools, and we plan to limit such accounts from accessing subscription content.”

Perhaps not surprisingly, Meta’s reassurances did little to assuage public fears, and investigations into online child predation continued. It didn’t take long for researchers to find that, in spite of Meta’s promises, not much had changed.

The Wall Street Journal conducted another investigation to test Meta’s response, this time in a joint effort with Northeastern University professor Laura Edelson. The investigation once again set up new test accounts posing as 13-year-olds.

Their research found that Instagram began pushing sexual content from adults to the young teenage accounts within the first three to four minutes. It took just another 20 minutes for the algorithm to populate the feed with promotions from explicit adult creators, some of whom sold nude photos.

The outlet reported that similar tests on other platforms like Snapchat and TikTok did not yield the same results for an underage account.

“Even the adult experience on TikTok appears to have much less explicit content than the teen experience on [Instagram],” Edelson reported.

In response, Meta defended itself more strenuously than it had against previous accusations.

“This was an artificial experiment that doesn’t match the reality of how teens use Instagram,” spokesman Andy Stone claimed.

“As part of our long-running work on youth issues, we established an effort to further reduce the volume of sensitive content teens might see on Instagram, and have meaningfully reduced these numbers in the past few months,” he stated.

Tech policy expert and child safety advocate Rick Lane told Blaze News that companies like Meta, Reddit, and Google have known about these dangers since their founding, and that the problems are only getting worse.

Government intervention

In March 2023, before most of these reports surfaced, Utah banned social media for those under 18 without parental permission.

The Utah Social Media Regulation Act implemented the following restrictions: age verification for social media, a ban on ads targeting minors, and a social media curfew making the sites off-limits to minors from 10:30 p.m. to 6:30 a.m.

Under the legislation, social networks were also required to give parents access to their teens’ accounts, CNN reported.

In March 2024, Florida followed suit, banning social media for those under 14 and requiring 14- and 15-year-olds to obtain parental permission to use the platforms. Determining the appropriate government response to the ongoing threat remains a challenge. Many observers believe it is time to impose absolute age limits on social media, much as the government does for nicotine or alcohol.

“There are always going to be bad actors and predators on these social media sites, but Meta has been pushing sexualized content to all their users for some time,” Return’s Peter Gietl explained.

“Instagram seems to be devolving into soft-core porn for a lot of users. I would absolutely be in favor of a minimum age requirement, not only because of the sexualized content but also because of the extremely deleterious effect it has on the mental health of preteens and teenagers,” he added.

BlazeTV contributor and mother Sara Gonzales said emphatically that her children do not have access to social media. She called the dangers of these platforms “obvious” and said the platforms “absolutely do not” do enough to keep sexual content from reaching children.

“The real problem in this country is that parents so willingly hand their young, impressionable children access to such dangerous content for the sake of convenience,” she added.

Lane, however, said that despite the reality of child sexual exploitation online, he doesn’t think banning children from social networks is the solution. At the same time, he called for sites to include age-appropriate features and safety mechanisms from their inception.

So long as children are being exploited and groomed online, calls for action on age restrictions will not end.

It is time for a national discussion surrounding a cost/benefit analysis of children being on social media. Given the sheer number of problems it has caused and the fact that previous generations survived just fine without it, it is hard to imagine any plausible benefit it offers our youth.

Andrew Chapados

Andrew Chapados is a writer focusing on sports, culture, entertainment, gaming, and U.S. politics. The podcaster and former radio-broadcaster also served in the Canadian Armed Forces, which he confirms actually does exist.
@andrewsaystv