Instagram’s unmentionable problem
By Mark Hurst • December 6, 2023

In this week’s Techtonic episode, “What tech is hiding,” I covered recent instances of tech companies being caught doing something they wanted to keep hidden. For example, in a story many people have heard by now, Sports Illustrated was found to be cooking up fake journalists – with fake names, fake bios, and fake headshots – in order to pretend that the AI-generated content posted under their bylines was actual sports journalism. As Futurism (which broke the story) pointed out, S.I. was hardly alone: CNET, Bankrate, Gizmodo, BuzzFeed, and USA Today have all been caught trying to slip AI glop onto their sites. Shameful as the practice is, it’s widespread in online media and likely to increase.

You can listen to the show and explore the episode links below:

• Listen to the “What tech is hiding” episode

• See the playlist with episode links

• Download the podcast

• More Techtonic episodes at techtonic.fm

But I want to reveal more than the embarrassing AI gaffes of online media companies. In the category of “things the tech companies really don’t want to talk about,” there’s one ongoing news story that is especially disturbing. It’s a problem at Facebook and Instagram that their parent company, Meta, would undoubtedly prefer to remain unmentioned.

“Meta Is Struggling to Boot Pedophiles Off Facebook and Instagram,” write Jeff Horwitz and Katherine Blunt in the Wall Street Journal (Dec. 1, 2023). The platforms’ “algorithms continue to promote problematic content” by connecting “a web of accounts devoted to the creation, purchasing and trading of underage-sex content.”

According to research by the Canadian Centre for Child Protection and the Wall Street Journal’s own team:

[A] network of Instagram accounts with as many as 10 million followers each has continued to livestream videos of child sex abuse months after it was reported to the company. Facebook’s algorithms have helped build large Facebook Groups devoted to trading child sexual abuse content.

The Facebook algorithm, which recommends groups for people to join, was in the news a couple of years ago when it encouraged users to join racist hate groups. Now this. The Journal article describes a public Facebook group “celebrating incest.” I won’t repeat the details – they’re sickening, and you can read them in the article. What’s pertinent here is Facebook’s response to complaints about groups like this:

When a Journal research account flagged many such groups via user reports, the company often declared them to be acceptable. “We’ve taken a look and found that the group doesn’t go against our Community Standards,” Facebook replied to a report about a large Facebook group named “Incest.”

Instagram, too, works to connect pedophilic accounts. One researcher found a network of Instagram accounts “livestreaming videos of child sex abuse” and reported the activity to Instagram months ago. The result:

more than five months after the network was reported to Meta, accounts affiliated with the network continue to regularly broadcast a mixture of adult pornography, child sexual abuse and bestiality videos, according to separate research by the Canadian Centre for Child Protection.

“We often wonder, ‘Why is it that we can, within minutes, find these massive networks?’” said Lianna McDonald, the center’s executive director. “Why is no one over there dealing with this?”

None of this is new to Mark Zuckerberg and his team. There’s the story (mentioned above) from December 1; before that, a story about pedophilia on Instagram on November 27; and before that, a story on the exact same problem way back on June 7. Leadership at Meta has had no shortage of alerts about what its algorithms are promoting. But apart from taking down some individual accounts, the problems continue unabated on Facebook and Instagram. To echo Lianna McDonald: Why?

The answer, of course, is money. Near the end of the Journal article is this sentence:

Meta employs outside contractors to help moderate content.

In a way, it’s the same problem faced by the media companies above: algorithms drive costs down, while humans drive costs up. A team of trained, full-time content moderators could knock out the pedophilia problem in weeks or even days, but it would cost Zuckerberg some precious margin. The stock price might take a dip. Better to lay off child-safety specialists (which Zuckerberg did) and outsource the problem to contractors in Mumbai (which he also did).

Spokespeople at Meta have issued statements saying the company is working on a fix at the “highest levels” – much as Zuckerberg has pledged (how many times now?) that he wants his company to do better. Meanwhile, the problem continues, and Meta keeps its margins up.

As I said on the show on Monday, I honestly don’t understand how Meta is allowed to stay in business. Compare it with Juul, a company that marketed addictive vape sticks to teenagers: you can read my Juul article roundup and see how the company was all but sued out of existence. Now look at Meta: Instagram creates an addictive product marketed to teens; the company knowingly (and illegally) allows kids under 13 onto the service (see Natasha Singer’s NYT story from Nov 25); and it permits networks of pedophilic accounts to continue operating – all this on top of Facebook’s history of allowing human trafficking.

I wish the internet had turned out better than this. But this is the internet we have, and the first step toward a fix is to put this “unmentionable” problem into the spotlight. Listen to the Techtonic episode. Share this column. Spread the word. Mark Zuckerberg has allowed the worst possible use of the internet to proliferate on his services, all in the name of protecting a stock price. It has to stop.

Until next time,

-mark

Mark Hurst, founder, Creative Good – see our services or join as a member
Email: mark@creativegood.com
Listen to my podcast/radio show: techtonic.fm
Subscribe to my email newsletter
Sign up for my to-do list with privacy built in, Good Todo
On Mastodon: @markhurst@mastodon.social

- - -