Snapchat and content collapse
By Mark Hurst • June 25, 2020
Snap, the social media company, got in a bit of trouble last week on Juneteenth. In an attempt to join in the national commemoration of the end of American slavery, the Snapchat app launched a new image filter. Even if you don't use Snapchat, you've probably seen its filters, like the one that superimposes dog ears and a snout on the user's face. There are lots of these filters, and occasionally Snapchat launches new ones to coincide with a timely event.
Thus in honor of Juneteenth last week, Snapchat proudly unveiled its new filter, which instructed the user to "SMILE" - after which . . . well, if you haven't seen it, watch this five-second video from journalist Mark Luckie (or this video from Twitter user _BodyBySunny).
As shown in the video, Luckie follows the instruction and smiles, activating a cartoon animation of two metal chains breaking in the background. Little bits of chain go flying through the air.
Get it? Snapchat offers the zany fun of a puppy-dog filter, except applied to America's original sin. The chains of slavery, updated for social media as a bouncy cartoon, and activated by a smile. What could go wrong?
You probably heard. In response to the instant, overwhelming, and completely unsurprising flood of negative feedback, Snap removed the filter, claiming it was due to the simple mistake of the filter not having gone "through our review process" (CNBC). This is frankly hard to believe, given how many people at any Big Tech company usually have to sign off before a new feature goes live. But if it's true, Snap looks even worse, since Snapchat has gotten in trouble for racist filters in the past - twice.
Back in 2016, Snapchat launched a blackface filter that "celebrated" Bob Marley. On 4/20. You know, just your normal Big Tech initiative, helping kids celebrate recreational drug use on "Weed Day" by applying blackface. (A large percentage of American teenagers are on Snapchat, per Pew Research.) And then a few months later, Snapchat launched an Asian yellowface filter, which a spokesperson assured everyone was "meant to be playful and never to offend" (Guardian).
In its response to last week's Juneteenth crisis, Snap never mentioned those past mistakes. Just that the culprit was the "review process." Then two days later, Snap's diversity chief reversed that story, claiming that the Snap team had, in fact, reviewed the filter, but that they "reviewed the Lens from the standpoint of Black creative content, made by and for Black people, so did not adequately consider how it would look when used by non-Black members of our community" (Verge). In other words, Snap's mistake was - if I'm reading this right - not considering that White people might use the filter? Uhh . . . I'd say, again, watch the videos by Mark Luckie and Sunny, and draw your own conclusions.
I can't make sense of the diversity chief's message, but she certainly doesn't touch what I think is the key issue. Namely, what's missing from Snap's response is any discussion of whether it's appropriate for Big Tech companies to comment on slavery with cartoon animations. While I'll grant, as I've written before, that I'm not an authority on racism in tech, the Juneteenth filter is a perfect example of an insidious problem with social media, something Nicholas Carr calls "content collapse." Put simply, as Big Tech flattens our world into digital gloss, we lose our connection with reality.
As Carr wrote earlier this year:
By leveling everything, social media also trivializes everything - freed of barriers, information, like water, pools at the lowest possible level. A presidential candidate's policy announcement is given equal weight to a snapshot of your niece's hamster and a video of the latest Kardashian contouring. . . . Content collapse consolidates power over information, and conversation, into the hands of the small number of companies that own the platforms and write the algorithms.
Snap, in this case, is using the Juneteenth filter to "trivialize" a weighty topic, and to consolidate its power as gatekeeper over young Americans' conversations. It's a distraction to talk about who thought up the filter, or approved it (Black or White employees), or which users it was intended for. The outcome here is what matters, and that's what Mark Luckie and Sunny, and countless other users, were responding to.
It's that very outcome that Snap never addressed in its response. Put bluntly, Snapchat filtered American slavery into a five-second animation - showing that Big Tech will trivialize anything, reduce anything, cartoonify anything, if it keeps users engaged with the product. Nothing is sacred, and nothing is protected. Everything, as Nick Carr puts it, will be "fed through a shredder, then thrown into a wind tunnel." Or as L.M. Sacasas puts it in a recent post, our digital media environment "is blind to traditional categories such as credibility or trustworthiness. . . . [It's] indifferent to truth."
And that, in the end, is the threat. It's not that Big Tech is opposed to truth - for example, Snap isn't trying to whitewash history and claim that slavery never happened - it's just, as Sacasas puts it, indifferent to truth. Any meaning, any gravity of a topic disappears - if the phone can just make you smile.
We have to get better at discerning the problem by looking at the outcomes that Big Tech creates. While Snapchat's puppy-dog images seem like harmless fun, we won't like it when our entire world goes through the filter.
- - -