TikTok does not know that eight people were killed last Friday night in Houston, Texas, when a crowd of 50,000 surged during Travis Scott’s Astroworld music festival. It has no idea that dozens of people were injured as Scott encouraged the crowd to “rage,” or that countless more were traumatized by the struggle to stay upright in a swarm of people crammed so close together that it was sometimes difficult to breathe. TikTok knows none of this. How could it? As much as we like to personify it, it is not a person. It’s an app you can download.
The app does know, however, that users are sharing Astroworld videos at a rapid pace. They’re liking and commenting on these videos. People are downloading them and texting or Instagram DMing them to their friends. Even if you’d never heard of the event before it became one of the country’s deadliest crowd-control failures in recent history, the app is learning, not incorrectly, that people want more Astroworld content, making it increasingly likely that some of those videos will find their way to your For You Page. These are the signals users send the app: attention to a topic begets more videos about that topic.
As a result, in the days after the incident, I began to see more and more disturbing videos from Scott’s Astroworld concert. The ones shot from the crowd are, to say the least, anxiety-inducing; it’s a jarring shift to go from lying in bed, watching someone make chicken in an air fryer or a twentysomething do a dance I’ll never learn, to hearing people screaming for help that isn’t coming. TikTok has labeled some of the videos as containing “sensitive content.” Many of them come with no warning at all.
In a way, the footage is crucial to the tragedy’s aftermath. It’s a first-hand account of what transpired at Astroworld, and it will undoubtedly be useful as the investigation continues. But it’s not anything I would have asked TikTok to show me, had I been given the choice. (TikTok’s moderation system is a mix of machines and humans, as is the case with most tech platforms: the platform’s technology flags videos, which are then manually checked for content violations by a human moderator. If a video gets past that first screen, however, it can go live immediately.)
With varying degrees of success, I’ve tried to teach my FYP that I don’t want to watch these videos. But the more terrible, or at least differently harrowing, videos I’m now being served are those from before the tragedy: TikToks of people regretting having to sell their tickets because their plans changed at the last minute or, worse yet, eagerly getting ready to attend the festival. When I see those videos, my first instinct is to go straight to the comments section, hoping to learn whether these people survived the show. And when I realize that the relief I feel for that person comes at the cost of another person’s life, any solace quickly gives way to a different melancholy.
TikTok, like Instagram, shows videos in an algorithmic rather than chronological order, which means videos aren’t seen in the order they were posted. They surface when a computer program deems them interesting enough for you to see, or when the platform determines that you will want to engage with them. When it works properly, you get a feed full of things you love watching: videos on topics you’re interested in, or Instagram photos from real friends rather than strange brands you followed for a giveaway and then forgot to unfollow. Occasionally, though, the algorithm falls short. Consider the days after the 2016 presidential election: For the following week, my Instagram feed felt like a strange and extended wake. Optimistic election-day posts from people heading to the polls didn’t appear in my feed until after the results were announced and four years of Donald Trump as president had become our new reality. Were these posts I would have liked before the election? Without a doubt, and the algorithm picked up on that. What it could not recognize was that the meaning of those posts had completely changed. Five years later, with disasters like Astroworld, something similar is happening.
This is yet another example of how unprepared tech platforms are to deal with real-life human emotions, and why it’s nearly impossible to stop the internet from sending you ads for baby formula and tiny booties after you’ve announced a pregnancy that ends in miscarriage. As Lauren Goode of Wired pointed out in April, you’re bound to live in a digital wedding-industrial complex for the rest of your life, even if you call off your nuptials. Algorithms don’t grieve. They can’t distinguish between morbid interest and ordinary interest. They only traffic in data. But people, and their lives, are much more than data, even if the algorithms we use see us only as such.