With 1.8 billion monthly users and something like 23 million channels, YouTube has made it easier than ever for creative types around the world (and babies playing with their nannies’ phones) to upload their Awesome Online Content (a.k.a. AOC — which, I’m sorry, is just how the acronym worked out) for the masses.
But what about getting your content taken off of YouTube? Turns out, that’s even easier! Here’s a quick guide to things you can do to ensure your videos are swiftly removed from sight — whether out of concern for the safety and well-being of your fellow users, or because robot not get humans please restart.
BE A FIGHTY ROBOT: As just suggested, robots (my non-negotiable catch-all term for all things AI, from your smart fridge to Rosie from “The Jetsons”) have mastered how to identify humans, but haven’t learned much about their fleshy, emotional ways. Violence, though — that, the robots seem to have down. So much so that YouTubers this month started noticing robot combat videos vanishing from the platform with little notice beyond a citation of its policies against “animal cruelty” and content that depicts “deliberate infliction of animal suffering or the forcing of animals to fight.” While it’s revealing (and concerning, to say the least) to observe a cluster of code taking direct action to defend the autonomy of a self-propelled circular saw, and while YouTube has since admitted its error, it’s equally galling to see YouTube wage an unspoken ban on sparring robots while simultaneously serving as my primary source for “Real Housewives” reunion footage.
BE LGBTQ: Take it from a resident of the second house of the acronym: we queerzes have historically found ways to get booted out of just about everywhere. (And, side note, we always find our way back in. Know this.) But YouTube runs a wham-bam revolving-door policy all its own, providing a (gag) “home” for wayward LGBTQ creators while allegedly (according to a new lawsuit filed against YouTube by five of those creators) deploying “unlawful content regulation, distribution, and monetization practices that stigmatize, restrict, block, demonetize, and financially harm the LGBT Plaintiffs and the greater LGBT Community.” So, be as L or G or B or T or Q as you like on YouTube — just be prepared for a life of invisibly enforced machine-driven isolation. Once again: Cool website!
BE A TEACHER JUST TRYING TO LEND A HAND WITH THE WHOLE NAZI UPRISING THING: In what appeared to be a well-intentioned push against the plainly obscene and easily discoverable abundance of neo-Nazi and white supremacist content on YouTube, the platform earlier this year trained its dumbest algorithms on the problem, with predictable results. Along with thousands of Holocaust denial and white nationalist channels and videos, it also removed archival material related to the rise of Adolf Hitler uploaded by history professors in the UK, whom it then blocked from the service. Thus, in robot logic, by not allowing history to speak, YouTube has ensured it cannot repeat itself. Nailed it.
BE A CARD SHARK: Poker players are rolling the dice (I’ve never been to a casino, sorry) when posting to YouTube, after hundreds of poker channels and videos were removed for violating community guidelines prohibiting the promotion or sale of “illegal or regulated” goods, and specifically barring links to online casinos — which all of the deleted videos had. Will YouTube navigate the “challenges” of “openness” when it comes to obvious race-baiting and violent extremist propaganda as succinctly as it scatters a poker game? Place your bets!
BE A 13-YEAR-OLD GIRL EATING A HONEYCOMB NEXT TO AN EXTREMELY SENSITIVE MICROPHONE: Makenna Kelly’s videos, 12 of them, were deemed inappropriate (and possibly sexual) by YouTube and disappeared like so much honeycomb, just with less audible slurping and chewing. Her 1.6 million fans in the ASMR community hissed (again, into sensitive microphones) with outrage. Others, one in particular, are actually totally fine with this one.
BE A SCARY RAP GUY, OH NO: Police in London issued a request to YouTube to remove “violent” videos from artists associated with drill music — a subgenre of rap birthed in Chicago but extremely popular in the UK — citing alleged encouragement of gang violence including “gestures of violence, with hand signals suggesting they are firing weapons and graphic descriptions of what they would do to each other.” Wow, that does sound terrifying, doesn’t it? More terrifying is that YouTube followed through, removing more than half of the requested clips. “This is systematic discrimination,” one artist named TK told Spin earlier this year. “We all know removing a video doesn’t stop crime and crimes are committed by individuals that have real issues in real life, not a song.” I’m all about helping, so here’s a video explaining what all the different kinds of gun gestures in street dance videos mean. Just me, saving lives.
The takeaway here? The key to getting kicked off of YouTube is something most Americans on the Internet are already well-schooled in: the power of suggestion. A pugilistic robot suggests cruelty to animals; a trans teen suggests the actual existence of trans teens; a teenage girl eating honeycomb suggests that even our algorithms are pervs; and talking about Hitler suggests, well, Hitler (which is bad!).
On YouTube, getting the boot is hardly ever about what you’re actually saying or doing (e.g. “Hey there, we’re the master race, just master racing over here. We’re going to burn it all down!”) and more about what you’re suggesting (e.g. “Throwing this out there: What if we just ban Nazis from everything?”).
And if that seems backward, just consider YouTube’s own approach to determining what stays and what goes: Having an actual policy to follow is one thing, but suggesting you have one is clearly more dangerous.
Michael Andor Brodeur can be reached at email@example.com.