This week, we posted a video of a woman going topless at a rooftop bar for a photography art project, and in less than 24 hours we received an email from YouTube stating: "Your account has now been terminated. Please be aware that you are prohibited from accessing, possessing or creating any other YouTube accounts." There was no way to reply to the "email@example.com" robot that sent this notice, nor was there any warning.
Our account was previously suspended after receiving three strikes from YouTube—the first came in 2010, when we uploaded a video of a man going down on a woman on the 5 train. The other two (both for the same video, described below) came in February of this year. The suspension lasted a few months and was mostly just annoying, because you cannot get embed codes when your account is suspended. To upload videos, we simply used other YouTube accounts, or other platforms like Vimeo and Brightcove.
According to YouTube's Community Guidelines on "Sex and Nudity," nudity "is not allowed, particularly if it is in a sexual context." However, exceptions are made for "educational, documentary, scientific, and artistic content, but only if that is the sole purpose of the video and it is not gratuitously graphic."
Axing the 2010 video was understandable—it's essentially graphic amateur porn, even if we were posting it as part of our ongoing editorial commentary on subway etiquette.
But this week's topless woman video was decidedly tasteful, and done as part of a photography project. Even the NYPD officers who appear in the video couldn't find anything wrong with what this woman and the photographer were doing (once they were out on the street, where women are allowed to go topless in NYC). When did the internet get so prudish? Well, it hasn't really...
YouTube has plenty of graphic videos of topless women that they are fully aware are hosted on their site. They come with content warnings that read: "This video may be inappropriate for some users." The user then simply clicks a link and is brought to the uncensored underworld where bare breasts are allowed on your computer screen.
Robin Thicke's uncensored, controversial, and arguably sexist video for "Blurred Lines" is right there, happily hosted by YouTube with over 10 million views and counting. The site did originally ban the video, but it reappeared shortly thereafter. We have found no explanation as to why, but it goes without saying that the highly trafficked video site makes money by delivering eyeballs to advertisers. And Thicke's song is so insanely popular that it continues to break records. People want more more more. Clickity click click.
So boobs are fine, so long as they're Top 40 Hype Williams boobs, or Justin Bieber's ass, or boobs that look an awful lot like the boobs we were trying to upload. But what about utilizing YouTube's platform to draw attention to an injustice?
The aforementioned second and third strikes were delivered by YouTube in February, after we received a video of two little girls being forced to fight in a New York City park by their caretakers. We uploaded the video to YouTube—first the raw version, then another with the faces of the girls blurred, the latter of which we used in our story. But before we could even delete the non-blurred version, both versions were taken down by YouTube with no warning, and our account was suspended. This all happened in a matter of minutes.
After a spirited discussion amongst several Gothamist editors, we decided to publish the video, in addition to giving copies to the Manhattan DA's office and the NYPD. We knew that if we ran the blurred video, other news outlets would pick it up, and the authorities would be pressured to act. In fact, that is exactly what happened.
YouTube's Community Guidelines regarding children read:
Videos involving children (anyone under the age of 18) are particularly sensitive. Videos containing children should never be sexually suggestive or violent. Please be cautious when posting something involving a child.
The video could be considered violent in that it features two young girls shoving and hitting each other. But once the children's identities are hidden, are there still grounds to remove the video? YouTube hosts a litany of disturbing images, but how many lead to the arrests of alleged child abusers?
These are questions worth discussing with YouTube, but until today we've only dealt with robots.
After we tweeted asking if anyone had a direct contact at YouTube, someone from another news organization sent us one. In an email, we argued our case and asked why YouTube takes such drastic measures, like suspending or deleting an account, without warning or discussion first—particularly with a news organization. We were told that our message would be passed along.