
Why Is YouTube Punishing People Who Translate and Expose Anti-Semitism on Its Platform?

For years, YouTube has failed to distinguish between those who upload racist material and those who post and translate such content to raise awareness of bigotry and inform the fight against it.

by Yair Rosenberg
March 03, 2017
Shutterstock

This past month, Tablet published a piece by journalist Eylon Aslan-Levy about a new cartoon music video that had just been released by the terrorist group Hamas. Written in Hebrew in an effort to intimidate Israelis, the animated song featured explicit exhortations to violence against Jewish civilians, and included such iconic images as an ultra-Orthodox Jew “having his head blown off and stuck on a pike, as well as another being shot in the head through crosshairs.” As part of his report, Aslan-Levy posted the video on YouTube with English subtitles, so that non-Hebrew readers could understand Hamas’s threats in it. Two days later, YouTube took down his translated video for “hate speech” and warned him that further offenses could result in the suspension of his account. Essentially, the service was unable to distinguish between journalism that aimed to expose violent incitement and bigotry, and the real thing. Last week, YouTube denied Aslan-Levy’s appeal, in which he explained he was a reporter acting to inform the public.

This decision was all the more curious given that YouTube continues to allow many other Hamas music videos—posted by the group’s sympathizers and garnering hundreds of thousands of views—to remain on the site unmolested. But while the site’s conduct in this instance may seem strange, it was not an isolated incident. For years, YouTube has been taking down videos that translate and expose anti-Semitism and punishing those who post them. Perhaps the most notable victim of this censorship is MEMRI, the Middle East Media Research Institute.

MEMRI translates television and media from across the Middle East, highlighting both bigotry and those activists who seek to fight it. Its work, though sometimes controversial like all things dealing with that fractious region, has earned it bipartisan accolades. As legendary U.S. diplomat Richard Holbrooke, who served in key roles in the Clinton and Obama Administrations, put it: “MEMRI allows an audience far beyond the Arabic-speaking world to observe the wide variety of Arab voices speaking through the media, schoolbooks, and pulpits to their own people. What one hears is often astonishing, sometimes frightening, and always important. Most importantly, it includes the newly-emerging liberal voices of reform and hope, as well as disturbing echoes of ancient hatreds. Without the valuable research of MEMRI, the non-Arabic speaking world would not have this indispensable window.”

MEMRI’s translated clips have been featured everywhere from CNN to FOX to MSNBC to The New York Times. The organization has been briefing lawmakers on Capitol Hill for over a decade, and more recently, its clips have also been used in hate crimes trials across Europe.

One of the most famous showcases of MEMRI’s work came during the 2014 Gaza War, when CNN’s Wolf Blitzer confronted official Hamas spokesman Osama Hamdan with a MEMRI-translated clip of Hamdan himself claiming that “the Jews used to slaughter Christians in order to mix their blood in their holy [Passover] matzos.” After being shown the video, Hamdan still refused to denounce the blood libel and instead offered the immortal defense that he had Jewish friends.

But if you search YouTube for MEMRI’s original video of Hamdan, as opposed to the CNN segment in which part of it appeared, you won’t find it. That’s because, like many of MEMRI’s videos, this one was taken down.

“Our YouTube site has been brought down probably four times over the past decade,” said MEMRI CEO Steve Stalinsky. Individual videos have also been removed. Some of this has been due to copyright claims on MEMRI’s translations by outlets like Al Jazeera that don’t want the general public to be aware of the bigotry they have broadcast in Arabic. But other instances have been due to alleged hate speech violations, much like in Aslan-Levy’s case. This has made MEMRI reluctant to post certain videos to YouTube rather than to platforms like Facebook, diminishing its ability to reach viewers. And of course, each time its YouTube channel goes down, MEMRI loses all its subscribers and videos.

Like Tablet, Stalinsky and MEMRI have had little success getting YouTube to tackle this issue. “It’s impossible to get a live person there,” he said, “so we put in a series of complaints and different requests to have it reviewed, and we had a lawyer send a letter to their office, and we never heard back.”

How does this all happen? YouTube’s system for identifying problematic content relies on users flagging and reporting allegedly objectionable videos. Site staff then sort through the reports and remove content at their discretion. Unfortunately, this process is easily gamed by bigots and their sympathizers who don’t want MEMRI’s material out there. After a barrage of such misleading reports, for example, YouTube notoriously took down MEMRI’s channel during the height of the Gaza war—when its clips were appearing all over the mainstream media—then apologized and reinstated it after inquiries by journalists like then-POLITICO media reporter Dylan Byers.

But years later, as Aslan-Levy’s and MEMRI’s subsequent experiences demonstrate, YouTube has still not taken substantive steps to remedy this problem. Instead, it continues to rely on a deeply flawed ad hoc approach. The site appears to have no subject-area experts advising it on the suspension process who can distinguish between researchers highlighting hatred in order to combat it and bigots seeking to spread it. And because YouTube relies essentially on users to flag content, copious amounts of unreported anti-Semitic and other bigoted material remain readily available on the site.

When asked to respond to the instances outlined above, a YouTube spokesperson provided the following statement:

YouTube has clear policies that outline what content is acceptable to post. Our teams review flagged content 24 hours a day against these policies. In determining whether controversial content may be removed, we look at whether or not they contain sufficient news, educational or documentary context, including the video’s metadata and description.

Needless to say, this reiteration of YouTube’s current process merely restates the problem.

At a time when major web platforms like Twitter and Facebook have come under increasing scrutiny for failing to deal adequately with hate and abuse, YouTube deserves similar scrutiny for the way its policies have repeatedly thwarted, however unintentionally, the fight against that very same scourge.

Yair Rosenberg is a senior writer at Tablet. Subscribe to his newsletter, listen to his music, and follow him on Twitter and Facebook.