How many more bones is YouTube’s management going to toss to the public to reassure us that it is cleaning its platform of extremist content? How many more exorbitantly paid lobbyists will it hire to peddle the story of how much better it is doing? Despite endless mea culpas and infinite pledges to become a better corporate citizen, neo-Nazi anti-Semitic incitement remains all too prevalent across YouTube.

On January 23, 2019, the Counter Extremism Project (CEP) spotted vulgar videos from the neo-Nazi group Atomwaffen Division on Google’s YouTube platform, videos that would have remained there but for CEP’s vigilant policing, a job that YouTube can and should be doing itself. YouTube assured Congress that it had invested in the software necessary to quickly zap domestic extremist hate, but the evidence suggests otherwise.

As CEP discovered with good old-fashioned online investigative work, Atomwaffen has been enjoying YouTube’s hospitality under a user named “chanz.” Before YouTube deleted Atomwaffen’s neo-Nazi content following a red flag from CEP, chanz remained uncensored on YouTube for two months, uploading 15 videos that generated 7,000 views. That may not seem like much in the grand scheme of things, but all it takes is one neo-Nazi video to inspire and incite another act of domestic anti-Semitic terror. Unfortunately, the nation knows all too well that the Pittsburgh Tree of Life synagogue terrorist binged on neo-Nazi social media content before his rampage. Atomwaffen’s white-supremacist propaganda represents an undebatable violation of YouTube’s own terms of service, but there it was.

CEP’s executive director, David Ibsen, stated when revealing CEP’s findings: “It defies belief that YouTube continues to outsource responsibility for enforcement of its own terms of service…yet fails time and again to discover extremist content on its own, or via its own retained consultant flaggers whose job is not to let such content get through the cracks.”

A couple of days ago, CEP uncovered even more neo-Nazi content on YouTube: the infamous neo-Nazi tome Siege. A video currently available on YouTube spotlights Siege in audiobook form. Siege is a white-supremacist diatribe written by American neo-Nazi James Mason that serves as a propaganda manifesto for Atomwaffen. In the YouTube audiobook, Mason advocates the creation of a violent, leaderless neo-Nazi guerrilla movement and sustained lone-actor terrorism to bring down the U.S. government.

Underscoring how important the book is to white-supremacist recruiting, Atomwaffen requires all of its members to read Siege.

According to CEP’s David Ibsen, Siege is yet another blatant violation of YouTube’s terms of service: “…it is stunning that YouTube’s content filters are able to identify Siege as inappropriate or offensive, yet YouTube is unwilling or unable to remove it despite it clearly violating YouTube’s policies against hate speech.”

Just yesterday, without much effort at all, I uncovered another neo-Nazi gem on YouTube. I typed the neo-Nazi acronym “ZOG” (Zionist Occupied Government) into YouTube’s search box, and voila, up popped at the top of the search results a video of an online video game, “ZOG’s Nightmare.” The game is a shoot-’em-up of Nazi soldiers killing Israeli fighters wearing yellow Stars of David bearing the word “Juden” on them, the same yellow patches Hitler required Jews to wear in concentration camps and in Nazi-occupied ghettos.

Why is that video on YouTube? Where is YouTube’s vaunted zero-tolerance policy against this anti-Semitic claptrap?

These are just three examples of readily discoverable YouTube content highlighting the proliferation of inflammatory neo-Nazi extremist garbage. There is plenty more neo-Nazi hate speech and incitement on YouTube, but space prevents me from including it here.

The recently formed Network Contagion Research Institute, composed of scientists who have analyzed hundreds of millions of pieces of social media content, found in its research that fringe dark-web platforms such as Gab.ai, 4chan, and 8chan serve as online safe havens for extremist racist and neo-Nazi groups. These three fringe platforms knowingly transmit extremist material to YouTube via memes, which cloak neo-Nazi content in crass or humorous words and imagery: animated poison pills, in effect, that infiltrate neo-Nazi content onto mainstream social media. The Washington Post featured this diabolical infiltration route in a September 7, 2018 analysis.

YouTube’s executives are well aware of these sneaky tricks but simply lack the willpower and wherewithal to combat them. One would think that a company whose 2018 annual ad revenue was close to $4 billion would devote more resources to a technology-oriented moonshot initiative to remove this threat to public security.

In 2018, the Anti-Defamation League reported that U.S. domestic extremists killed at least 50 people, the overwhelming majority of them murdered by right-wing extremists.

Earlier this month, AT&T declared that it is resuming advertising on YouTube. AT&T yanked its digital advertising off YouTube in 2017, along with scores of other major U.S. advertisers, when CEP and the UK’s Guardian newspaper revealed that major U.S. corporate ads were appearing alongside videos promoting hate speech, terrorism, and other extremist content.

According to the New York Times (January 19, 2019), Fiona Carter, AT&T’s chief brand officer, lauded YouTube’s “brand suitability system” for ensuring that zero ads ran alongside content that violates YouTube’s terms of service. I guess Ms. Carter considers it acceptable for AT&T to again advertise on YouTube as long as its ads do not appear alongside the neo-Nazi content that remains there.

This week, YouTube announced that it will stop promoting sensationalistic clips built around scientifically debunked falsehoods and other conspiracy theories (some of the more notorious of which Mr. Trump has relied on to perpetuate his views). If YouTube’s techies can spot these conspiracy videos, surely they can also spot neo-Nazi and racist content. But to find it, one has to look for it, and that is YouTube’s “see no evil” standard operating procedure.

YouTube’s management is negligent, has been negligent, and will remain negligent until the U.S. government pressures it to clean up its act. The platform has been pillaged by extremists for almost ten years, giving its executives ample time to prevent it from remaining a safe haven for domestic and foreign extremism and terrorism. YouTube has a public duty to cease its alibis and live up to its own terms of service to its users.

So long as the U.S. lacks an effective, coherent strategy to counter the malfeasance of social media platforms, notably Facebook and YouTube, the body count of victims of extremist domestic terror will surely continue to rise, and that would be a travesty of justice for us all.

About the author: Ambassador Marc Ginsberg is Senior Diplomatic Adviser to the Counter Extremism Project.

The views presented in this article are the author’s own and do not necessarily represent the views of any other organization.