Analysis: Should YouTube censor al-Qaida?

By SHAUN WATERMAN, UPI Homeland and National Security Editor

WASHINGTON, May 20 (UPI) -- Sen. Joseph Lieberman, I-Conn., called Monday for YouTube to take down al-Qaida videos that users had posted. But the site said most of the videos his office had flagged did not contain material that violated its guidelines, and it rejected his request that it act to remove all material from U.S.-designated terror groups.

In a letter to Eric Schmidt, the chairman and chief executive officer of Google Inc., which owns YouTube, Lieberman urged the popular video-sharing service to enforce its own "community guidelines" against "graphic or gratuitous violence" and said the company should change its rules "to address violent extremist material."

In the letter, released to the media and dated Monday, Lieberman wrote that removing videos produced by al-Qaida and other extremist groups should be straightforward, because "so many of the Islamist terrorist organizations brand their material with logos or icons identifying their provenance."

In a blog posting, YouTube said it welcomed the dialogue with Lieberman but noted that "most of the videos" his office had drawn to their attention "did not contain violent or hate speech content" and had therefore not been removed from the site "because they do not violate our Community Guidelines."

YouTube, which hosts millions of videos posted every day by its user community, also rejected the idea that it should pre-screen uploads for videos produced by al-Qaida and other terrorist groups.

The blog posting said the site "encourages free speech and defends everyone's right to express unpopular points of view. We believe that YouTube is a richer and more relevant platform for users precisely because it hosts a diverse range of views, and rather than stifle debate we allow our users to view all acceptable content and make up their own minds."

A Lieberman aide said the material he was concerned about went beyond the bounds of acceptable free speech.

"It is not reasonable, let alone legal, for an al-Qaida spokesman to visit the United States and try to recruit and build support here," the aide told UPI. "Why is it reasonable for the same person to do that in this virtual community?"

Al-Qaida and other groups are using YouTube in their efforts to legitimize their violence, spread their ideas and recruit potential terrorists. "Shouldn't the community guidelines ban (U.S.-)designated (foreign terrorist organizations) from using the service?" the aide asked.

The answer Lieberman's critics offer is that he is misunderstanding the nature of the Internet. "There is nothing YouTube or Sen. Lieberman can do to keep these videos off the Internet," John Morris, senior counsel at the Center for Democracy and Technology, told UPI, adding that many of the propaganda videos produced by al-Qaida and other groups "contained nothing that is illegal."

Morris pointed out that removing such material automatically might not be as easy as it seems. "The idea that they would have to review every video (posted), even by a semi-automated process, is not a practical reality."

"If automated means were used to identify material, would a news report or documentary containing the same material … be blocked or removed simply because the logo (of al-Qaida or another terror group) appeared (in it)?" asked Ben Venzke of Intel Center, a private sector contractor that monitors extremist Web communications for clients including the U.S. government.

Venzke said most of the videos were not posted on YouTube by the groups themselves, but by individuals who had found them elsewhere, "and may not even support the terrorists' goals."

"The core underlying issue," he said, "is that whether you take these videos off YouTube or not, they will always be available at numerous other locations online. New outlets are popping up constantly, and when you take one down, 10 more simply appear to fill the gap. The problem is the very nature of the Internet itself. It makes controlling and denying access to information simply impossible."

Nonetheless, Lieberman's aide said the senator was just asking the company to enforce its own rules and that many of the videos YouTube declined to take down "are in our opinion clearly in violation of the community guidelines. … YouTube is still not enforcing its own rules."

In his letter, Lieberman quoted the community guidelines: "Graphic or gratuitous violence is not allowed. If your video shows someone getting hurt, attacked, or humiliated, don't post it."

Many of the videos produced by al-Qaida's media arms in Iraq and Afghanistan "show attacks on U.S. forces in which American soldiers are injured and, in some cases, killed," wrote Lieberman.

Even though many roadside bomb attacks against coalition vehicles are shown in the mid-distance, plenty of other videos show more graphic scenes. But Morris, of the Center for Democracy and Technology, noted that "top-selling movies have scenes depicting extreme violence" and said it was appropriate for YouTube to enforce its own guidelines rather than be compelled to do so.

Morris said the company is "quite insulated from legal liability by the U.S. Criminal Code," and "any effort to legislate (to get videos made by designated foreign terror groups off the Internet) would be struck down on First Amendment grounds."

Lieberman's aide said that Google's responsibility not to host terrorist videos was something the company "has to wrestle with. … They are the dominant player in this (video-sharing) market."

But Venzke echoed the feelings of several intelligence professionals UPI has spoken to about this issue -- that Lieberman is barking up the wrong tree. "Removing material from YouTube would have little to no effect on key terrorist groups' primary dissemination efforts," he told UPI. "There are no major jihadi groups … using YouTube as their primary release point."
