According to YouTube officials, the sheer volume of content uploaded to the video streaming service makes it impossible for the company to filter out all terror-related material.

Google, the service’s parent company, confirmed that around 300 hours of video are uploaded to the site every minute. As its employees put it, “to pre-screen those videos before they are uploaded is like screening a phone call before it’s made”.

Of course, the streaming service has clear policies in place that forbid hate speech, incitement to violence and graphic violence. When it comes to nudity and other sensitive material, however, YouTube sometimes makes exceptions based on a video’s context. For instance, media outlets may show portions of otherwise inappropriate footage, and such clips are allowed to stay on the site because they carry the appropriate news or documentary context.

In addition, the service has mechanisms in place that allow its own users to flag inappropriate videos, and it has even introduced a specific “promotes terrorism” flag. Whenever a video on the site is flagged, the company’s enforcement team, which operates 24/7, reviews it. YouTube said that a human always reviews every flagged video; nothing is removed automatically.

However, the service doesn’t actively pre-screen content. Doing so would prevent the portal from remaining the flourishing platform it is, so it relies on its community to flag violations instead. In other words, YouTube’s users help it filter out abusive content.

When law enforcement authorities notify Google about infringing content, 93% of it is taken down. When YouTube users flag problems, however, only about a third of the reported material gets removed. Despite the serious threat posed by terrorist-related videos, neither the online giants nor the EU wants to start a legal battle to enforce their removal. The EU has admitted that although it could contemplate legislation, it would most likely be an “awfully monumental exercise”.

Responding to the concerns, YouTube reiterated its position: the site has clear policies forbidding material such as gratuitous violence, hate speech and incitement to commit violent acts, and it removes videos that violate those policies when users flag them, but it cannot pre-screen everything.