The European Union has given internet giants three months to show that they are removing extremist content more proactively or face legislation that could force them to do so. The European Commission has also issued recommendations for the world’s largest tech companies on removing illegal content.

From child pornography to hate speech, the Commission says social media platforms need to do more to combat extremism and remove objectionable content.

Among its recommendations, the Commission has asked Google, Facebook, Twitter and other tech firms to be ready to remove extremist content within an hour of it being flagged by law enforcement agencies. The Commission said that terrorism-related content is most harmful in the first hours after it appears online, and that companies need to act quickly to take it down.

“Online platforms are becoming people’s main gateway to information, so they have a responsibility to provide a secure environment for their users,” Andrus Ansip, the Commission’s vice president for the digital single market, said in a statement. “What is illegal offline is also illegal online.”

"While several platforms have been removing more illegal content than ever before — showing that self-regulation can work — we still need to react faster against terrorist propaganda and other illegal content which is a serious threat to our citizens’ security, safety and fundamental rights."

The recommendations are non-binding but could be taken into account by European courts. They are designed to set guidelines for tech companies on removing objectionable and illegal content, ranging from copyright infringement to hate speech.

European Union will consider legislation after three months if tech firms fail to deal with terrorist content

Citing the urgency of the problem, the Commission has also warned tech companies that it will assess the need for legislation on terrorist content within three months if the situation does not improve. Several European governments have argued that terrorism-related extremist content is radicalizing people who go on to carry out deadly attacks. For all other types of illegal content, the Commission has given companies six months.

Among the tech firms, Facebook has said that it agrees with the Commission’s recommendations. “We share the goal of the European Commission to fight all forms of illegal content. There is no place for hate speech or content that promotes violence or terrorism on Facebook,” a company spokesperson said. “We continue to work hard to remove hate speech and terrorist content while making sure that Facebook remains a platform for all ideas.”