Other measures being explored include a levy on social media companies to help meet the costs of online policing

Ministers in the UK are considering creating an internet ombudsman to deal with complaints about hate crimes and are pressing ahead with proposals for a levy on social media companies to help pay for the policing of online offences.

The ideas are being examined by the Department for Digital, Culture, Media and Sport (DCMS) before the release of an autumn green paper, which may be more radical than expected.

An internet ombudsman would deal with complaints about illegal online material, such as abuse and violent threats made on social media, while a levy, first proposed in the Conservative party’s election manifesto, would raise funds to help with the rising costs of online enforcement.

The initiative follows a pledge by Alison Saunders, the director of public prosecutions, to crack down on illegal abuse and threats spread via the internet. She said hate crimes committed online should be treated on a par with threats made face to face.

The DCMS internet safety strategy could be published as early as September. Final recommendations are being drafted by Matt Hancock, the minister of state for digital and culture.

The idea of an internet ombudsman is already being developed in France and Australia, where each country is creating an independent agency to mediate between members of the public and social media firms.

In January the children’s commissioner for England, Anne Longfield, recommended establishing a children’s digital ombudsman to handle requests from teenagers for social media companies to remove illegal posts. Longfield’s report, Growing Up Digital, is one of the documents being carefully considered by DCMS officials, although it appears ministers want to extend the approach to all age groups.

A DCMS spokesperson said: “We are determined to make Britain the safest place in the world to be online, and to help people protect children from the risks they might face. Later this year we will publish the government’s internet safety strategy, and a levy on social media companies is one of a series of measures that we are considering as part of our work.” The spokesperson confirmed that the department was looking at setting up an internet ombudsman.

Yvette Cooper, a Labour MP and chair of the home affairs select committee, said: “Social media companies are still far too slow to act on dangerous extremism, illegal content, threats or intimidation online, and it is affecting all our lives. These are the biggest and richest companies in the world – they should be able to get their act together, but they are still failing. So government needs to get on with this.

“The home affairs select committee called in February for social media companies to be fined if they fail to act and to make financial contributions out of their massive profits towards the police who are having to deal with the online extremism, crime and threats that social media companies are giving a platform to. This is too important to ignore.”

John Carr, secretary of the Children’s Charities’ Coalition on Internet Safety and a member of the executive board of the UK Council for Child Internet Safety, said the funding of an internet ombudsman agency could be modelled on the way the advertising industry pays a levy to support the Advertising Standards Authority. “These companies are like public utilities,” he added. “The idea that they should be outside the scope of the law is unacceptable.”

The notion of forcing highly profitable internet firms to fund under-resourced police forces to carry out online investigations has been gaining wide support. In May, a critical report on hate crime by the home affairs select committee noted there was a precedent for businesses paying for policing.

The report said: “We note that football teams are obliged to pay for policing in their stadiums and immediate surrounding areas under Section 25 of the Police Act 1996. We believe that the government should now consult on adopting similar principles online – for example, requiring social media companies to contribute to the Metropolitan police’s [counter-terrorism internet referral unit] for the costs of enforcement activities which should rightfully be carried out by the companies themselves.”

Assistant chief constable Mark Hamilton, who leads on hate crime for the National Police Chiefs’ Council, said: “We have no tolerance for hate crime, whether it happens on or offline. Victims of hate crime can expect a strong response from the police service and prosecutors. We would welcome any assistance from social media companies that can provide more support to victims and help bring offenders to justice, so we are open to continuing these discussions.”

There is, however, opposition to the idea of industry funding. Myles Jackman, legal director of the Open Rights Group, which monitors internet freedom, said: “Although internet companies can make significant profits, this is not a reason to make them pay to have their customers’ behaviour policed. Should a pub landlord pay the police if two regulars have a fight outside his pub?”

Technology firms have long come under fire for not doing enough to stop the spread of hate speech, but the events of the white supremacist march that took place in Charlottesville, Virginia, on 12 August appear to have galvanised them into action. A wave of crackdowns against white supremacists and neo-Nazis online reflects Silicon Valley’s rapidly changing mindset on how far it is willing to go to police hate speech on its own platforms.

Twitter recently said it was taking action against 10 times the number of abusive accounts a day compared with a year ago, while Google’s YouTube has turned to artificial intelligence in an effort to remove extremist content more quickly, doubling the rate of removal compared with human-only systems.

The world’s largest social network, Facebook, has also taken action against hate speech, removing groups and pages and actively silencing links that lead to hate sites.

Twitter declined to comment on the UK government’s green paper. Facebook and Google did not respond to requests for comment.