SOCIAL media firms could soon face hefty fines if they fail to protect their users from harmful content.

New UK online safety laws proposed today lay out how companies could be legally required to wipe images of terrorism, child abuse, rape, self-harm and suicide from their sites.

Easy access to damaging material, particularly among young people, has caused growing concern worldwide. The issue was thrust into the spotlight in Britain by the death of 14-year-old schoolgirl Molly Russell, who, her parents said, had viewed online material on depression and suicide before she died.

Governments across the world are wrestling with how to better control content on social media platforms, which are often blamed for encouraging abuse, spreading online pornography, and influencing or manipulating voters.

Global worries were recently stoked when the mass shooting at a mosque in New Zealand was live-streamed on Facebook, after which Australia said it would fine social media and web hosting companies and imprison executives if violent content was not removed "expeditiously".

In a policy paper widely trailed in British media, the government said it would look into possibly using fines, blocking access to websites, and imposing liability on senior tech company management for failing to limit the distribution of harmful content.

It would also set up a regulator to police the rules.

Industry trade group techUK said the paper was a significant step forward, but one which needed to be firmed up during its 12-week consultation. It said some aspects of the government's approach were too vague.

"It is vital that the new framework is effective, proportionate and predictable," techUK said in a statement, adding not all concerns could be addressed through regulation.

Facebook said it was looking forward to working with the government to ensure new regulations were effective, repeating its founder Mark Zuckerberg's line that regulations were needed to have a standard approach across platforms.

Rebecca Stimson, Facebook's head of UK public policy, said any new rules should strike a balance between protecting society and supporting innovation and free speech.

"These are complex issues to get right and we look forward to working with the government and parliament to ensure new regulations are effective," Stimson said in a statement.

Prime Minister Theresa May said that while the Internet could be brilliant at connecting people, it had not done enough to protect users, especially children and young people.

"That is not good enough, and it is time to do things differently," May said in a statement.

"We have listened to campaigners and parents, and are putting a legal duty of care on internet companies to keep people safe."

The duty of care would make companies take more responsibility for the safety of their users and tackle harm caused by content or activity on their services. The regulator, funded by industry in the medium term, would set clear safety standards.

A committee of lawmakers has also demanded that more be done to make political advertising and campaigning on social media more transparent.

"It is vital that our electoral law is brought up to date as soon as possible, so that social media users know who is contacting them with political messages and why," said Conservative Damian Collins, who chairs the parliamentary committee for digital, culture, media and sport.

"Should there be an early election, then emergency legislation should be introduced to achieve this."

Today's publication of the long-awaited Online Harms White Paper coincides with Culture Secretary Jeremy Wright's account of a recent meeting with Facebook boss Mark Zuckerberg.

Writing in today's Sun, Mr Wright explains how he told Mr Zuckerberg at a meeting at Facebook's offices in San Francisco in February that Britain had "reached a turning point" over its failure to act.

He told the billionaire to his face that the UK government would become the first in the world to introduce a legal crackdown on Facebook’s harmful content.