FACEBOOK are "morally bankrupt liars" who refuse to take responsibility for the harm they cause people around the globe.

That's according to one top Kiwi official, who among other things has accused the social media giant of enabling genocide.

New Zealand Privacy Commissioner John Edwards made his pointed attack in a furious Twitter tirade on Sunday.

"Facebook cannot be trusted," Edwards wrote.

"They are morally bankrupt pathological liars who enable genocide (Myanmar), facilitate foreign undermining of democratic institutions."

The long-time Facebook critic pointed to recent controversies involving the firm's platforms, including the live streaming of the recent mass shooting at a mosque in New Zealand.

"[They] allow the live streaming of suicides, rapes, and murders, continue to host and publish the mosque attack video," Edwards said in a followup tweet.

"They allow advertisers to target 'Jew haters' and other hateful market segments, and refuse to accept any responsibility for any content or harm. They #DontGiveAZuck."

He later deleted the controversial tweets, saying they had prompted "toxic and misinformed traffic".

Edwards was responding to a recent interview with Facebook boss Mark Zuckerberg.

Slippery Zuck told America’s ABC network that adding a delay to Facebook Live broadcasts to help weed out harmful streams would "break" the service.

The billionaire added that incidents like the Christchurch attack live stream were the result of "bad actors" rather than bad technology, and that a delay would ruin the fun for people broadcasting events like birthday parties.

Edwards scoffed at Zuck's claims.

In a later interview with Radio New Zealand on Monday, he said this "greater good" argument was "disingenuous" because "he [Zuckerberg] can't tell us, or won't tell us, how many suicides are livestreamed, how many murders, how many sexual assaults.

"I've asked Facebook exactly that last week and they simply don't have those figures or won't give them to us."

It follows the news this morning that Facebook and other social media firms could soon face hefty fines if they fail to protect their users from harmful content.

Newly proposed UK online safety laws lay out how companies could be legally required to remove content depicting terrorism, child abuse, rape, self-harm and suicide from their sites.

Easy access to damaging material, particularly among young people, has caused growing concern worldwide. It came into the spotlight in Britain after the death of 14-year-old schoolgirl Molly Russell, whose parents said she had viewed online material on depression and suicide before her death.

Governments across the world are wrestling with how to better control content on social media platforms, which are often blamed for encouraging abuse, spreading online pornography, and influencing or manipulating voters.