SOCIAL media sites will be forced to stop self-harm or suicide content being shown to kids under a new law.

It comes after the death of tragic Molly Russell, 14. Legislation to be unveiled on Monday will put a duty of care on tech firms, including Instagram and its owner Facebook.

A watchdog will be able to stop firms from allowing young users to see dangerous images — or prosecute sites that fail.

The move comes as a Sun investigation found thousands of self-harm images are still on Instagram, angering Molly’s dad Ian.

She killed herself in 2017 after viewing horrific posts. In February, Instagram boss Adam Mosseri vowed to remove all self-harm images.

But our probe found pictures and videos of cutting and mutilation remain readily available.

Disturbingly, anyone who views content on one account can use hashtags or codewords to view more in seconds.

It was only after we viewed three graphic accounts that Instagram sent an alert offering support — but we could quickly close it and view dozens more.

TRAGIC TEEN
Molly’s dad said: “While I accept it will take time, it is extremely disappointing that harmful content is still so widely available.”

Instagram said it allowed pics of scars if they were used to admit to a problem rather than glamorise it.

A spokesman said: “We do not allow content that encourages or promotes suicide or self-harm.”