A TROUBLED teen carves into her bloodied arm, a suicidal brunette screams into the camera and a girl slashes her sweet smile with a blade.

These are just a few of the sickening posts on Instagram that a Sun Investigation found with just a few clicks.

This week devastated father Ian Russell slammed the social media giant — owned by Facebook — for contributing to the suicide of his 14-year-old daughter Molly.

After showing “no obvious signs” of serious mental health problems, Molly was found dead in her bedroom in November 2017.

After her death her family found out she had been viewing scores of Instagram posts normalising and even romanticising self-harm and suicide.

Mr Russell, of Harrow, North West London, said: “We are very keen to raise awareness of the harmful and disturbing content that is freely available to young people online.

“Not only that, but the social media companies, through their algorithms, expose young people to more and more harmful content, just from one click on one post.”

Even a quick search on Instagram confirms Mr Russell’s fears.

And the language users adopt on the site is designed to fool any worried parent trying to keep an eye on their child’s wellbeing.

Hashtags of girls’ names used on the site might appear innocuous, but are actually abbreviations for severe mental health issues.

TROUBLED TEENS WALLOW IN HATRED OF THEMSELVES
Ana stands for anorexia, Annie means anxiety, Bella means borderline, Sophie means schizophrenia and Sue stands for suicidal.

Sickeningly, thousands of images and videos from children’s cartoons are being used on Instagram to glorify anxiety, depression, self-harm and suicidal behaviour.

There is also a wealth of posts bearing the hashtag #sadsimpsons, for example. Parents who check their children’s search history are unlikely to be worried by this — but the content is terrifying.

In one bleak black and white video, a sad Homer Simpson has a noose around his neck, next to the hashtags #hatemyself and #ritzen — a German word for self-harm.

In another, Homer is seen plunging from a high-rise building.

And in a sick take on AA Milne’s beloved book characters, Eeyore’s grey body is seen hanged from a tree as Winnie the Pooh and Tigger look on. The caption reads: “I can’t believe he actually killed himself.”

And while searches for #selfharm and #suicide bring up a warning screen intended to deter users from viewing the images, they can simply press “view images anyway”.

Once in, an onslaught of gory images and videos resembling scenes from horror movies leaps from the page. Except these are all real, and many have garnered dozens of “likes” and appreciative comments.

There are also memes featuring text with grim, hopeless messages including: “People told me, ‘Just kill yourself’. I’m trying” and “How to kill yourself”.

The British trade body for advertisers, ISBA, has raised concerns about adverts appearing alongside Instagram posts.

Instagram has more than two million advertisers including brands such as H&M, Deliveroo, Nike, Domino’s and Sainsbury’s.

The minimum age to sign up to the site is 13 years old, but this is impossible to enforce. And while Instagram claims to be diligent about removing graphic posts, some of the worst we saw had been up for at least ten days.

The online comments are even more disturbing. Troubled teens wallow in self-hate, calling themselves ugly, fat and unlovable. One girl wrote under a video of a noose: “I wish I could hang myself but I am so fat I can’t even do that.”
Under her comment, a poster wrote: “Dying is the answer. People making fun of u. I legit have tried and I’m 12. Life is bad.”

Many posts from those in distress draw comments designed to humiliate or upset them even further, in a practice known as “roasting”.

Now Molly’s family is campaigning for social media sites to review their content and make it harder for teens to view damaging material.

The teen’s story has chilling similarities to the case of Milly Tuomey, 11. Before taking her own life in January 2016, the Dublin youngster scrawled “beautiful girls don’t eat” across her body in pen before posting haunting diary entries on Instagram detailing her plan to die.

Milly’s mother Fiona Tuomey, who founded the Healing Untold Grief Group, told The Sun: “Suicide is a complex issue which cannot be attributed to just one factor.

“Social media is an integral part of young people’s communication.

“It’s time governments held these global companies to account. Redirecting people to help sites is simply not good enough.

“The social media giants have the power and technology to stop this.”

The UK has the highest self-harm rate of any country in Europe — and the majority of those affected are aged between 11 and 25.

‘SEEING OTHERS SELF-HARM NORMALISES IT’
Instagram addict Nicole Simone, 21, a barmaid from Dover, Kent, began self-harming at 13 and blames social media for making her mental health issues worse.

She said: “I follow some really dark accounts and have looked at self-harm posts. It just makes my mental state worse and pushes me to want to hurt myself. Seeing other people hurting themselves normalises self-harm.”

Meanwhile in York, psychology student and recovering anorexic Talia Sinnott, 21, said: “Instagram only encouraged my negative thoughts, progressing my illness to the point where I was hospitalised, weighing less than 6st.

“I was naive and clueless and came across ‘pro anorexia’ pages. They taught me how to cheat my parents into thinking I was fine.”

Andy Burrows, NSPCC’s associate head of child safety online, told The Sun yesterday: “We call on the Government to introduce new laws that force social networks to protect children from harmful content and abuse online, and to fine them when they fail.”

Last night Instagram launched an investigation into The Sun’s findings — although a spokeswoman insisted some of the images could be BENEFICIAL to vulnerable users.

She said: “We do not allow content that promotes or glorifies eating disorders, self-harm or suicide and will remove it.

“Mental health is a complex issue and we work closely with experts who advise us on our approach.

“They tell us the sharing of a person’s mental health journey or connecting with others who have battled similar issues can be an important part of recovery.

“This is why we don’t remove certain content and instead offer people looking at, or posting it, support messaging that directs them to groups that can help.”