MOLLY Russell was just 14 when she took her own life in November 2017, encouraged, her father believes, by viewing material linked to self-harm and suicide on social media sites including Facebook’s image-sharing platform Instagram.

Though Molly had shown no signs of mental illness in the run-up to her death, Ian Russell said the viewing history his youngest daughter left on the family computer put him in “no doubt that Instagram” had played a part in her death, with around 50 of the accounts she followed seemingly normalising and, by extension, encouraging self-harm. “We went to one [account] Molly was following and what we found was just horrendous,” Mr Russell said in an interview with The Sunday Times. “They seemed to be completely encouraging of self-harm, linking depression to self-harm and to suicide, making it seem inevitable, normal, graphically showing things like cutting, biting, burning, bruising, taking pills. It was there, hiding in plain sight.”

Worse still, when he looked at Molly’s email account he realised that Pinterest - a site usually seen as the place where the middle classes gather images to help them design their homes - had been sending her automated alerts linking through to similarly disturbing material. “In the same way that someone who has shown an interest in a particular sport may be shown more and more posts about that sport, the same can be true of topics such as self-harm or suicide,” Mr Russell said. “This one was sent after she died — there’s a picture of self-harm on it. They were actually emailing her pictures of self-harm, so Pinterest has a huge amount to answer for.”

He is not the only bereaved parent who feels that way, with suicide-prevention charity Papyrus reporting a spike in calls to its UK helpline since Molly’s story was reported last week. The callers, a spokeswoman for the charity said, were “all saying the same thing”, that social media had had a part to play in their children’s deaths.

Given the complex layering of issues that surround any suicide, there can be no absolute certainty that such causality exists. Yet given that young lives are potentially being put at risk, the alternative - assuming there is no causality and taking no action as a result - would seem to be woefully inadequate. The problem is, in the absence of laws setting out what can and cannot be done online, working out what kind of action would be adequate is easier said than done.

As things stand, social media companies have their own policies in place for dealing with content that is deemed to be inappropriate, though as there is no uniformity of approach it is easy to see how some posts can slip through the cracks. And, while Facebook executive Steve Hatch said he was “deeply sorry” that social media had been implicated in Molly Russell’s death, it is also clear that the site is loath to intervene in its users’ activity, with Mr Hatch noting that it will not remove potentially distressing content if it appears to have been posted by someone looking for support from an online community. Instagram and Pinterest take a similar approach, although both have committed to reviewing their enforcement policies.

Unsurprisingly, Molly Russell’s father would like to see things go much further, and he has called for an independent UK regulator to be set up to ensure distressing material is removed from social media sites within 24 hours of being posted. Though health secretary Matt Hancock - who gained experience of the digital world during a two-year stint as a minister in the Department for Culture, Media and Sport - has made similar calls himself in the past, for now he has simply written to the main social media companies urging them to “purge” harmful content from their sites “once and for all”.

How successful this will be remains to be seen, with the onus at this stage being placed on the internet and social media providers themselves to, as Mr Hancock wrote, “ensure the action is as effective as possible”. They have hardly shown willing in this respect in the past, with Facebook in particular being implicated as an enabler of peddlers of fake news, while the other main sites have come under fire for the apparently arbitrary way in which their rules are applied.

And though Mr Hancock has warned that the Government will not be afraid to “introduce new legislation” if it feels social media firms are flouting the demand that harmful content be removed, the parameters of that threat are so ill-defined it is hardly likely to result in serious action from either side. Meanwhile, without a universally accepted understanding of what harmful content is and what needs to be done to remove it, there will always be a danger that more children like Molly Russell will be able to access it.

Given its relative newness and global reach the internet is often seen as a lawless place, but it doesn’t have to be that way. Dangling the threat of legislation in front of social media firms is one thing, but the Government could take action to ensure a rulebook is there for everyone to follow. Surely it owes it to the Russells to do just that?