PEOPLE often talk about the internet as the Wild West. Now, following the death of 14-year-old Molly Russell, it looks like the sheriff has just ridden into town. Yesterday, the digital industries minister, Margot James, told social media companies that they're no longer above the law. But she's going to need a very big gun to take on internet behemoths like Facebook and Google.

It’s surely not before time. The succession of social media scandals goes on and on. Fake news, Cambridge Analytica, electoral manipulation, Russian trolls, filter bubbles, hate speech, misogynistic abuse, revenge porn, data theft, invasion of privacy, tax avoidance and much else. By signing up to social media we’ve helped create an unaccountable algorithmic Frankenstein, the functioning of which even its makers don’t fully understand.

But the effects are only too obvious. Social media use has poisoned public life with anger and division, largely destroyed conventional journalism, created a generation of mobile phone addicts and now represents a very real and present danger to our children.

The Prince's Trust reported this week that the number of young people who believe "life is not worth living" has doubled in a decade. There are disturbing upward trends in suicide among teenagers. And while social media cannot be blamed entirely for this, it has undoubtedly contributed. As Molly Russell discovered, self-harm and suicide groups are just a mobile phone click away on Instagram. Abuse and bullying on social media continue unabated. Many young people experience intense loneliness and alienation as their social relationships become increasingly competitive and image-based.

In what other walk of life would commercial companies like Facebook be permitted to get away with this? Governments have been understandably reluctant to place regulatory handcuffs on web companies. Until very recently, they were considered a social good.

Silicon Valley types don't look like predatory capitalists, after all. They say they just want to help people "get connected". Google's motto used to be "Don't be evil". The internet has been of immense utility to many people, for keeping in touch, organising campaigns and as a source of information. "To Google" has even become a verb.

However, we can no longer ignore the dark side of the web. Later this month, the UK Government is promising a White Paper on regulation of social media. Taming the Wild West web is going to be one of the great challenges of the 21st century. And it’s not clear that national governments, on their own, are up to it.

Global entities like Facebook, which owns Instagram, possess immense wealth and lobbying power, and operate across jurisdictions. Like the banks before the financial crash, they think that they are "too big to fail". Complacent executives believe they are managing borderless entities too nebulous to be regulated. However, they may have to think again. The legal immunities they possess were created by legislation, and laws can be changed.

Yesterday, Ms James declared that social media can no longer rely on "legal protection from liability for user-generated content". This protection derives initially from the United States, where Section 230 of the Communications Decency Act, passed in 1996, exempted internet companies from responsibility for material posted on their websites. So long as they didn't actually generate the content themselves, they would not be regarded as conventional publishers, and so would not be legally responsible for the material. Similar "safe harbour" legislation was passed in Europe.

This has long infuriated newspapers, which remain responsible for material posted on their sites even when they don't generate it. Treating Facebook and Twitter as "platforms" rather than publishers has created many legal anomalies. Following the arrest of Alex Salmond last month, newspapers religiously abided by the Contempt of Court Act and blocked any comment on the case on their websites. Twitter happily carried on publishing users' views for anyone to read.

When Lord McAlpine pursued those on social media who had falsely suggested he was a paedophile, it was the individual tweeters, including the wife of the Commons Speaker, who were sued for defamation. But Twitter itself emerged scot-free. If the same remarks had appeared on a newspaper website, the editor and publisher could have been sued.

It is easy to understand why, in the early days of the internet, regulators were cautious about imposing legal constraints on this new and exciting means of communication. Defenders of freedom of speech – of whom I am one – argued that it would be wrong to inhibit the free flow of information and opinion. Companies like Facebook insisted that they were communication companies, like BT. You wouldn't expect BT to be responsible for everything said in a telephone conversation.

But this was never a sustainable argument. First of all, conversations on the telephone are private. British Telecom does not reproduce them on a website for millions of people to read. Nor does it use the content of conversations to sell ads.

Moreover, Facebook increasingly takes editorial responsibility for what it publishes. Its infamous algorithms monitor your searches and likes and then serve up content in the news feed that corresponds to what it thinks will interest you. It thus curates and edits the news, even if it doesn't have an editor sitting behind a desk crying "hold the front page".

And a succession of scandals over terrorist recruitment ads, hate speech and now child abuse has forced Facebook to agree to take down objectionable content. It hires lots of people to do this – though clearly they don't do a very good job. They lack incentive.

Of course, it is a matter of scale. Facebook and Google are the biggest publishers on the planet, with millions of posts going up every day. But they are also among the wealthiest companies on the planet. Facebook's profits went up 61 per cent last year. They earn billions from the adverts they place around articles. In essence it is very simple: if impoverished newspapers can moderate content on their sites, then so can Facebook.

They might as well accept the inevitable. Public opinion will no longer tolerate internet plutocrats washing their hands of the victims of social media. People will ask: how many more children have to die before these people realise that they have a duty of care?
