
The Guardian’s view on social media regulation: The internet must be safer | Editorial

What happened to Molly Russell, who took her own life at the age of 14, was a tragedy. Everything possible must be done to prevent similar events in the future. A coroner linked Molly’s death to internet use after an inquest in London found that she had been inundated with algorithmically driven self-harm material. On the principle that this was wrong, and that social media companies must take more responsibility for what happens on their platforms, there is a good deal of agreement across party lines at Westminster, as well as among the public.

But beyond such painful case studies and some generalizations drawn from them, the consensus begins to break down. It is difficult to legislate to solve the many problems created or exacerbated by social media. Digital technologies move quickly and unpredictably. So far, societies and governments have failed to limit their harmful and destructive uses while exploiting their creative and productive ones. The UK’s online safety bill, which returns to the House of Commons on Tuesday, has already been several years in the making, and further changes are needed before it becomes law. Even then, no one should imagine that the work is done. Instead, the bill must be seen as an awkward step on an arduous journey.

The scope of the new law was significantly altered by last year’s decision to weaken duties relating to the protection of adults from harmful content and to focus instead on children. This was driven by concerns about free speech, particularly the contested nature of “hate” and who should define it. For now, the broader degradation of the public realm by social media, including the amplification of offensive language and imagery, will remain unchallenged. Labour’s Lucy Powell has already said that if the government rejects changes aimed at increasing the accountability of tech companies, Labour will seek to make them in the future.

There is a good chance that an amendment called for by backbenchers, which would make individual managers criminally liable for breaches of child protection, will be accepted. It is important, not least to send a clear signal that the public will understand. But the prospect of prosecution must be part of a wider framework of sanctions that force digital companies to put children’s safety first. Until now they have got away with treating this as someone else’s problem.

It is only because the coroner in Molly Russell’s case forced Meta and others to produce evidence that we know what we do about her case. It took the courage of a whistleblower, Frances Haugen, to reveal that Facebook (now Meta) knew Instagram made teenage girls feel worse about their bodies. This week Ian Russell, Molly’s father, described the platforms’ response to the coroner’s findings in his daughter’s case as “business as usual”.

This laissez-faire approach must end. While the Conservative Party was distracted by infighting throughout most of 2022, children using the internet faced what Peter Wanless, the head of the NSPCC, calls “sexual abuse on an industrial scale”. Ofcom needs new powers to act for bereaved parents. Individuals must be empowered to file complaints. And the bill must be comprehensive, so that platforms cannot place themselves and their algorithms outside its scope.

Media companies have been associated with harm as well as good in the past. Never before have they pushed themselves so aggressively towards children while being so careless about the effects. The online security bill must rewrite the rules and deliver an ultimatum.
