More steps needed to protect young from self-harm on social media – psychiatrist

15 December 2020

New laws forcing social networks to act on harmful material shared on their platforms do not go far enough in protecting vulnerable children from self-harm, a leading psychiatrist has warned.

The long-awaited online harms rules, which will be set out on Tuesday, threaten tech giants with multimillion-pound fines and sites being blocked in the UK if they fail to comply.

While social media sites and apps which host user-generated posts will be expected to remove or limit the spread of suicide content, the Government said it is still working with the Law Commission on whether the promotion of self-harm should be made illegal.

Fears about the impact of social media on vulnerable people have increased in recent years amid cases such as that of 14-year-old schoolgirl Molly Russell, who took her own life in 2017 and was found to have viewed harmful images online.

Molly’s father, Ian, who now campaigns for online safety, has previously said the “pushy algorithms” of social media “helped kill my daughter”.

Dr Bernadka Dubicka, chair of the child and adolescent faculty at the Royal College of Psychiatrists, said: “These new rules are a welcome first step but more needs to be done to protect vulnerable children from content that could lead to self-harm or suicide, and also the algorithms that push them to such material.

“As a frontline child psychiatrist, I’m seeing more and more young people affected by harmful online content.

“The Government must make it illegal for companies to allow or promote self-harm content and ensure that breaking the law leads to stiff penalties.

“The Government should rethink their plans and work to include in the legislation a proposal compelling social media companies to hand over anonymised data to researchers.”

More widely, the move has been welcomed for stepping up efforts against dangerous material such as terrorism and child sexual abuse.

“I am pleased that the Government has finally confirmed that it plans to introduce a new duty of care,” said Anne Longfield, Children’s Commissioner for England.

“The signs are that this regulation will have teeth, including strong sanctions for companies found to be in breach of their duties, and a requirement on messaging apps to use technology to identify child abuse and exploitation material when directed to by the regulator.

“However, much will rest on the detail behind these announcements, which we will be looking at closely.”

Susie Hargreaves, chief executive of the Internet Watch Foundation charity which works to remove child sexual abuse images and videos from the internet, warned that regulation alone will not be enough to tackle the problem.

“We welcome the move to introduce a new duty of care and the interim code of practice for tech companies,” she said.

“Regulation alone won’t solve this problem. We need to make sure global efforts to fight child sexual abuse and exploitation material are linked up, and that there is an international response to this most international of threats.”

NSPCC chief executive Peter Wanless said the charity will be “closely scrutinising the proposals”, while Barnardo’s chief executive Javed Khan warned “the devil is in the detail”.
