Social media companies must ‘get a grip’ on removing child abuse images

13 September 2021

Social media companies must not encrypt messages unless they can guarantee they can keep platforms free of illegal content, an inquiry has warned.

The All-Party Parliamentary Group (APPG) on Social Media is calling for companies to step up and do more to protect children from online grooming and sexual abuse.

It launched its inquiry into the “disturbing” rise of so-called “self-generated” child sexual abuse material last November.

The cross-party MPs say the Home Office must review legislation to ensure it is as easy as possible for children to have their images removed from the internet.

Self-generated content can include material filmed using webcams, very often in the child’s own room, and then shared online.

In some cases, children are groomed, deceived or extorted into producing and sharing a sexual image or video of themselves.

The report, Selfie Generation – What’s Behind The Rise Of Self-Generated Indecent Images Of Children?, says the trend “seems to have been exacerbated by the Covid-19 crisis”.

Experts believe the increase in the number of offenders exchanging child sexual abuse material during lockdowns may stimulate demand beyond the pandemic.

The MPs say many witnesses “raised very real concerns” about the impact of encryption on child protection, saying it could “cripple” the ability of programmes to detect illegal imagery.

They write: “The APPG believes it is completely unacceptable for a company to encrypt a service that has many child users.

“Doing this would do so much damage to child protection.

“We recommend that technology companies do not encrypt their services until a workable solution can be found that ensures equivalency with the current arrangements for the detection of this imagery.”

Labour MP Chris Elmore, chairman of the APPG, said social media companies must be more proactive in rooting out abusive images, and must make clear to young users how they can report them.

He said: “It’s high time that we take meaningful action to fix this unacceptable mess.

“Children are daily at real risk of unimaginable cruelty, abuse and, in some instances, death.

“Social media companies are fundamentally failing to discharge their duties, and simply ignoring what should be an obvious moral obligation to keep young users safe.

“They need to get a grip, with institutional re-design, including the introduction of a duty-of-care on the part of companies toward their young users.”

The term “self-generated” should “not be taken to imply that such children have any share in the moral responsibility for their abuse”, he added.

Among 10 recommendations, the report says it should be replaced by “first person produced imagery” to avoid inadvertent victim blaming.

Susie Hargreaves, director of the UK Safer Internet Centre, said: “We see the fallout of abuse and, when children are targeted and made to abuse themselves on camera by criminal adult predators, it has a heart-breaking effect on children and their families.

“There is hope, and there are ways for children and young people to fight back.

“The Report Remove tool we launched this year with Childline empowers young people to have illegal images and videos of themselves removed.”

She added: “New legislation will also help make a difference, and the forthcoming Online Safety Bill is a unique opportunity to make the UK a safer place to be online, particularly for children.”
