Terror watchdog condemns WhatsApp for lowering UK users’ minimum age to 13


The UK’s terror watchdog has criticised Mark Zuckerberg’s Meta for lowering the minimum age for WhatsApp users from 16 to 13, warning that the “extraordinary” move could expose more teenagers to extreme content.

Jonathan Hall KC said more children could now access material that Meta cannot regulate, including content related to terror or sexual exploitation.

[Image: Jonathan Hall, who said the move was ‘an extraordinary thing to do’.]

Hall, the independent reviewer of terrorism legislation, told LBC radio that the use of end-to-end encryption on WhatsApp – which means only the sender and receiver can see the messages on the app – left Meta unable to take down dangerous material.

“So by lowering the age of the user from 16 to 13 for WhatsApp, effectively they’re exposing three more years within that age group … to content that they cannot regulate,” he said. “So, to me, that’s an extraordinary thing to do.”

Hall added that children had become increasingly susceptible to terror content, pointing to a record number of arrests of children last year.

“We’ve had 42 children arrested last calendar year. It’s a huge number, the biggest ever. It’s now clear that children [are] particularly susceptible to terror content, children who are particularly unhappy … they’re a round peg in a square hole,” he said. “They are looking for meaning in their lives and they find it. And it could be an extremist identity.”

WhatsApp announced the age change for the UK and EU in February and it came into force on Wednesday. The platform said the change brought the UK and EU age limit in line with other countries, and that protections were in place.

However, child safety campaigners also criticised the decision. The group Smartphone Free Childhood said the move “flies in the face of the growing national demand for big tech to do more to protect our children”.

Concerns over illegal content on WhatsApp and other messaging platforms made end-to-end encryption a battleground in the Online Safety Act, which empowers the communications regulator, Ofcom, to order a messaging service to use “accredited technology” to look for and take down child sexual abuse material.

The government has attempted to play down the provision, saying Ofcom would only be able to intervene if scanning content was “technically feasible” and if the process met minimum standards of privacy and accuracy.

In December, Meta announced it was rolling out end-to-end encryption on its Messenger app, with Instagram expected to follow.

