Ofcom announces new rules to force tech firms to keep children safe online | Internet safety
Social media and other internet platforms will be legally required to block children’s access to harmful content from July or face large fines, Ofcom has announced.

Tech firms will have to apply the measures by 25 July or risk fines – and in extreme cases being shut down – under the UK’s Online Safety Act.

The communications watchdog published more than 40 measures on Monday covering sites and apps used by children, ranging from social media to search and gaming.

Under the measures:

- the “riskiest” services, which include big social media platforms, must use “highly effective” age checks to identify under-18 users;
- algorithms that recommend content to users must filter out harmful material;
- all sites and apps must have procedures for taking down dangerous content quickly;
- children must have a “straightforward” way to report content.

Melanie Dawes, Ofcom’s chief executive, said the changes were a “reset” for children online and that companies failing to act would face enforcement.

“They will mean safer social media feeds with less harmful and dangerous content, protections from being contacted by strangers and effective age checks on adult content,” she said.

The measures were published as the technology secretary, Peter Kyle, said he was considering a social media curfew for children after TikTok’s introduction of a feature that encourages under-16s to switch off the app after 10pm.

Kyle told the Telegraph he was “watching very carefully” the impact of the wind-down feature.

“These are things I am looking at. I’m not going to act on something that will have a profound impact on every single child in the country without making sure that the evidence supports it – but I am investing in [researching] the evidence,” he said.

Kyle added on Thursday that the new Ofcom codes should be a “watershed moment” that turned the tide on “toxic experiences on these platforms”.

“Growing up in the digital age should mean children can reap the immense benefits of the online world safely, but in recent years too many young people have been exposed to lawless, poisonous environments online which we know can lead to real and sometimes fatal consequences. This cannot continue,” he added.

Online platforms will be required to suppress the spread of harmful content, such as violent, hateful or abusive material and online bullying. More seriously harmful content, including that relating to suicide, self-harm and eating disorders, will need to be kept off children’s feeds entirely, as will pornography.

The online safety campaigner Ian Russell, whose 14-year-old daughter, Molly, ended her life after viewing harmful content online, said the codes were “overly cautious” and put tech company profit ahead of tackling harmful content.

Russell’s charity, the Molly Rose Foundation, argues the codes do not go far enough in moderating suicide and self-harm content or in blocking dangerous online challenges.

He said: “I am dismayed by the lack of ambition in today’s codes. Instead of moving fast to fix things, the painful reality is that Ofcom’s measures will fail to prevent more young deaths like my daughter Molly’s.”
