Tech platforms could be forced to prevent illegal content from going viral and to limit the ability for people to send virtual gifts to, or record, a child's livestream, under further online safety measures proposed by Ofcom.

The UK regulator published a consultation on Monday seeking views on additional protections to keep citizens, particularly children, safer online.

These could also include making some larger platforms assess whether they need to proactively detect terrorist material under further online safety measures.

Oliver Griffiths, online safety group director at Ofcom, said its proposed measures seek to build on existing UK online safety rules while keeping up with "constantly evolving" risks.

"We're holding platforms to account and launching swift enforcement action where we have concerns," he said.

"But technology and harms are constantly evolving, and we're always looking at how we can make life safer online."

The consultation highlighted three main areas in which Ofcom thinks more could be done:

- stopping illegal content going viral
- tackling harms at source
- giving further protections to children

The BBC has approached TikTok, livestreaming platform Twitch and Meta - which owns Instagram, Facebook and Threads - for comment.

Ofcom's proposals target a number of issues - from intimate image abuse to the danger of people witnessing physical harm on livestreams - and vary in what kind or size of platform they would apply to.

For example, proposals that providers have a mechanism to let users report a livestream if its content "depicts the risk of imminent physical harm" would apply to all user-to-user sites that allow a single user to livestream to many, where there may be a risk of showing illegal activity.

Meanwhile, potential requirements for platforms to use proactive technology to detect content deemed harmful to children would only apply to the largest tech firms, which present higher risks of relevant harms.

The proposals put forward by Ofcom look to expand on the measures already in place to try to improve online safety.

Some platforms have already taken steps to clamp down on features that experts have warned may expose children to grooming, such as livestreaming.

In 2022, TikTok raised its minimum age for going live on the platform from 16 to 18 - shortly after a BBC investigation found hundreds of accounts going live from Syrian refugee camps with children begging for donations.

YouTube recently said it would increase its age threshold for users to livestream to 16, from 22 July.

But some groups say the regulator's potential new requirements highlight core issues with the Online Safety Act - the UK's sweeping rules that Ofcom is tasked with enforcing.

"Further measures are always welcome but they will not address the systemic weaknesses in the Online Safety Act," said Ian Russell, chair of the Molly Rose Foundation - an organisation set up in memory of his 14-year-old daughter Molly Russell, who took her own life after viewing thousands of images promoting suicide and self-harm.

"As long as the focus is on sticking plasters, not comprehensive solutions, regulation will fail to keep up with current levels of harm and major new suicide and self-harm threats," Mr Russell said.

He added that Ofcom showed a "lack of ambition" in its approach to regulation.

"It's time for the prime minister to intervene and introduce a strengthened Online Safety Act that can tackle preventable harm head on by fully compelling companies to identify and fix all the risks posed by their platforms."

Leanda Barrington-Leach, executive director of children's rights charity 5Rights, said the regulator should require companies to "think more holistically" about safeguards for children, rather than mandate "incremental changes".

"Children's safety should be embedded into tech companies' design of features and functionalities from the outset," she said.

But the NSPCC's Rani Govender said Ofcom's move to require more safeguards for livestreaming "could make a real difference to protecting children in these high-risk areas".

The consultation is open until 20 October 2025, and Ofcom hopes to receive feedback from service providers, civil society, law enforcement and members of the public.

Additional reporting by Chris Vallance