
Credit: Pixabay/CC0 Public Domain
A new UK age verification measure to prevent children from accessing harmful online content came into effect on Friday, with campaigners celebrating a “milestone” in their long-standing battle for stronger regulations.
Under new rules enforced by the UK media watchdog, websites and apps that host potentially harmful content will be responsible for age checks using means such as facial imagery and credit cards.
About 6,000 porn sites have agreed to implement the curbs, according to Melanie Dawes, chief executive of the British regulator Ofcom.
Other platforms such as X, which faces a dispute over similar restrictions in Ireland, must also protect children from illegal pornography, as well as hateful and violent content, she noted.
“We did work that other regulators didn’t do,” Dawes told BBC Radio.
“These systems can work. We have investigated that,” she said.
According to Ofcom, about half a million young people aged 8 to 14 encountered pornography online last month.
The much-anticipated new rules, which stem from the 2023 Online Safety Act, aim to prevent minors from encountering content related to suicide, self-harm, eating disorders and pornography.
The act places a legal responsibility on tech companies to better protect children and adults online, and mandates sanctions for those that fall short.
According to the government, rule breakers face fines of up to £18 million ($23 million) or 10% of global revenue, whichever is greater.
It also allows for criminal proceedings against senior managers who fail to ensure that their companies comply with Ofcom information requests.
The measures are coming into force now, after the sector and the regulator were given time to prepare.
“A different internet”
Children are experiencing “a different internet for the first time,” technology secretary Peter Kyle told Sky News, adding that he had “very high expectations” for the changes.
In an interview with the parenting forum Mumsnet, he also apologized to young people who have already been exposed to harmful content.
“I would like to apologise to any child over the age of 13 who has not had these protections,” Kyle said.
“It’s a really important milestone to see tech companies having to take responsibility for making their services safe for children,” said Rani Govender of the child protection charity NSPCC.
Children often “stumble across this harmful and dangerous content,” she told BBC News.
“There will be loopholes,” Govender acknowledged, but she said “much stronger rules” are now in place to prevent this from continuing to happen.
The government of Prime Minister Keir Starmer is also considering introducing a two-hour daily limit for children on social media apps.
Kyle said he would announce further plans for regulating the sector for under-16s “in the near future.”
©2025 AFP
Citation: UK launches online checks to stop children accessing harmful content (2025, July 25), retrieved July 25, 2025 from https://techxplore.com/news/2025-uk-online-children-accessing-content.html
This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.
