UK data regulator tackles porn sites over children’s access
The UK’s Information Commissioner’s Office has pledged to crack down on porn sites and other adult-only services to ensure they are taking steps, such as verifying users’ ages, to prevent children’s access, the regulator said Sept 2.

The new enforcement plans are a reversal for the ICO, which had previously maintained that services aimed at adults weren’t subject to the Children’s Code or Age Appropriate Design Code, a set of rules that guide how the UK Data Protection Act should be applied to digital services for children. It also follows a Bloomberg News report last month that showed the regulator hadn’t enforced a single child-protection case in the two years since the Children’s Code came into effect.

Companies that break the rules can face fines of as much as 4% of annual global revenue.

The data watchdog changed its position after facing a legal challenge from 11 civil-society groups in 2021, said John Edwards, who became Information Commissioner in January 2022. Those groups argued that the wording of the code covers all services “likely to be accessed” by children and not just services aimed at children.

The organisations – including the National Society for the Prevention of Cruelty to Children, children’s charity Barnardo’s and 5Rights, which advocates for child-safe digital design – said that allowing children to access adult-only services poses data-protection risks in addition to exposing them to harmful content.

“We have revised our position,” Edwards said. “We now accept that if there are a significant number of children accessing the sites, they are in the aegis of the code.”
Child online safety groups welcomed the announcement.

“It’s brilliant news if the ICO is going to move against these porn sites who have known for a very long time that what they are doing is wrong, but never felt any legal pressure to do anything,” said John Carr, from the Children’s Charities Coalition on Internet Safety, which spearheaded the legal challenge. “These sites shouldn’t be processing children’s data.”

The ICO will engage with adult-only services, such as Pornhub and xHamster, to make it clear that they must comply with the code by preventing children’s access. If a company can show its services are not accessed by a significant number of children, Edwards said, it would not be subject to the code. The regulator had yet to determine the threshold for what it would consider a “significant” number of children, he added.

Pornhub and xHamster did not respond to requests for comment.

Plans to force porn sites to verify the ages of visitors were first introduced in the UK through the Digital Economy Act of 2017, but were dropped two years later after criticism from privacy campaigners over the prospect of a database of porn users. Age assurance for adult sites is also a provision in the Online Safety Bill, a draft regulation that would compel websites to protect children from harmful content; however, that won’t be enforced until 2025, according to Ofcom’s road map.

The ICO’s announcement coincides with the second anniversary of the Children’s Code, which came into force in September 2020 with a 12-month grace period.

The regulator said it has been investigating four companies for non-compliance with the code, and auditing an additional nine. A spokeswoman declined to name any of the 13 companies.

Many platforms including Instagram, Facebook, YouTube and Google have made voluntary changes to children’s accounts to comply with the code. Changes include disabling personalised ads, preventing adults from messaging minors, defaulting accounts to private and switching off autoplay.

“The result is that children are better protected online in 2022 than they were in 2021,” said Edwards. – Bloomberg