Tech firms have been put on a year’s notice to introduce reforms that will protect children from harmful content – or face multi-million pound fines.
Elizabeth Denham, the Information Commissioner, has told firms including Facebook, Google and Twitter they have a year to ensure they adhere to a new legally-enforced code that bars them from serving children any content that is “detrimental to their physical or mental health or wellbeing.”
The Government-backed code will be enforced by fines potentially worth billions of pounds and is designed to prevent a repeat of the case of Molly Russell, the 14-year-old who killed herself after viewing self-harm images on Instagram and other sites.
It will also require the companies to safeguard children’s privacy to prevent them being groomed by paedophiles, to curb “addictive” features like notifications that keep them online, and to restrict the firms from using personal information for commercial ends.
In what is a significant milestone in The Daily Telegraph’s campaign for duty of care laws, Ms Denham said: “A generation from now we will all be astonished that there was ever a time when there wasn’t specific regulation to protect kids online.
“This code makes clear that kids are not like adults online, and their data needs greater protections. We want children to be online, learning and playing and experiencing the world, but with the right protections in place.
“We do understand that companies, particularly small businesses, will need support to comply with the code, and that’s why we have taken the decision to give businesses a year to prepare, and why we’re offering help and support.”
Central to the “age appropriate” code is the requirement that platforms must ensure children do not have access to adult material even if they change their default settings.
It covers everything from apps and connected toys, to social media sites and online games, and even educational websites and streaming services.
Firms that breach the code after 2 September 2021 face enforcement action which includes compulsory audits, orders to stop processing children’s data and fines of up to four per cent of global turnover. This would mean almost £2 billion for Facebook and £5 billion for Google.
Profiling – where algorithms use a child’s online history to target them with content they might like – must be switched off by default and can only be used if the firms have measures to protect children from harmful content.
Privacy settings must be set to “high” by default. Geolocators should be off by default, as should optional uses of personal data, behavioural advertising and data sharing.
The code will also curb the firms’ use of “sticky” features such as notifications, continuous scrolling, autoplay or reward loops in games or videos to encourage children to stay online for hours.
It says they must avoid “using personal data in a way that incentivises children to stay engaged” or “automatically extend use instead of requiring children to make an active choice about whether they want to spend their time in this way.”
It will require the introduction of pause buttons that allow children to take a break at any time without losing their progress in a game, or pop-up warnings prompting them to stop.
Features which “use personal data to exploit human susceptibility to reward, anticipatory and pleasure seeking behaviours, or peer pressure” will have to be switched off by default.