The ICO said the code was informed by views and evidence gathered from designers, app developers, academics and civil society. It also spoke to 280 children as part of its research. The proposals build upon the safeguards enshrined in the EU's GDPR legislation. As such, those who fail to comply with the safety standards would face the same punishments, including fines of up to 20 million euros ($23 million) or 4 percent of a company's annual global turnover, whichever is higher.
The ICO previously handed Facebook a £500,000 fine for its part in the Cambridge Analytica scandal. The UK has also promised to hold internet platforms accountable for the content published on their sites — much to the disdain of privacy advocates who’ve warned the vague rules could be used to quash freedom of expression.
How the ICO plans to enforce these latest proposals also remains unclear. A consultation on the draft regulations runs until the end of May, and the final version of the code of practice is expected to come into effect by 2020.
This isn’t the first time social media’s engagement techniques have come under fire in the UK. Last year, a number of ex-Facebook staffers — including Leah Pearlman, co-inventor of Facebook’s Like button — told the BBC the platform had deliberately created features to keep users addicted to its app. Facebook denied the allegations.
UK police have also previously raised child safety concerns over Snapchat’s Snap Maps. To comply with the EU’s GDPR law, Snap said it would no longer store the location history of under-16s. The company’s senior director of international public policy, Stephen Collins, also recently admitted to the UK parliament that its age-verification system wasn’t “foolproof.”