The Children's Code, also known as the Age Appropriate Design Code, protects children up to age 18 when they visit websites or use apps and games.
It contains 15 standards that companies must consider when designing their services and products, especially those aimed at children.
As a risk-based code, it does not require every organisation to implement the standards to the same degree - organisations that use, analyse and profile children's data will have the most to do.
The new statutory code aims to put children's privacy at the heart of online services, pushing organisations to recognise children require special protection when online.
The 15 standards cover different elements of age appropriate design, focused specifically on services and products aimed at children.
These include high privacy default settings and a commitment to collect only the amount of data needed to provide the service the child is using - in other words, data collected through an app should not be repurposed to influence something completely different.
Administered by the Information Commissioner's Office (ICO), the code applies to all major social media and online services used by children in the UK.
The standards are legally enforceable, and the ICO has the power to impose fines on a data controller of up to £17m or 4% of global turnover, whichever is higher.
The 15 Children's Code standards which designers have to consider are:
1. Best interests of the child
2. Data protection impact assessments
3. Age appropriate application
4. Transparency
5. Detrimental use of data
6. Policies and community standards
7. Default settings
8. Data minimisation
9. Data sharing
10. Geolocation
11. Parental controls
12. Profiling
13. Nudge techniques
14. Connected toys and devices
15. Online tools
Organisations have 12 months to transition to the new arrangements, and the ICO will use feedback gathered during this period to create packages of support to help organisations adapt their online products and services by September 2021.
As we explain in our guide to whether the Government can protect children online, there is a long-running debate about how the Government can and should protect kids when they're browsing and gaming online.
Back in 2015, Sky became a pioneer of switching parental controls on by default with their Sky Broadband Shield after it was revealed that only 4% of households were activating network level parental controls when they signed up to a broadband deal.
The ICO's new code takes a different approach by forcing organisations to be proactive about child privacy when they're designing a product or service rather than leaving every judgement to the end user - children or their parents.
The fact that organisations that target children must do more than those that don't makes sense, yet it sits uncomfortably alongside the Government's decision to scrap pornography age verification proposals in October 2019.
As such sites aren't targeted at children, they won't necessarily be covered under the new ICO rules, although they are supposed to employ age verification techniques anyway.
Notably, the new rules do cover social media websites which weren't covered in the age verification proposals. Read our guide on keeping kids safe on social media.