The UK's data protection watchdog today published a set of design standards for Internet services intended to help protect children's privacy online.
The Information Commissioner's Office (ICO) has been working on the Age Appropriate Design Code since the UK's data protection law was updated in 2018, as part of a government push to create "world-leading" standards for children when they are online.
British lawmakers have grown increasingly concerned about the "datafication" of children when they go online, given they may be too young to legally consent to being tracked and profiled under existing European data protection law.
The ICO's code comprises 15 standards of what it calls "age appropriate design", including a stipulation that settings must be set to "high privacy" by default.
Profiling should also be disabled by default. The code further takes aim at dark pattern UI design used to manipulate users into acting against their own interests, stipulating that "nudge techniques" should not be used to "lead or encourage children to provide unnecessary personal data or weaken or turn off their privacy protections".

"The focus is on providing default settings which ensure children have the best possible access to online services whilst minimising data collection and use, by default," the regulator writes in a summary.

While the design code is aimed at protecting children, it applies to a very broad range of online services. The regulator notes that "the majority of online services that children use are covered" and that "this code applies if children are likely to use your service" [emphasis ours].
This means it can apply to everything from games and social media platforms to fitness apps, educational websites and on-demand streaming services, provided they are available to users in the UK.
"We consider that for a service to be 'likely' to be accessed [by children], the possibility of this happening needs to be more probable than not. This recognises the intention of Parliament to cover services that children use in reality, but does not extend the definition to cover all services that children could possibly access," the ICO adds.
- Best interests of the child: The best interests of the child should be a primary consideration when you design and develop online services that a child is likely to access.
- Data protection impact assessments: Undertake a DPIA to assess and mitigate risks to the rights and freedoms of children who are likely to access your service, which arise from your data processing. Take into account differing ages, capacities and development needs, and ensure that your DPIA builds in compliance with this code.
- Age-appropriate application: Take a risk-based approach to recognising the age of individual users, and ensure you effectively apply the standards in this code to child users. Either establish age with a level of certainty that is appropriate to the risks to the rights and freedoms of children arising from your data processing, or apply the standards in this code to all your users instead.
- Transparency: The privacy information you provide to users, and other published terms, policies and community standards, must be concise, prominent and in clear language suited to the age of the child. Provide additional, bite-sized explanations about how you use personal data at the point that use is activated.
- Detrimental use of data: Do not use children's personal data in ways that have been shown to be detrimental to their wellbeing, or that go against industry codes of practice, other regulatory provisions or government advice.
- Policies and community standards: Uphold your own published terms, policies and community standards (including, but not limited to, privacy policies, age restrictions, codes of conduct and content policies).
- Default settings: Settings must be "high privacy" by default (unless you can demonstrate a compelling reason for a different default setting, taking account of the best interests of the child).
- Data minimization: Collect and retain only the minimum amount of personal data you need to provide the elements of your service in which a child is actively and knowingly engaged. Give children separate choices over which elements they wish to activate.
- Data sharing: Do not disclose children's data unless you can demonstrate a compelling reason to do so, taking account of the best interests of the child.
- Geolocation: Switch geolocation options off by default (unless you can demonstrate a compelling reason for geolocation to be switched on by default, taking account of the best interests of the child). Provide an obvious sign for children when location tracking is active. Options which make a child's location visible to others must default back to "off" at the end of each session.
- Parental controls: If you provide parental controls, give the child age-appropriate information about this. If your online service allows a parent or carer to monitor their child's online activity or track their location, provide an obvious sign to the child when they are being monitored.
- Profiling: Switch options which use profiling "off" by default (unless you can demonstrate a compelling reason for profiling to be on by default, taking account of the best interests of the child). Only allow profiling if you have appropriate measures in place to protect the child from any harmful effects (in particular, being fed content that is detrimental to their health or wellbeing).
- Nudge techniques: Do not use nudge techniques to lead or encourage children to provide unnecessary personal data, or to weaken or turn off their privacy protections.
- Connected toys and devices: If you provide a connected toy or device, ensure you include effective tools to enable conformance to this code.
- Online tools: Provide prominent and accessible tools to help children exercise their data protection rights and report concerns.
The Age Appropriate Design Code also defines children as under the age of 18, which sets a higher bar than current UK data protection law: the Data Protection Act, for example, only sets an age limit of 13 for children to be able to legally give their consent to being tracked online.
So, assuming (very wildly) that Internet services were suddenly to choose to follow the code to the letter, switching off tracking by default and not nudging users into weakening their own privacy protections by manipulating them into revealing more data, the code could, in theory, raise the level of privacy that both children and adults typically get online.
However, the code is not legally binding, so there is a pretty fat chance of that happening.
That said, the regulator points out that the standards in the code are backed by existing data protection laws, which it regulates and can legally enforce (and which already include clear principles such as "privacy by design and default"). It notes that it has powers to take action against lawbreakers, including "severe sanctions" such as orders to stop the processing of data and fines of up to 4% of a company's global annual turnover.
In a way, then, the regulator appears to be saying: "Do you feel lucky, data punk?"
In April last year, the UK government published a whitepaper setting out proposals for regulating a range of online harms, including seeking to address concerns about children accessing inappropriate material online.
The ICO's Age Appropriate Design Code is intended to support those efforts, so there is also the possibility that some of the same provisions could be baked into the planned online harms bill.
"This is not, and will not be, law. It is just a code of practice," said Neil Brown, an Internet, telecoms and tech lawyer at Decoded Legal, discussing the likely impact of the proposed standards. "It shows the direction of the ICO's thinking, and its expectations, and the ICO has to have regard to it when taking enforcement action, but it is not something with which an organisation needs to comply as such. They need to comply with the law, which is the GDPR [General Data Protection Regulation] and the DPA [Data Protection Act] 2018."
"The code of practice sits under the DPA 2018, so companies which fall within the scope of that law may well want to understand what it says. The DPA 2018 and the UK GDPR (the version of the GDPR that will exist after Brexit) cover controllers established in the UK, as well as overseas controllers which target services to people in the UK or monitor the behaviour of people in the UK. Merely making a service available to people in the UK should not be sufficient."
"Overall, this is consistent with the general direction of travel for online services, and the view that more needs to be done to protect children online," Brown also told us.
"Right now, online services should be working out how to comply with the GDPR, the ePrivacy rules, and any other applicable laws. The obligation to comply with those laws does not change because of today's code of practice. Rather, the code of practice shows how the ICO is thinking about what compliance might look like (and it may also gold-plate some of the requirements of the law)."
Organisations which take note of the code, and which are able to demonstrate that they have followed its standards, stand a better chance of persuading the regulator that they have complied with the relevant data protection laws, Brown said.
Conversely, claiming compliance with the law but not with the code is legally possible, he noted, but is likely to require more engagement with the ICO.
Last fall, the government said it remains committed to publishing draft online harms legislation for pre-legislative scrutiny "at pace".
At the same time, however, it dropped a controversial plan included in 2017 digital legislation that would have made age verification mandatory for accessing online pornography, saying it would instead focus on the "most comprehensive approach to protecting children", i.e. via the online harms bill.
It remains to be seen how comprehensive the touted "child protection" will ultimately be.
Brown suggests age verification could turn out to be a "general requirement", given that the age verification component of the Digital Economy Act 2017 was dropped and the government has said it will be swept up into the broader online harms work.
The government has also been consulting with tech companies on possible ways of implementing age verification online.
The difficulties of regulating Internet services, many of which are operated by companies based outside the United Kingdom, have been a major policy concern for years. (And are now caught up in geopolitics.)
Meanwhile, enforcement of existing European digital privacy laws remains, to put it politely, a work in progress…