'I believe it will be transformational,' says information commissioner.
The Information Commissioner’s Office (ICO), an independent regulatory office that aims to “uphold information rights in the public interest”, has published a new Age Appropriate Design Code to protect children online.
The code, the draft of which was first reported in April 2019, is scheduled to come into force by autumn 2021.
It consists of 15 measures that it states will provide better protection for young people when they are spending time online, whether they are using apps, browsing social media platforms or playing online games.
Here are the 15 measures that are being put into place as part of the ICO’s Age Appropriate Design Code:
1. Best interests of the child
In accordance with the United Nations Convention on the Rights of the Child, the age appropriate code emphasises that the “best interests of the child should be a primary consideration”.
2. Data protection impact assessments
This measure outlines that data protection impact assessments must be undertaken by tech firms in order to “identify and minimise the data protection risks of your service – and in particular the specific risks to children who are likely to access your service which arise from your processing of their personal data”.
3. Age-appropriate application
The ICO states that online companies must take the age range of their users into account, explaining that assessing the individual needs of children at various stages of development “should be at the heart of how you design your service and apply this code”.
4. Transparency
Transparency, the code states, “is about being clear, open and honest with your users about what they can expect when they access your online service”.
The ICO adds that acting in a transparent manner is already outlined as part of GDPR, explaining that it is essential when processing people’s personal data.
5. Detrimental use of data
According to the Age Appropriate Design Code, it is important for companies to refrain from using data “that is obviously detrimental to children’s physical or mental health and wellbeing or that goes against industry codes of practice”.
6. Policies and community standards
This measure states that when a tech firm has already published community rules and conditions, it is vital that it sticks to these regulations.
“Keeping to your own standards should also benefit you by giving children and their parents confidence that they can trust your online service with their personal data,” the ICO says.
7. Default settings
The code states that the default privacy settings implemented by tech companies should be “appropriate” for children.
8. Data minimisation
Data minimisation, the ICO explains, means “collecting the minimum amount of personal data that you need to deliver an individual element of your service”.
“It means you cannot collect more data than you need to provide the elements of a service the child actually wants to use,” the organisation adds.
9. Data sharing
The code outlines that taking data sharing into consideration is especially important when it comes to children, as sharing children’s personal data could put them at risk.
“The best interests of the child should be a primary consideration for you whenever you contemplate sharing children’s personal data,” it states.
10. Geolocation
The ICO stresses that the use of children’s geolocation data is of “particular concern”, as having access to the location of a child could pose a threat to their “physical safety”.
“In short it can make children vulnerable to risks such as abduction, physical and mental abuse, sexual abuse and trafficking,” the office writes.
11. Parental controls
The ICO explains that if an online company utilises parental controls, then the child should be made aware of the controls that are in place to regulate their online activity.
12. Profiling
The code says that profiling – which is “any form of automated processing of personal data consisting of the use of personal data to evaluate certain aspects relating to a natural person” – should only be permitted if a company has enforced “appropriate measures” to protect child users.
13. Nudge techniques
Nudge techniques, the ICO explains, are online cues which influence how a user may use a website, such as by encouraging them to click large, colourful buttons.
The organisation states that nudge techniques could be used to encourage children to “select less privacy-enhancing choices when personalising their privacy settings”, thus putting them and their personal data at greater risk.
14. Connected toys and devices
Some children’s toys and devices are designed to be able to connect to the internet, a feature that the ICO says raises “particular issues” due to “their scope for collecting and processing personal data”.
15. Online tools
Online tools are “mechanisms to help children exercise their rights simply and easily”, the code outlines.
https://www.independent.co.uk/life-style/health-and-families/online-safety-children-ico-social-media-age-appropriate-kids-a9296261.html