Age-appropriate design
The Information Commissioner’s Office (ICO) is the UK’s independent authority set up to uphold information rights in the public interest, promoting openness by public bodies and data privacy for individuals. It has just released new practical guidance on age-appropriate design for apps and services likely to be accessed by children, which includes educational games, so this is important to us as developers. The document is worth reading in full, but I have added some notes below on how each of the code standards might apply to educational game developers.
The Code Standards
#1 Best interests of the child
The best interests of the child should be a primary consideration when you design and develop online services likely to be accessed by a child.
I’m glad they put this first, but I’m sure it also seems a bit of a no-brainer for anyone reading this.
#2 Age-appropriate application
Consider the age range of your audience and the needs of children of different ages. Apply the standards in this code to all users, unless you have robust age-verification mechanisms to distinguish adults from children.
We might initially read this as “make sure your content is appropriate for the target age group,” but it is really about assuming the user is as young as the youngest likely user, so that you make the right decisions about their privacy and data. If you have a game that is played by all sorts of age groups but you are targeting 15-20 year-olds, you might be tempted to build an app that treats a 15-year-old as the youngest user. This standard reminds you that if you do not have robust age verification, you should assume some of your users will be younger than 15, and account for users below 13 when you build your user interface and privacy defaults.
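As a rough illustration, here is a minimal sketch of that assumption in TypeScript; the `AgeBand`, `resolveAgeBand` and `applyDefaultsFor` names are hypothetical, not from any real SDK: without verified age information, the app falls back to the strictest defaults.

```typescript
// Hypothetical names for illustration only.
type AgeBand = "under13" | "13to15" | "16plus";

interface UserProfile {
  verifiedAge?: number; // present only if a robust age-verification step succeeded
}

function resolveAgeBand(user: UserProfile): AgeBand {
  if (user.verifiedAge === undefined) {
    // No robust verification: treat the user as the youngest we might plausibly have.
    return "under13";
  }
  if (user.verifiedAge < 13) return "under13";
  if (user.verifiedAge < 16) return "13to15";
  return "16plus";
}

function applyDefaultsFor(band: AgeBand) {
  // Younger bands get stricter data-handling defaults.
  return {
    personalisedAds: band === "16plus",
    publicProfile: false,
    analytics: band !== "under13",
  };
}

console.log(applyDefaultsFor(resolveAgeBand({}))); // unverified user → strictest defaults
```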
#3 Transparency
The privacy information you provide to users, and other published terms, policies and community standards, must be concise, prominent and in clear language suited to the age of the child. Provide additional specific ‘bite-sized’ explanations about how you use personal data at the point that use is activated.
You should already have a full privacy policy, linked from your app, your website and any app stores you are in. In addition to that, this guidance asks you to add language that the users of your app can understand at the moment they make choices. This means that any time a user chooses to change their privacy settings or to publicly publish information about themselves, you should show a dialog that explains the effect of that choice and verifies that they understand it.
I personally love this, and I love seeing it in applications! When I find an app that values my privacy, I’m much more likely to trust them and do business with them. This is a great way to build trust with your users. It is unfortunate when parental privacy settings are just checkboxes and short sentences that most users cannot understand. Not only does it limit the parent’s ability to manage their children’s privacy, but it also makes them less likely to manage privacy at all. This is why this guideline is so important.
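As a sketch of what a point-of-use explanation might look like, here is a hypothetical TypeScript helper; the `explainAndConfirm` and `showDialog` names are illustrative, and in a real app the dialog would be rendered with your own age-appropriate UI:

```typescript
// Hypothetical helper for point-of-use, child-friendly explanations.
interface PrivacyChange {
  setting: string;
  plainExplanation: string; // short wording suited to the age of the child
}

async function explainAndConfirm(
  change: PrivacyChange,
  showDialog: (message: string) => Promise<boolean> // your app's own dialog component
): Promise<boolean> {
  // The explanation appears right when the choice is made, not buried in a policy page.
  return showDialog(`${change.plainExplanation}\n\nDo you want to turn this on?`);
}

async function enablePublicLeaderboard(showDialog: (message: string) => Promise<boolean>) {
  const agreed = await explainAndConfirm(
    {
      setting: "publicLeaderboard",
      plainExplanation:
        "If you join the leaderboard, other players will see your nickname and your scores.",
    },
    showDialog
  );
  if (!agreed) return; // leave the setting off unless the child explicitly agrees
  // ...persist the setting only after this informed, explicit choice...
}
```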
#4 Detrimental use of data
Do not use children’s personal data in ways that have been shown to be detrimental to their wellbeing, or that go against industry codes of practice, other regulatory provisions or Government advice.
At first this sounds obvious: don’t harm your users. But the guidance gets very specific here, and this is one that can be confusing. For instance, they state:
Strategies used to extend user engagement, sometimes referred to as ‘sticky’ features, include mechanisms such as reward loops, continuous scrolling, notifications and auto-play features which encourage users to continue playing a game, watching video content or otherwise staying online.
This can feel like a very thin line between building a fun, engaging app and protecting your users’ best interests. Most games use notifications, and most users like them, so it is not as simple as disabling notifications. This is a great reminder to really dig in on your use of motivational features in your games. One good practice is to turn notifications off by default, but let the user know that they can be turned on (see #3 above). To me, though, this is a difficult choice. Take, for example, a “Learn Spanish” app with a daily reminder to spend time learning Spanish: is that harming the user? I certainly do not think so. You have to think about this constantly while building your app, consider users who might be susceptible to that kind of pressure, and make sure you are not making life more challenging for them. I think we can all imagine a 10-year-old whose mother would like him to put down a device, and he calls back, “But mom, I need to finish this level!” Add a pause button to your app, and make families happier!
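As a sketch of the notifications-off-by-default practice, assuming a hypothetical settings store, the reminder only exists once the user explicitly asks for it:

```typescript
// Hypothetical settings shape: reminders are opt-in, never on by default.
interface ReminderSettings {
  dailyReminderEnabled: boolean;
  reminderHour?: number; // only meaningful once the user has opted in
}

const defaultReminderSettings: ReminderSettings = {
  dailyReminderEnabled: false, // off until the user asks for it and is told what it does (see #3)
};

// Called only from an explicit user action, e.g. "Remind me to practise every day".
function enableDailyReminder(settings: ReminderSettings, hour: number): ReminderSettings {
  return { ...settings, dailyReminderEnabled: true, reminderHour: hour };
}

function shouldScheduleReminder(settings: ReminderSettings): boolean {
  // Nothing gets scheduled unless the user opted in.
  return settings.dailyReminderEnabled && settings.reminderHour !== undefined;
}
```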
#5 Policies and community standards
Uphold your own published terms, policies and community standards (including but not limited to privacy policies, age restriction, behaviour rules and content policies).
This is very straightforward: do what you say you do in your privacy policy and in your in-app dialogs.
#6 Default settings
Settings must be ‘high privacy’ by default (unless you can demonstrate a compelling reason for a different default setting, taking account of the best interests of the child).
This is a great idea, and I suspect most apps are currently built with ‘low privacy’ by default. Paired with #3, this can really improve users’ trust in your application. Build the app so that it does not save or share any of their information, and then, at the first point of use where you would need to save their data or share it with your services or other users, show the dialog and ask them to consent. Another, less obtrusive strategy is to show that dialog when the app first starts or during user registration. Either way, give them the ability to make the decision before you share their information.
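Here is a minimal sketch of ‘high privacy’ defaults in TypeScript; the setting names and the `requestConsent` callback are illustrative assumptions, not a prescribed implementation:

```typescript
// Illustrative setting names; every data-saving or data-sharing option starts off.
interface PrivacySettings {
  saveProgressToCloud: boolean;
  shareActivityWithOthers: boolean;
  personalisedRecommendations: boolean;
}

const defaultPrivacySettings: PrivacySettings = {
  saveProgressToCloud: false,
  shareActivityWithOthers: false,
  personalisedRecommendations: false,
};

async function saveProgress(
  settings: PrivacySettings,
  requestConsent: () => Promise<boolean> // shows the bite-sized explanation from #3
): Promise<PrivacySettings> {
  if (!settings.saveProgressToCloud) {
    // First point of use: explain and ask before any data leaves the device.
    const agreed = await requestConsent();
    if (!agreed) return settings; // keep everything local
    settings = { ...settings, saveProgressToCloud: true };
  }
  // ...upload progress only after consent has been given...
  return settings;
}
```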
#7 Data minimisation
Collect and retain only the minimum amount of personal data you need to provide the elements of your service in which a child is actively and knowingly engaged. Give children separate choices over which elements they wish to activate.
I thought this was simply “don’t collect what you do not need.” A typical example is asking for a user’s full name when you never use it in the services you provide. Don’t ask for what you don’t need, since that is just one more piece of information that you store and have to protect. The code extends this further, to cover information that I think is also addressed by #6 above, such as not keeping a history of searches without asking to keep it. But even in that example, if you use search history in your app, you might decide to keep only six months of history instead of all of it. With each piece of information you store, ask yourself if you need it.
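The six-month search-history idea could look something like this minimal sketch; `SearchEntry` and `pruneSearchHistory` are hypothetical names:

```typescript
// Hypothetical shape for stored searches.
interface SearchEntry {
  query: string;
  searchedAt: Date;
}

const SIX_MONTHS_MS = 1000 * 60 * 60 * 24 * 30 * 6; // rough six-month window

function pruneSearchHistory(history: SearchEntry[], now: Date = new Date()): SearchEntry[] {
  // Retain only entries newer than six months; older entries are deleted outright,
  // so there is less personal data to store and protect.
  return history.filter(entry => now.getTime() - entry.searchedAt.getTime() < SIX_MONTHS_MS);
}
```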
#8 Data sharing
Do not disclose children’s data unless you can demonstrate a compelling reason to do so, taking account of the best interests of the child.
This is mostly targeted at third-party integrations, and the first one that comes to mind is ad services. Unfortunately, ads are the main source of revenue for free apps, which makes this harder to comply with if your apps are supported by ad revenue. Offering an alternative, paid version of your app with no ads is a common strategy, but it seems like it needs to be paired with #7 above. You would offer a free version of your app, potentially with ads, that captures no data, and when the user tries to use a feature that requires their data to be persisted, the app could offer to do so in the paid version. That is just one strategy; clearly this is difficult to work with when you are supported by ad revenue. They do give some concrete non-ad examples, such as:
Data sharing can be done routinely (for example the provider of an educational app routinely sharing data with the child’s school) or in response to a one-off or emergency situation (for example sharing a child’s personal data with the police for safeguarding reasons).
And these are much easier to build towards. These examples make it clear that we are talking about sharing the user’s data with third parties in general, and not just ad services.
#9 Geolocation
Switch geolocation options off by default (unless you can demonstrate a compelling reason for geolocation, taking account of the best interests of the child), and provide an obvious sign for children when location tracking is active. Options which make a child’s location visible to others must default back to off at the end of each session.
This is a pretty narrow use case, and I think most applications would not be affected by it. If you do use geolocation, switch it off by default, show an obvious sign when location tracking is active, and make sure any option that makes a child’s location visible to others defaults back to off at the end of each session.
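A minimal sketch of session-scoped geolocation, assuming hypothetical helpers: options start off, an obvious indicator is shown while tracking is active, and everything resets to off when the session ends.

```typescript
// Hypothetical session-scoped location state.
interface LocationState {
  trackingActive: boolean;
  visibleToOthers: boolean;
}

function startSession(): LocationState {
  // Geolocation options start switched off every session.
  return { trackingActive: false, visibleToOthers: false };
}

function enableTracking(state: LocationState, showIndicator: () => void): LocationState {
  showIndicator(); // an obvious, persistent sign that location tracking is active
  return { ...state, trackingActive: true };
}

function endSession(): LocationState {
  // Options that make the child's location visible to others default back to off.
  return { trackingActive: false, visibleToOthers: false };
}
```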
#10 Parental controls
If you provide parental controls, give the child age appropriate information about this. If your online service allows a parent or carer to monitor their child’s online activity or track their location, provide an obvious sign to the child when they are being monitored.
You should let your users know when you are sending information about them to someone else. For instance, when a user’s performance is measured and that measurement is shown to their parent, the child should be told that this information was shared. I think this would be new to most applications, especially learning applications. Although we usually strive to ensure that the student and the parent or teacher see consistent information, we do not always tell the user what information is being shared with the teacher. According to the ICO:
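As a sketch of that idea, assuming a hypothetical reporting flow: whenever a measurement is shared with a parent or teacher, the child sees the same information and an explicit note that it was shared.

```typescript
// Hypothetical reporting flow: the child is always told what was shared, and with whom.
interface AssessmentResult {
  skill: string;
  score: number;
}

type Recipient = "parent" | "teacher";

function shareResult(result: AssessmentResult, recipient: Recipient) {
  notifyChild(result, recipient); // the child sees the same measurement and a sharing notice
  // ...send the result to the parent/teacher dashboard...
}

function notifyChild(result: AssessmentResult, recipient: Recipient) {
  // Placeholder for an in-app message, e.g. "Your fractions score was shared with your teacher."
  console.log(`We shared your ${result.skill} score (${result.score}) with your ${recipient}.`);
}
```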
Children who are subject to persistent parental monitoring may have a diminished sense of their own private space which may affect the development of their sense of their own identity. This is particularly the case as the child matures and their expectation of privacy increases.
That really makes it clear as to why you would want to ensure that they understand what information is shared, and what is not. I worked recently on an app that had a built-in assessment, so some choices in the game
#11 Profiling
Switch options which use profiling off by default (unless you can demonstrate a compelling reason for profiling, taking account of the best interests of the child). Only allow profiling if you have appropriate measures in place to protect the child from any harmful effects (in particular, being fed content that is detrimental to their health or wellbeing).
Again, ads are a problem, but in this case they are not the worst case. The examples include building a user profile that subtly drives a user towards content that is bad for their health. This is great advice for all apps: please help humanity survive click-bait by not showing it at all, or, if you do, by not tracking what kind of click-bait the user clicks on.
If your app uses an algorithm to profile the user or to surface other users’ content, be mindful that the algorithm can be gamed: if public content is being shared, users can drive traffic to their own content by manipulating the profile builder. This is something to be aware of as you architect your application. If you are building the algorithms, you should also build in reports so that you can see trends and fix problems as they arise.
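A simple trend report could start as small as this sketch, which counts how often each piece of public content is surfaced per day so that sudden spikes (one sign the recommender is being gamed) stand out; the event shape is an assumption:

```typescript
// Hypothetical impression event emitted whenever content is surfaced to a user.
interface ImpressionEvent {
  contentId: string;
  day: string; // e.g. an ISO date such as "2024-05-01"
}

// contentId -> day -> impression count; sudden per-day spikes are worth investigating.
function impressionsPerDay(events: ImpressionEvent[]): Map<string, Map<string, number>> {
  const report = new Map<string, Map<string, number>>();
  for (const { contentId, day } of events) {
    const perDay = report.get(contentId) ?? new Map<string, number>();
    perDay.set(day, (perDay.get(day) ?? 0) + 1);
    report.set(contentId, perDay);
  }
  return report;
}
```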
#12 Nudge techniques
Do not use nudge techniques to lead or encourage children to provide unnecessary personal data, weaken or turn off their privacy protections, or extend their use.
You should not use nudges to steer people towards weakening their privacy settings or towards features that would do so. That seems clear and reasonable. However, they also mention this:
Reward loops or positive reinforcement techniques (such as likes and streaks) can also nudge or encourage users to stay actively engaged with a service, allowing the online service to collect more personal data.
I interpret this to mean that nudges are not inherently bad, but you should not use them for bad reasons, and again, for products supported by ad revenue, this can be a difficult choice. They are not saying “don’t use streaks and likes to encourage activity”; they are saying don’t use them to encourage activity that is designed to harm your users.
#13 Connected toys and devices
If you provide a connected toy or device ensure you include effective tools to enable compliance with this code.
This means you should provide the same parental and user privacy settings in an app so that the users of your devices can make smart decisions about how their data is used. This is pretty clear, but not necessarily easy to do. Many devices in this category might not be very functional if you have to first go to a website to approve data sharing, and some do not have robust user interfaces, so they really require registration and privacy setup prior to first use.
#14 Online tools
Provide prominent and accessible tools to help children exercise their data protection rights and report concerns.
This is easy to say, but it implies that you have tools for all the various rights users can exercise, and you may not. For example, a “delete all my personal data” tool: it is a great idea, but if you did not build with it in mind, it can be challenging to provide, and scary to offer to a young user who may tap it in error.
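A “delete all my personal data” flow might look like this sketch, with a confirmation step so a young user cannot trigger it with a single accidental tap; `deleteAllPersonalData` is a hypothetical placeholder for your own backend call:

```typescript
// Hypothetical deletion flow with an explicit confirmation step.
async function requestDataDeletion(
  userId: string,
  confirmWithUser: (message: string) => Promise<boolean> // your own child-friendly dialog
): Promise<"deleted" | "cancelled"> {
  const confirmed = await confirmWithUser(
    "This will delete everything we have saved about you, including your progress. " +
      "You cannot undo this. Do you want to continue?"
  );
  if (!confirmed) return "cancelled";

  await deleteAllPersonalData(userId); // placeholder for your backend's deletion endpoint
  return "deleted";
}

async function deleteAllPersonalData(userId: string): Promise<void> {
  // In a real service this would remove or anonymise every record tied to userId.
  console.log(`Deleting all personal data for ${userId}...`);
}
```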
#15 Data protection impact assessments
Undertake a DPIA specifically to assess and mitigate risks to children who are likely to access your service, taking into account differing ages, capacities and development needs. Ensure that your DPIA builds in compliance with this code.
If you are trying to comply with GDPR and are not sure what a DPIA (data protection impact assessment) is, here is a handy guide. This item is quite complex, and if it applies to you, you really need to dig in.
#16 Governance and accountability
Ensure you have policies and procedures in place which demonstrate how you comply with data protection obligations, including data protection training for all staff involved in the design and development of online services likely to be accessed by children. Ensure that your policies, procedures, and terms of service demonstrate compliance with the provisions of this code.
In my opinion, this is the kind of item that makes people want to stop making free software for children, because it can seem like offering these applications puts you at risk. I’m not saying that is not true; there is risk. But I hope that, if you follow the guidance above, you have already created an app that is awesome, and this is just a reminder to document your process as you go and to share that with your users.
Conclusion
This is quite a lot to take in, but most of it is great guidance for building amazing experiences for our most important people! Before planning a new app, review these standards and build with them in mind; it is much easier to build them in from the start than to retrofit them later. Users’ data is very important to the success of our applications and to our users, so we have to take it seriously and build great apps.