Ganna Pogrebna

Behavioural Data Science and Protecting Children Online

The use of educational technology (EdTech) has risen significantly over the past few years, but with it come concerning data privacy risks. A recent investigation revealed that 89% of EdTech products used in schools around Australia may have put children's privacy at risk by tracking their online activity and sharing their data with advertisers. Behavioural data science is a powerful tool that can help predict, diagnose, and alleviate privacy risks for children and adults alike.

Humans and algorithms interact daily, and "algorithmic manipulation" has become an important problem: algorithms drive human decision-making in many areas, from the choice of articles to read and social media posts to engage with, to loan decisions that are often based on "decision support" recommendations from algorithms. Children are particularly vulnerable to algorithmic manipulation because they cannot give informed consent to how their information is used and often rely on choices made by their schools or parents. Many EdTech tools track children's online activities and pass the data to third parties without their knowledge, exposing children to unsolicited advertising and handing more of their valuable data to unknown parties.

Under these circumstances, a behavioural data science approach can be applied to help protect children from EdTech privacy risks. Specifically, behavioural data science tools (which consider human, algorithmic, and systems factors) can identify four broad risk groupings:

  • Lack of informed consent and 'no privacy' defaults: collecting and sharing children's data without their informed consent raises concerns. Children don't have the experience to appreciate the risks associated with their data being shared with advertisers, making it challenging for them to make informed decisions about their privacy. As a result, children may become used to 'no privacy' defaults and may not value their privacy as much as older generations by the time they become adults.

  • Lack of choice 'not to engage' with technology: children have no choice when it comes to EdTech tools, because their schools impose these tools on them. The tools track children's online activities and pass the data to third parties without their knowledge. Children are then subjected to unsolicited advertisements that encourage them to buy things they don't need and to share more personal information without much thought.

  • Lack of ability to stop and reflect: children, like adults, fall into the recursive pattern of clicking on things offered to them by various algorithms or accepting algorithmic manipulation without understanding how the algorithms work. This issue is far more severe for children, who are often unaware of the substantial impact machines have on their day-to-day behaviour.

  • Adverse cybersecurity implications - the privacy paradox: when people are asked about their attitude towards privacy, they tend to say that they care about and value their privacy. However, when faced with trading some of that privacy for receiving a digital service, they often sacrifice it without much reflection. The same is true for children, who are more vulnerable to algorithmic decision-making manipulation than adults.
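The four risk groupings above can be thought of as factors in a simple privacy-risk assessment. The sketch below is a hypothetical illustration only: the factor names, weights, and thresholds are invented for demonstration, not taken from the EdTech Privacy Barometer or any real assessment methodology, which would rest on audited evidence rather than toy weights.

```python
# Toy privacy-risk scoring for an EdTech tool, keyed to the four
# risk groupings discussed above. All weights are illustrative.

RISK_FACTORS = {
    "no_informed_consent": 0.30,   # data collected/shared without informed consent
    "no_opt_out": 0.25,            # tool is imposed; no choice 'not to engage'
    "dark_patterns": 0.25,         # design that discourages stopping to reflect
    "third_party_sharing": 0.20,   # privacy-paradox exposure via advertisers
}

def privacy_risk_score(findings: dict) -> float:
    """Return a 0-1 score: the weighted share of risk factors present."""
    return sum(w for factor, w in RISK_FACTORS.items()
               if findings.get(factor, False))

def risk_band(score: float) -> str:
    """Map a score onto a coarse band (thresholds are arbitrary)."""
    if score >= 0.6:
        return "high"
    if score >= 0.3:
        return "medium"
    return "low"

# Example: a tool that shares data with advertisers and lacks consent
# mechanisms scores 0.30 + 0.20 = 0.5, landing in the "medium" band.
score = privacy_risk_score({"no_informed_consent": True,
                            "third_party_sharing": True})
print(score, risk_band(score))
```

Even a crude scorecard like this makes the trade-offs visible to schools and parents, which is precisely the kind of transparency the risk groupings argue is missing today.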

Tighter regulation of how children's data is collected, processed, and used by EdTech is overdue. However, regulation alone is not enough: individuals must unlearn the habit of blindly trusting technology and train themselves and their children to reflect on the inputs that digital technology and algorithms feed into their decision-making. It is only by regaining this ability to stop and reflect that we can regain our independence as human decision-makers. In that regard, behavioural data science provides valuable insights into building privacy-preserving systems and tools that, on the one hand, understand how algorithms make decisions and, on the other, adjust algorithmic behaviour to work for the benefit of people.

Use Case

EdTech Privacy Barometer

EdTech Privacy Barometer is an ongoing project that aims to develop a tool (app) helping schools and parents better understand how data is used by various EdTech platforms and applications.

Selected References

Education Today (2022). EdTech Data: How to Protect Kids.
