The influence of emotional artificial intelligence and digital toys on child development

Posted in Children's Rights, Education, Exploitation

Children’s play spaces are increasingly permeated by modern technologies. As the world transitions into the digital age, children are more frequently exposed to digital toys. In particular, recent years have seen the rise of emotional artificial intelligence (AI) toys: devices that collect data from children and adapt their functionality accordingly. This development is not without its benefits; toys have been programmed to identify developing mental health issues, condition positive behaviour and improve online learning (McStay, 2020). The concerns, however, are overwhelming.

What is emotional artificial intelligence?

Emotional AI is most commonly understood as the “use of affective computing and AI techniques to sense and ‘feel-into’ human emotional life” (McStay & Miyashita, 2020). These advanced learning technologies use biometric sensing to read and respond to children’s emotional states, collecting audio and visual data in the process (McStay & Miyashita, 2020). The trend of monitoring and reacting to child behaviour mirrors developments in adult technological devices. While these toys are not yet widespread, historical precedent suggests that they will soon become the norm.
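To make this concrete, the cycle can be pictured as: capture audio and visual signals, infer an emotional state, adapt the toy’s behaviour, and log the reading. The following is a minimal illustrative sketch only; every name in it is a hypothetical placeholder (the sources describe no real toy’s internals), and a real device would replace the stub classifier with a trained affect-recognition model.

```python
# Purely illustrative sketch of an emotional AI toy's sense-infer-react
# cycle. All names are hypothetical placeholders, not any real product.

from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class EmotionReading:
    timestamp: datetime
    emotion: str       # e.g. "happy", "sad", "angry"
    confidence: float  # classifier confidence, 0.0 to 1.0

def classify_affect(frame: bytes, audio: bytes) -> tuple[str, float]:
    """Stand-in for a trained affect-recognition model."""
    return "happy", 0.5  # placeholder output

def choose_response(emotion: str) -> str:
    """Adapt the toy's behaviour to the inferred emotion."""
    responses = {"sad": "comforting phrase", "happy": "cheerful phrase"}
    return responses.get(emotion, "neutral phrase")

def one_cycle(frame: bytes, audio: bytes, log: list[EmotionReading]) -> str:
    emotion, confidence = classify_affect(frame, audio)
    # The stored reading, not the spoken response, is the privacy concern:
    # it accumulates into a record of the child's emotional life.
    log.append(EmotionReading(datetime.now(timezone.utc), emotion, confidence))
    return choose_response(emotion)

log: list[EmotionReading] = []
print(one_cycle(b"camera-frame", b"audio-clip", log))  # -> cheerful phrase
```

Even this toy example shows how quickly a timestamped log of emotion readings accumulates; that log is precisely the data whose collection and storage the rest of this article examines.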

Toys have been evolving technologically for decades: circuit boards in the late twentieth century gave way to internet-connected toys in the 2000s, followed by image and voice recognition software built into toys in the late 2010s (McStay & Miyashita, 2020). The current generation of emotional AI toys thus builds on a long history of technology’s intrusion into children’s lives. These encroachments threaten to undermine children’s right to privacy and, perhaps most concerningly, their right to the development of their full potential under Article 29 of the United Nations Convention on the Rights of the Child (CRC). Because children are at an early stage of emotional maturity, they are particularly vulnerable to the influence of these technologies (McStay & Miyashita, 2020).

Experts in child development and educational technology have warned of the serious potential harms posed by toys designed to manipulate children’s emotions (McStay, 2020). The use of emotion-detection technologies can affect a child’s development, and the storage of children’s data requires greater regulation (McStay, 2020). Though most countries enforce data protection and privacy laws, these are overwhelmingly focused on adults (McStay, 2020). In the spirit of the CRC, there is a pressing need to protect children’s data and their right to play, and to ensure that they are not manipulated for financial gain.

Emotional AI toys in practice

Emotional AI toys typically take the form of small handheld devices, often presented as ‘buddies’ for children. Leading social robotics companies Jibo and Anki have pioneered much of the development in this space (Yao, 2017). Jibo, the company’s breakthrough toy of the same name, is described as possessing “a highly interactive and empathetic presence” (Yao, 2017). In practice, this means the toy can recognise children by their faces, learn their interests and interact with them on that basis, for instance by displaying images or news stories that match what a child likes (Yao, 2017).

Similarly, Anki’s most recent creation, Cozmo, is an affordable robot ‘buddy’ that pushes the boundary further. Cozmo learns from, adapts to and responds to children, but also presents a ‘mood’ of its own (McStay, 2021). Mimicking human behaviour, the robot can display elation and confidence, as well as sadness and anger when a situation provokes those responses (McStay, 2021). Using its cameras, the toy learns faces and names and responds to basic emotions with ‘human’ reactions of its own (McStay, 2021).
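As an illustration only (neither Jibo’s nor Cozmo’s code is public, so every name below is an assumption), the recognise-and-respond behaviour described above can be sketched as a face-to-profile lookup combined with a simple mood state:

```python
# Hypothetical sketch: linking recognised faces to learned interest
# profiles, plus a simple "mood" that shifts with events during play.

profiles: dict[str, dict] = {}  # face_id -> what the toy has learned

def remember(face_id: str, name: str, interest: str) -> None:
    """Store a recognised child's name and accumulate their interests."""
    profile = profiles.setdefault(face_id, {"name": name, "interests": []})
    profile["interests"].append(interest)

class Mood:
    """A toy 'mood' that mimics human emotional responses."""
    def __init__(self) -> None:
        self.level = 0  # below zero: sad or angry; above zero: elated

    def react(self, event: str) -> str:
        self.level += {"game_won": 1, "game_lost": -1}.get(event, 0)
        if self.level > 0:
            return "shows elation and confidence"
        if self.level < 0:
            return "shows sadness or anger"
        return "stays neutral"

remember("face-001", "Sam", "dinosaurs")
mood = Mood()
print(mood.react("game_lost"))            # -> shows sadness or anger
print(profiles["face-001"]["interests"])  # -> ['dinosaurs']
```

The point of the sketch is the persistence: the toy only appears to have a personality, but the interest profile it builds is a durable record of the child.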

History shows us that toys of this kind, such as the Tamagotchi, a popular digital pet toy from the 1990s, can create dependency issues for children, who grow to feel responsible for the toy’s well-being (Jolin, 2017). Without regulation, the dangers of such attachments will only grow with the technology.

The commercialisation of children’s right to play

As it becomes more difficult for people to secure their personal information, children are at even greater risk of having their data misused. Companies collect children’s data and share it with third parties and other corporate actors to inform future marketing strategies and product design (McStay, 2020). As a long list of media reports shows, poor data security measures have repeatedly allowed these datasets to be breached and redistributed (McStay, 2020).

As more data on children’s emotional development is collected and distributed, it becomes difficult for families and children to trace their information and protect it from abuse (McStay, 2020). This data collection also carries longer-term risks: it provides commercial enterprises with the raw material to build even more manipulative technologies in the future.

This practice creates an imbalance of power. Children are effectively used as a form of ‘free labour’ by corporations looking to exploit their vulnerability for profit (McStay, 2020). By profiling children’s development, toy companies violate a child’s inherent right to privacy and expose them to the possibility that this sensitive information will follow them into adulthood (McStay, 2020). Childhood is a challenging and vulnerable period of growth, and harm suffered during those early years can impede a child’s development into a healthy adult.

Policy implications

Emotional AI poses numerous complex questions for policymakers across the globe. If parents do not fully understand these toys, should children be playing with them? Furthermore, is it ever fair to allow children to use these toys, and to have their data collected, if they are unable to make an informed decision?

The manipulative roots of this technology, which prey on parents’ inherent desire to ensure their child’s happiness and on children’s innate vulnerability, require greater attention (McStay, 2020). Formal regulation and stricter legislative governance are required, as current legislation typically fails to protect children adequately in this regard. To remedy these policy gaps, experts have proposed several approaches.

Policymakers could designate emotion data as a specific category warranting unique protection, to guarantee that children are adequately safeguarded (McStay, 2020). This protection could take the form of a ban on the use of emotion data to inform commercial marketing and product design (McStay & Miyashita, 2020). The pervasiveness of emotional data could also justify an expansion of the right to erasure: children’s right to request that their personal data be deleted once they turn 18 (McStay, 2020). All of these approaches must be formulated in the spirit of the CRC and in line with international standards.

Recommendations on potential solutions

Technology will continue to expand into children’s lives. Hello Barbie and Amazon Echo are just two examples of the incoming generation of smart toys that are always on, monitoring child behaviour to gather data (McReynolds, 2017). While these and other toys are marketed as educational, advertising campaigns fail to inform purchasers about their potential impacts. At the commercial level, corporations, NGOs and legislators must pull together to:

  • Reevaluate the types of data collected from children and the necessity of keeping this data on record (McReynolds, 2017). Corporations could ensure that data is deleted after a fixed time period and establish clearer procedures to define which data is kept (a minimal sketch of such fixed-period retention follows this list);
  • Mandate that corporations run awareness-raising campaigns so that the public is sensitised to the risks emotional AI toys pose to a child’s privacy and development (McReynolds, 2017);
  • Enforce stricter child privacy protections to ensure toy designers can be audited, certified and held accountable for their products (McReynolds, 2017). 
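As a minimal sketch of the fixed-period retention suggested in the first point above (the 30-day window and the record layout are assumptions for illustration, not any company’s actual policy):

```python
# Illustrative fixed-period retention sweep; the window and record
# fields are assumptions, not any company's actual practice.

from datetime import datetime, timedelta, timezone

RETENTION = timedelta(days=30)  # hypothetical retention window

def sweep(records: list[dict]) -> list[dict]:
    """Keep only records younger than the retention window."""
    cutoff = datetime.now(timezone.utc) - RETENTION
    return [r for r in records if r["collected_at"] >= cutoff]

now = datetime.now(timezone.utc)
records = [
    {"collected_at": now, "kind": "emotion_reading"},
    {"collected_at": now - timedelta(days=90), "kind": "emotion_reading"},
]
print(len(sweep(records)))  # -> 1: the 90-day-old record is dropped
```

Run on a schedule, a sweep like this would make deletion the default rather than something families must request.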

There are also actions that can be undertaken at home to better protect children. Parents and guardians can:

  • Refrain from offloading parental time and tasks onto emotional AI toys which, by their nature, oversimplify human interactions and may hamper child development (McStay, 2020). Children should be encouraged to use their imagination as much as possible and to take an active role in their own play.
  • Conduct research into the workings of emotional AI toys and their potential effects. Parents should pay particular attention to the types of data these toys collect and the ways in which that information is stored and retained.
  • Ensure, as far as is possible, that children’s data is deleted periodically, to protect their privacy. 

At Humanium, we seek to raise awareness of the importance of children’s rights. Our work is possible thanks to your support! Make a donation to help us make children’s rights a reality.

Written by Vanessa Cezarita Cordeiro

References: 

Akundi, S. (2020, March 2). ‘Gauging attachment when your child’s best friend is a robot.’ 

Jolin, D. (2017, September 10). ‘Would you want a robot to be your child’s best friend?’ 

McReynolds, E., Hubbard, S., Lau, T., Saraf, A., Cakmak, M., & Roesner, F. (2017, May 6). ‘Toys that listen: A study of parents, children and Internet-connected toys.’

McStay, A., & Rosner, G. (2020). ‘Emotional AI and children: Ethics, parents, governance.’

McStay, A., & Rosner, G. (2021, March 15). ‘Emotional artificial intelligence in children’s toys and devices: Ethics, governance and practical remedies.’

McStay, A., Miyashita, H., Rosner, G., & Urquhart, L. (2020, November 11). ‘Comment on children’s rights in relation to emotional AI and the digital environment.’

Terry, Q. (2018, November 15). ‘How AI-enabled toys are molding our children.’

Turkle, S. (2017, December 7). ‘Why these friendly robots can’t be good friends to our kids.’ 

Yao, M. (2017, April 6). ‘Should your child’s best friend be a robot?’