Africa’s Ed-Tech Platforms: Protecting Children’s Right to Privacy
Keywords: Ed-Tech Platforms, Privacy, Consent, Data Protection, Education, Children's Rights, Parental Consent
Ed-tech platforms are used to create a more engaging, inclusive, and individualized learning experience, and a number of them utilize Artificial Intelligence (AI). AI-enabled learning tools and approaches have transformed the global education sector (Pedro et al., 2019). They have been recognized for enhancing the quality of learning and teaching, and AI aids both teachers and students in their lessons (Pedro et al., 2019). AI has also been lauded for its potential to improve students' knowledge and learning habits while enabling a more personalized approach to learning (Pedro et al., 2019). Children, however, are less likely to read or understand privacy policies, and they may have a limited understanding of their right to privacy and data protection. They are also more susceptible to marketing techniques that adults would recognize, and the presence of their personal information online poses potential safety and security risks. Privacy policies on Ed-tech platforms must therefore incorporate children's rights and an understanding of their right to privacy, including online protection and security measures established to safeguard children's data. The primary audience for this policy brief is corporations. While children's right to privacy is a collective responsibility of parents, legal guardians, and other individuals legally responsible for the child, in the Ed-tech space this responsibility extends to the owners who manage AI platforms, as well as to policymakers and regulators concerned with data protection and children's rights.
Copyright (c) 2022 Rachel Achieng’, Emmah Wakoli, Michelle Rodrot
This work is licensed under a Creative Commons Attribution 4.0 International License.