Data Privacy Brasil participates in UN’s OHCHR briefing on Brazil
The organization highlighted how the advance of edtech has been violating children’s privacy in the country
In May, Data Privacy Brasil Research Association submitted a joint contribution with Privacy International and InternetLab to the United Nations’ Human Rights Office (OHCHR) ahead of Brazil’s third periodic report on the implementation of the International Covenant on Civil and Political Rights, considered during the 138th session of the UN Human Rights Committee.
The submission calls attention to violations of children’s privacy and data protection rights in the use of educational technologies. It discusses the current state of technology use in Brazilian public basic education; the use of AI-powered applications and their associated risks; and the deployment of facial recognition technologies in schools and the problems it raises. The submission draws from Data Privacy Brasil’s research project AI in the classroom: models of participation for the school community.
On June 26th, Marina Meira, Head of Projects at Data Privacy Brasil, participated in the OHCHR’s formal and informal briefings on Brazil, presenting to the members of the UN’s Office the human rights violations Brazilian children have faced in the use of edtech. Data Privacy Brasil’s oral contribution to the debate was the following:
My statement regards the submission presented by Data Privacy Brasil Research Association jointly with Privacy International and InternetLab. I will highlight two ongoing processes in Brazil that have been causing severe violations of children’s privacy in the educational context, thus violating Article 17 of the ICCPR.
The first process is the spread of artificial intelligence tools in schools all over the country without any prior risk assessments, safeguards, or consideration of the fact that children are going through a developmental stage. AI edtech often relies on the mining of large amounts of personal data and is frequently associated with biases that can cause discrimination and exacerbate inequalities.
The second process is how the procurement of educational technologies has been conducted. The State, at different federative levels and especially since the start of the pandemic, has been acquiring edtech based on what is most cost-effective, without assessing whether these resources protect students’ and teachers’ privacy and human rights. Meanwhile, private companies have benefited from an undue interpretation of Brazil’s Procurement Law, as if they did not profit from the intense harvesting of student data.
Given this worrisome scenario, we recommend the UN Human Rights Committee call on Brazil to:
(i) Regulate edtech, establishing strong safeguards to protect children’s privacy and best interests;
(ii) Regulate AI, ensuring that operations involving children’s data receive special attention, so that AI-based edtech is regulated to reduce harms and to make systems more transparent and auditable;
(iii) Adhere to formal public procurement processes when awarding contracts to edtech companies, while also providing data protection training to educators and public administrators so they can evaluate the risks and benefits of edtech beyond usability.
The full submission can be accessed here.