Foundation series

Privacy Foundation report highlights vulnerability of children’s data to big tech

“They are now very integrated into the school systems,” said Auckland sociologist Dr Caroline Keen.

She had tried unsuccessfully for two years to get the authorities to take stock of the state of tech in schools, and to regulate.

But the ministry had not carried out any privacy impact assessment or analysis, the OIA responses showed.

The Privacy Foundation report said this risks creating “a surveillance environment that records children’s activities and retains their data for future and unknown use” and an “educational environment in which children are forced to use privacy-invading software”.

The report’s author, Dr Marcin Betkier, of Victoria University of Wellington, said a child’s personal data at school should not be used outside of the educational context.

Yet their initial analysis showed there was no guarantee in the companies’ terms and conditions or contracts that the data would not be used for commercial purposes.

They had questioned the ministry, the privacy commissioner, Netsafe and the association of school administrators about it and obtained limited information about the contracts.

“Nobody really checks that the software used by schools in New Zealand creates this safe educational environment,” Betkier said.

An overseas survey in May found near-constant data mining and tracking of students from primary school age to university – and often few checks on the data.

The NGO Human Rights Watch (HRW) has identified 145 edtech products making children’s personal data available to 196 third parties, mostly ad tech companies, in 49 countries (not including New Zealand).

“These products monitored or had the ability to monitor children, in most cases in secret and without the consent of the children or their parents, in many cases by collecting data.”


Australia was covered by the HRW survey, and the ABC media group carried out its own checks using the same methodology.

“The extracted data is analyzed by for-profit interests,” Keen said.

“Then it’s used to direct a child to certain courses or careers or generate a consumer profile that can then be sold to data brokers, and…we don’t really know how long that’s used and how.”

In New Zealand, calls for equity have propelled the distribution of devices, particularly in poorer schools, rapidly increasing online access but also eroding the ability to choose not to learn online.

Many companies in the wider market feed personal data to sophisticated algorithms to predict people’s behavior.

Multiple overseas studies show that educational “big data” has become widely used to predict student academic outcomes and to “advance education reform.”

Keen said her research showed New Zealanders were jaded about it.

In contrast, the US state of New Mexico sued Google – and lost – in 2020, with its lawsuit stating, “Children are being watched by one of the world’s largest data mining companies, at school”.

The United States Federal Trade Commission recently said it would crack down on any company that illegally surveils children.

The World Economic Forum said edtech is radically changing education, to the benefit of students, in a global market that is expected to exceed $500 billion by 2025.

Unicef is seeking more buy-in from data companies via pledges, for example, not to economically exploit children.

An early attempt backed by Bill Gates to amass data on US college students, InBloom, was shut down in 2014 over fears of overreach.

Research by Keen in 2020 showed that parents and students were “largely unaware of the data captured or generated about them” at school.

“Furthermore, they often believe that there are regulations to prevent the collection and misuse of personal data.”

The Privacy Foundation, relying on a series of OIA responses, said children and parents appeared to have little choice to opt out, so their consent to edtech lacked meaning.

“Data practices should not be justified by questionable consent from individuals or even institutions that cannot modify the operation of the software and can only adhere to the terms and conditions offered.”

Schools and kura lacked the “power and expertise” to vet privacy protections, he said.

“It is often unclear which laws apply to contracts between schools and kura and overseas online platforms, and whether the commitments those platforms make in overseas markets apply in New Zealand.”

Privacy law applied but did not contain specifics about online learning, critics said.

The privacy commissioner told RNZ he was “reviewing the issues…raised by the growing prevalence of e-learning”.

The commissioner did not provide links to any reports on the subject, and did not say whether he was aware of any privacy impact assessments done by anyone on the matter.

RNZ contacted Google and Microsoft for comment, but they both declined because they had not seen the foundation’s report.

What the Ministry of Education said about data security for children

The Ministry of Education said it was working to improve privacy protections for online learning.

At the end of 2021, it joined Australia’s Safer Technologies for Schools program, which assesses product security and privacy controls.

In a statement, it said it was working with the program to adapt the framework and assessment process to New Zealand’s requirements, and that a first cross-Tasman version should be available later in 2022.

The ministry told RNZ that it provides Google and Microsoft products to schools free of charge, paying the software license fee itself.

“We have reviewed Google and Microsoft against our internal privacy and sourcing policies,” Chief Digital Officer Stuart Wakefield said in a statement.

He declined an interview.

The companies both hold ISO certifications for privacy and have signed the US Student Privacy Pledge, “a voluntary but legally binding industry commitment to protect student privacy regarding the collection, maintenance, and use of students’ personal information”.

“We rely on these independently audited certifications,” Wakefield said.

The ministry had not conducted privacy impact assessments itself — schools could do their own, he said.

“We are monitoring these tools and tracking potential issues.

“We are aware that the products that schools choose to use may have non-existent or insufficient privacy policies on their websites or may use outdated software, for example Flash, or include third-party advertisements.

“MOE’s privacy, data and digital teams continue to monitor, report and create policies as needed.”

Suppliers also had to comply with the Privacy Act 2020, the Health Information Privacy Code 2020 and other codes.