By B.N. Frank
Data collection on kids must be big business; otherwise tech companies wouldn’t be fighting to continue the common yet creepy practice. Of course, it’s happening in schools too. Last month Common Sense Media warned that most Virtual Reality (VR) devices used in K-12 schools collect and sell data from students’ use, so this new report should come as no surprise.
Report: 96% of K-12 Apps Share Student Data With Third Parties
According to the nonprofit Internet Safety Labs, most ed-tech software tools share student data with third parties, in many cases without user consent, and schools should treat data privacy as an enterprise IT problem.
Given the recent influx of ed-tech tools into the classroom, many academic leaders are questioning the privacy of student data, with some wondering if the rules need an upgrade. A nonprofit organization that performs independent product safety testing conducted its own research on the privacy question, and it found potential cause for concern: a vast majority of apps used in schools share student information with third parties.
In Part 1 of its K-12 EdTech Safety Benchmark: National Findings report, the nonprofit Internet Safety Labs (ISL) estimated the figure at 96 percent. One of the report’s authors, ISL Executive Director Lisa LeVasseur, said in an interview with Government Technology that the organization conducted preliminary research a year ago and saw some red flags, which led to the formal research on student safety commencing earlier this year.
“I actually kind of think I wanted to be wrong about that earlier research,” she said.
The nine-person ISL team, funded by the Internet Society Foundation, looked into 13 schools in each state as well as Washington, D.C., viewing information from 663 schools in all — 12 public and one private per state, in urban and rural areas, and evenly divided by grade, covering an estimated 455,882 students.
The study found the schools collectively recommended 1,722 different apps, and ISL tested 1,357 of those. All told, the researchers collected 88,000 data points on the apps and more than 29,000 data points on the schools, the release said.
“There was a dynamic there, where it looked like educators felt more was better,” LeVasseur said. “I don’t think that has much to do with COVID. I think that has to do with, just a propensity of ‘more is better,’ like if we give the kids more stuff, that’s better for them.”
The ISL report said that while schools are trying to be helpful by providing more technology — an average of 125 technologies per school, or an average of 172 among schools that had some kind of vetting process — more is not better, given the poor scores of the apps in their research. Among the 96 percent of apps that share data with third parties, 78 percent of the sharing was with advertising and data analytics entities, often without user consent, the release said. LeVasseur said this most often occurred when schools used custom apps.
“These community engagement platforms, these white-labeled, like school district, customized apps — usually from Apptology, Blackboard and SchoolinfoApp — there’s a whole bunch of these companies that make rebrandable platform mobile apps, and schools buy them,” she said. “Those were among the least safe apps. We saw those things were sharing to a lot of ad tech. That is not how it should be. If there’s one category that should be safer than any of them, it’s probably that one, and that was a disturbing finding.”
From its research, ISL found that 28 percent of apps were not education-specific — for example, YouTube and news sites — and had essentially no protections for children. The study found that 23 percent of school apps risk sharing kids’ data with ad agencies, with no way of knowing where it is sent. It also found that Google, a company that dominates the K-12 ed-tech space with tools such as Google Classroom, Docs and Sheets, receives data on its advertising platform from about 68 percent of the apps. LeVasseur said schools need to commit resources to the vetting process.
“Schools really need enterprise IT. This is for real enterprise IT, both cybersecurity and privacy. And I don’t know how well schools and districts are set up … [for] technology vetting,” she said. “So I would say that you really need to start pushing hard on those vendors, and get more information about what kind of data sharing those apps are doing, what kind of advertising is in those apps. It’s really about more supply chain management that maybe schools are not well equipped to navigate [that].”
LeVasseur said ISL expects to release benchmark reports every few years, with this initial one serving as a baseline. It is the first of four reports, with future installments set to address regulations, certifications and best practices, among other topics, she said. The hope is that schools will take the proper steps in vetting, she said, noting that ISL offers safety testing on all tech products.
Giovanni Albanese Jr. is a staff writer for the Center for Digital Education. He has covered business, politics, breaking news and professional soccer over his more than 15-year reporting career. He has a bachelor’s degree in journalism from Salem State University in Massachusetts.
It’s worth mentioning that for years American tech insiders (aka “Silicon Valley parents”) have been sending their kids to low-tech and no-tech schools, making their nannies sign “no screens” contracts, and spying on their nannies to make sure they aren’t breaking those contracts (see 1, 2, 3, 4). Of course, research continues to prove that kids’ use of smartphones and other digital devices isn’t good for their behavioral, emotional, mental, and physical health (see 1, 2, 3, 4). Yet American public schools continue to push tech. Doesn’t make sense, does it?
Activist Post reports regularly about privacy-invasive and unsafe technologies. For more information, visit our archives.