When it comes to the “data and privacy issues around EdTech,” I am regularly asked: “So what? Why does this matter? Why should I care?”
Here’s an FAQ that hopefully answers these questions and more:
Q: How are children using “EdTech” tools and products at school?
A: “EdTech,” or “educational technology,” can mean anything from 1:1 district-issued Chromebooks and online curricula to games or apps used for “learning,” “learning management systems” (like Schoology), online grading portals, digital surveillance or “safety” tools (like Securly or GoGuardian), online mental health assessments, and online state testing. Broadly speaking, the digital products children use at or for school are owned by any number of technology companies, not the schools themselves. It’s also not just a few platforms or tools; it’s hundreds. Per school. How these tools are used varies widely across schools, grades, subjects, and teachers: some teachers use them at a bare minimum, while other classes use them every period, every day.
Q: What should I worry about as a parent when it comes to EdTech tools?
A: This is deserving of its own essay, but there are (at least) two big concerns:
First, such tools are rarely, if ever, more effective than an analog or teacher-led approach. See the UNESCO GEM Report for more on that. It is also difficult to find independently funded research on EdTech; most of it is funded by the companies that benefit from the use of these products. Internet Safety Labs is one good source for independent research on EdTech platforms and tools.
But second, and arguably more important: even if these products and tools were “beneficial” to learning and teaching (and how to measure “benefit” and “success” is itself a debate), nearly all of these EdTech companies collect data about the children using their products and share or sell it to third parties for profit. In fact, Internet Safety Labs found that 96% of EdTech tools share children’s personal information with third parties.
Q: Wait. Why would an EdTech company want to collect my child’s data?
A: In the digital age, data is the currency. This is how “free” social media apps work: we aren’t paying to use the service; we are giving Meta and Snap information about ourselves as consumers, which they collect and repackage to target us with advertisements. (You’ve probably noticed how an innocent Amazon search on your phone turns into similarly themed ads in your Instagram feed.)
Something similar is going on with EdTech products. While they may not monetize data in exactly the same way Meta and Snap do, they still monetize children’s data. Because the apps, digital curricula, and “learning management systems” children use at or for school are owned by technology companies, “time on device” is the measure by which those companies judge a product’s success. These are private companies with shareholders who want to see profits, and the profits come from licensing agreements with school districts (often in the millions of dollars) and from the data they collect about their customers: in this case, our children.
And children’s data is extremely valuable. The United States Department of Education found that a single student data record is worth between $250 and $350 on the black market. There are nearly 50 million school-aged children in the United States, which is why the EdTech industry is a huge and lucrative market: it is worth roughly $150 billion today and is projected to exceed $800 billion within the next decade (aggregate global spending on hardware, software, and cloud services).
Q: What kind of data do they collect?
A: Here is a non-exhaustive list of what EdTech companies may collect about our children:
Enrollment data
Course schedule data
Student identifiers
Academic program membership
Extracurricular program membership
Transcript data
Student-submitted materials within a course
Student grades
Student assessments
Student address
Student email address
Phone numbers
Student name
Student date of birth
Parent or guardian name
Student conduct data
Student behavior data
Student social-emotional learning indicators and inputs
Student evaluation and management data
Physical and mental disabilities
Immunization records
Treatment providers
Allergies
Email delivery preferences
Communications with stakeholders, such as parents
Usage data
Browser data
Device-specific data, including data collected or used for:
Data security
Cloud hosting
Information technology
Customer support
Usage and analytics
Application performance monitoring
User identity
User authentication
Feature integration with third-party products or companies
Q: But isn’t all of this anonymous?
A: Unfortunately, no. Schools may be told by EdTech companies that the data is “anonymized,” but given how much data is collected, it is not difficult to reverse-engineer it and identify individual students. There are even “identity-resolution companies” whose entire business is to collect information about users, re-identify the data, and sell it.
Q: I know I signed a User Agreement or agreed to some Terms and Conditions when the school gave my child a laptop. Doesn’t that mean I consented to all of this?
A: No. It is extremely unlikely you were fully informed of all these potential risks or of the ways these companies would use your child’s data. Most schools are not fully informed of the risks themselves. EdTech companies would like you to think that you have consented, but legally speaking this is not “informed consent,” and it would not hold up in a courtroom.
If you are concerned, please contact Andrew and Julie Liddell of the EdTech Law Center. They are a consumer protection firm that will speak to you free of charge to help you understand your rights.
Q: What are some consequences of these companies having my child’s data? How do they use this data in ways that might harm children?
A: Aside from the fact that a vast trove of your child’s personal information is now on the internet, there are numerous ways this can harm a child.
Web searches about controversial topics on a school-issued device, even for a legitimate school assignment, can flag a student as interested in extremist behavior, conspiracy theories, and more.
The information students search, send, or type on digital platforms can later be used against them in college admissions.
Law enforcement agencies have used student data to flag “at-risk” children; for example, agencies can use student behavioral records to try to predict future criminal behavior. The implications of police departments having access to this kind of data are enormous.
The data collected about children can also be used to deliver targeted ads that can contain dangerous messaging about self-harm, health misinformation, or extremist behaviors.
Data breaches are also common. PowerSchool, Inc., a $5 billion EdTech company owned by Bain Capital, was recently the target of a massive data breach in which over forty years of data on more than 70 million people were exposed.
EdTech companies also market their “cradle to career” unified data systems to state and local governments as tools for workforce planning, juvenile justice, and law enforcement.
Q: So what can parents do about this?
A: The first thing parents can do is understand that this is a problem. Many schools and parents are just now coming around to the fact that student personal devices (like phones and smartwatches) should not be available to students during class time. This is absolutely the right thing to do in terms of what is best for children and learning.
However, if all personal phones and watches are banned but schools still hand students iPads or Chromebooks, children are still engaging with harmful, manipulative, and addictive tools and platforms whose business model depends entirely on “time on device” and “user engagement.” Children are clever: they may no longer have their smartphones, but they can still chat within Google Docs, watch TikTok videos on Pinterest, download VPNs to bypass school filters, and watch YouTube videos (often unblocked and available to students in the Google Classroom suite of products; Google owns YouTube).
The second thing parents can do is share this information with other parents. Print out or email this essay to your parenting peers. There is a clear need for collective action in pushing back on EdTech, and while these tools and products are harmful to learning and teaching, the privacy and data security issues are far more consequential and far less understood.
Q: Can I refuse? Opt out of EdTech?
A: Short answer: yes. Is it always easy? No. Schools will say that opting out is not possible, that children will fall behind, that they cannot accommodate the request, and so on. A school that claims refusal or opting out is not allowed is essentially asking families to choose between their child’s education and their child’s privacy and data. That is when you reach out to the EdTech Law Center for support.
Q: How do I opt out?
A: I have created an UnPlug EdTech Toolkit that contains all of the above information, plus a series of letter templates and resources that will support you in this process.
Q: Have you (Emily) opted out of EdTech? How is that going?
A: Yes. My daughter is a 7th grader this year. Last year, she used the school-issued laptop. While she didn’t have a lot of homework on it, she was complaining almost daily of headaches after school. We got her vision checked, got her reading glasses, and the problem persisted. This year, we decided to walk our walk and talk our talk and refused the laptop altogether. Lo and behold, the headaches are almost completely nonexistent. We are fortunate to have the support of an excellent principal. We started the conversation with her the spring before 7th grade started and focused on building a good working relationship with school leadership. When my daughter does need to use a computer for typing an essay, for example, her English teacher has a spare one she can use. We have also opted her out of all standardized tests and online mental health assessments.
This is far from ideal, however. She still has an online presence in the learning management system, her science curriculum is still all-digital and has resulted in her using the loaner computer more than I would like, and we recently learned that the district is advising writing teachers to use AI to assess student papers. Our plan for next year is to continue the opt-out and have a more explicit conversation about the online science curriculum.
It should be noted that my daughter is thriving without the computer. She has very good grades, she is an active class participant, and she seems mostly unbothered by the fact that she is the only student without a laptop.
Concluding Thoughts:
I am not anti-technology; I am tech-intentional™. I see a desperate need for technology education in schools (and at home): teaching children skills like typing (shockingly, many students with 1:1 devices don’t know how to type), media literacy, the ability to identify mis- and disinformation, and an understanding of what AI is and why it is still so risky. But “EdTech” is not “Tech Ed.” What is currently happening in K-12 schools is mostly EdTech (or the “Tech Ed” provided is curricula delivered by the same companies that benefit from kids using the technology without understanding it).
We have a lot of work to do, but I believe that the future of schools is tech-intentional, and there are things we can do to get there. In fact, I have an entire talk on this very topic, which I delivered in Louisville, KY, at the Filson Historical Society in March 2025. You can view the talk here.
At the end of the day, I see three very important questions that need to be answered with a resounding “YES” in order for EdTech to continue in schools:
Is it effective?
Is it safe?
Is it legal?
Currently, the answer to all three of these questions is NO.
For more on this topic, also be sure to check out these essays I’ve written: