Four Fallacies of EdTech
Friction Required
Jul 9, 2025



Learning should be hard. That's the whole point. Friction and struggle lead to insight and knowledge, and humans persevere through struggle best in the context of trusted human relationships.
But when AI-infused EdTech tools remove friction, they also remove learning. Such products are fundamentally at odds with educational goals.
Some will argue that the American public education system exists to indoctrinate a future workforce– that the purpose of education is only to turn out future workers. I disagree, and I’m not going to debate that here; if that’s what you believe, this essay isn’t for you.
But if you believe, as I do, that the purpose of education– and public education in particular– is to prepare children to become thinking, feeling members of society for whatever future they will inherit, a preparation achieved by learning about their past while building the skills of the future (the skills children have always needed to be successful, like empathy, critical thinking, and communication), then this essay is for you.
Many EdTech companies are pitching “AI solutions” to address the challenges facing schools and educators– challenges like oversized classrooms, overworked teachers, overwhelmed parents, and overstretched administrators. None of these challenges are new, of course, but when industry has a product to market, as it does with AI tools, it looks for a vulnerable audience, and resource-strapped schools are an easy target. (The “low cost” of AI tools for schools comes with a high cost to student data and privacy– a devil’s bargain.)
As one example, last year PowerSchool, the largest EdTech provider in North America and parent company of platforms like Schoology and Naviance, valued at over $5 billion, introduced PowerBuddy: an AI bot that “delivers personalized insights, fosters engagement, and creates a supportive environment for everyone at every step in their educational journey.”
PowerSchool makes several sweeping claims about what PowerBuddy can do.
Imagine Each Student has a Digital Helper With PowerBuddy. Students have a personalized conversational AI guide to assist them as they learn!
Now Every Teacher has the Perfect Assistant. Giving teachers time back, PowerBuddy helps them quickly develop personalized and effective lesson plans and assessments.
Parents Can Get Involved and Stay Involved. With PowerBuddy, families have a reliable connection with their child's teacher and school.
Every Administrator has Super Powers. Administrators can depend on PowerBuddy to access, interpret, and share key information with stakeholders, ensuring transparency and empowering collaboration.
Every Student is Ready for College, Career, and Life. PowerBuddy makes career exploration and post-secondary planning simple, personalized, and accessible for more students.
IT and Data Analysts Can Work Faster. Leverage the power of conversational AI for faster data mining and statistical analysis. With PowerBuddy, you can essentially talk to your data!
I’ll be the first to stand up and say that schools, administrators, teachers, and parents need more support.
But the support they and we need does not look like this.
Let me start with the obvious: the business structure of EdTech and Big Tech is the same, their leadership models are the same, and their profit margins are the same– enormous. In other words, EdTech is just Big Tech in a sweater vest. PowerSchool and its brethren may work with schools and claim to be in the business of “transforming” education, but their cloaked altruism is misleading, dystopic, and self-serving. (This has always troubled me: Why do technologists get to be the experts on teaching and learning? Isn’t that what educators are for?)
There is a role for technology in the classroom– a “tech-intentional” one, in fact. Students need to learn to type, research, and identify sources of information. They need to understand how computers, the internet, and AI products work. Some need higher-level math modeling tools or video-based content to illustrate abstract concepts. But the problem with the current model is the presumption that because we have the technological capability, all learning can or should take place digitally. This is, of course, nonsense. High-level math modeling for high school seniors is not the same as teaching kindergartners how to read via a digital app. We don’t need to give kindergartners tablets to ensure they know how to use technology safely or purposefully by the time they graduate from high school. This isn’t hard– we already teach teenagers about substance abuse without providing them access to drugs, and about safe sex and consent without requiring them to engage in it.
Similarly, TechEd is not EdTech.
We can teach children to type on non-internet computers (or even just keyboards or typewriters, for that matter). We can educate students about quality research skills, assessing studies for bias or problematic funding, and creating citations without providing them the internet to do so. (How will children know a citation is bad or false if they haven’t first been taught what it means to generate an accurate one? How can they discern “truth” if they lack the critical thinking skills to differentiate between fact and fiction?)
Arguing that children “need” internet-connected digital technologies like AI to “be prepared” for the future is backwards and inconsistent with both child development and how humans learn. We don’t teach babies to walk before they crawl; we don’t teach children to read before they know their letters; we don’t expect engineers to code without also knowing how to communicate their ideas or hold a conversation. Some skills, such as critical thinking or empathy, cannot be taught via a screen; these skills must come before we give children access to technology. I beat this drum over and over for a reason: Skills before screens. Learning happens in the context of relationships.
Redefining and prioritizing what skills need to be taught before screens are introduced in schools is a topic worthy of consideration. But it also means insisting that EdTech companies rework their business model, something they are very unlikely to do on their own because the current model is simply too lucrative. Instead, these companies insist that their technology-based solutions are the “only” solutions to the educational problems we face, and they prey on schools, educators, and students alike to achieve their business goals.
When it comes to offering EdTech platforms as the solution to the problems that currently exist in education, EdTech companies make four erroneous assumptions:
Assumption #1: The answer to teacher burnout is more tech in the classroom.
Unequivocally, teacher burnout is real, and oversized classes contribute to it. Teachers contact me regularly to say they are either giving in to the tech takeover because they feel they don’t have a choice, or they are leaving the profession altogether. Savvy EdTech companies know teachers are leaving and that schools need new strategies to recruit, retain, and reduce burnout. For a classroom with 30+ students, such arguments go, the solution is an AI tool that will allow teachers to differentiate instruction (i.e., meet each student at their individual learning level), create engaging lessons, save time, and personalize learning for each student.
The problem is that such companies assume teachers will always have 30+ students in a class– one of the very conditions driving teacher burnout. There is widely documented evidence that smaller class sizes reduce teacher stress and increase student engagement. Unfortunately, there isn’t a tech-based solution to reducing class size, so the “solution” offered by EdTech is more tech-based tools to “solve” the issue.
PowerSchool’s own marketing language argues that “it is impossible for teachers to [meet the needs of 30 students] without AI.” As a former teacher myself, I find this insulting. Teachers do this all the time, every day, at every age group. They make it work, because they have to. Of course it is possible and common– but that doesn’t mean it is sustainable. The answer isn’t to throw an AI bot at the teacher (inviting them to spend more time behind a computer screen– I can’t imagine how much time it would take to read AI-generated reports on each of those 30 students after a day of school; how does that make anyone a better teacher?), but to reduce class size.
Unfortunately, EdTech platforms and AI can’t help with that.
Assumption #2: Learning should be frictionless. “Efficient” = “better”.
Before I expand on this one, please remember: the very same people (mostly men) claiming that our children “need” these technological tools and products now so they can be “successful” in the future had mostly analog childhoods, did not have AI tools themselves as children in school, and send their own children to low- and no-tech schools. They were educated through the very friction-based struggles they are removing from our own children’s experiences.
One of the great claims about modern digital technology is that it will make our lives so much easier. In a few ways, perhaps this has been the case, though “easier” and “better” might be defined differently by different people. Over the same period, however, we’ve also seen increased rates of loneliness, even among adults; worsening mental health, especially among young people; and increased time spent on devices by both adults and children.
EdTech companies like to talk about the “Goldilocks” problem– that some material is too easy for some students and some is too hard. Fortunately, they posit, we can address this problem by letting AI “get it right” for students at every level. Again, this is not a new problem. Any experienced teacher will tell you that children are not standardized and never have been; teachers have always had to work to meet different learning levels and needs. That’s what makes a good teacher a good teacher. This is called differentiation, and it was around as a teaching strategy for decades before AI and EdTech hijacked the term. Of course, as class sizes have increased and testing standards have dictated curricula, teachers today have much less time, energy, or incentive to do this. As a result, we get products like AI “assistants” that claim to do it for teachers so they don’t have to.
It is possible that technological tools could help a teacher differentiate learning levels in a standardized way, since they are trained on the very standards that drive testing in the first place and are “in alignment” with district curriculum. But that isn’t necessarily a good thing, because the generated material may not be inspiring, interesting, or even equitable. It also assumes that learning is linear, chronological, predictable, and tidy, which, as any teacher of any student knows, is utter nonsense. An effective teacher creates an environment where learning is engaging and fun, and that happens through the relationships the teacher builds with students.
If we’re giving children AI-enhanced “learning tools,” we are making some serious assumptions about what motivates children to “learn.” Arguing that students will willingly click “challenge me” in a learning platform assumes that 1) students are intrinsically seeking a challenge and 2) the type of challenge offered will be what they need. In fact, even among gifted and talented students, seeking out increased challenge is rare. It is in our nature to choose the path of least resistance.
For children raised on so much frictionless tech, where anything they want is available with the tap of a screen, why would they ever click an option that leads to more work? They’ve spent their young lives being shown how tech makes life easier, faster, and better. That’s why we can’t be at all surprised that children are using ChatGPT and other forms of GenAI to do their homework and write their essays. Of course they will. It would be shocking to expect any other outcome from providing them access to a homework machine that is far faster than a Google search.
Developmentally, children seek challenge when they are inspired or motivated in the context of a relationship with other humans– not because an AI bot asked if they wanted it.
Assumption #3: AI chatbots will make kids feel good.
There is increasing evidence that children who engage with chatbots are at risk of serious harm. While “educational” chatbots might claim to differentiate themselves from social or mental-health-support AI, if they are built on the same foundation models, the risks to children are the same.
When I first drafted this essay over a year ago, I clipped this quote from PowerSchool’s explanation of their “Caring” approach to using PowerBuddy:
“PowerBuddy is built to support your community, fully. That's because we care about how kids, families, and educators are doing as much as how to get them the information needed. We do this by nurturing a 'no wrong question' environment that welcomes anyone and any what's, how's, and why's.”
Note the altruistic tone of this claim– PowerSchool cares! They are judgment-free! They are inclusive! Who wouldn’t want that?
Apparently, PowerSchool. Today, as I write this in July 2025, I can no longer find the quote to hyperlink.
Instead, I see a new claim that:
“PowerBuddy provides students with a conversational AI tutor tailored to their individual learning needs and grade level. PowerBuddy gives students agency and ownership of their learning by letting them choose the type of help they need, whenever they need it, to ensure an equitable learning experience.”
Removing the “care” element from their own marketing materials reveals an interesting change in strategy. There is no question, of course, that children (and adults) learn better in the context of human relationships. Children build stronger social skills when they play with real children in the physical world. Teachers learn students’ personality quirks and passions when they engage them in conversation or observe them throughout the day. Children learn to trust the adult who takes the time to get to know them and their interests. When they care for and trust their teacher, they are inspired and motivated to learn. When children have ample opportunities to play with their peers, they learn critical social and communication skills– skills that are absolutely essential for future success in any career (yes, even technological ones).
Chatbots, companies claim, won’t ever make a child feel judged; will create a fun learning environment; and will only ever offer “district-approved content”– no worries about encountering inappropriate content online. Within this context, chatbots will help kids who don’t have at-home support do their homework. How? Because “PowerBuddy knows everything about me”– it knows what assignments in Schoology I’ve completed, what text I’ve read (online, of course), and how long I spent on earlier tasks. Then, taking all that data, it will “help” me with my homework.
PowerSchool notes that “1 in 5 students don’t have access to a school counselor” in America right now. That’s a horrifying statistic. Yet their solution isn’t to recommend schools use funds to hire more counselors; it’s to pay for an AI bot to operate in lieu of a counselor. We’ve heard devastating accounts of chatbots inducing young people to die by suicide and of adults experiencing severe mental health spirals, because chatbots constantly validate and reinforce our thinking, no matter how delusional it gets.
There is no question there is a youth mental health crisis occurring in America right now. But to offer up an AI bot as a “buddy” to help children through these troubling times is deeply dystopic. What kids need now more than ever are supportive real-life, real-world relationships. Children cannot learn when stressed; learning occurs in the context of a caring relationship. It doesn’t matter how savvy AI chatbots may be at “helping” them with homework.
If a child is stressed or lonely, homework isn’t the problem.
Assumption #4: Profiting off student data is OK if it’s “helping” kids learn.
We come to what I see as the most dystopian part of AI as the “solution” for problems in education: even if it works (it doesn’t), it comes at a very steep cost– student data privacy. EdTech companies do not widely advertise their methodologies, as doing so would (rightfully) raise serious concerns, but those of us who have been following along for a while know that many of these companies are worth billions of dollars not because of how helpful they are, but because of how much data they collect and then sell.
Such companies talk about the importance of “protecting” data inside a “walled garden” that safeguards student privacy. But these same companies freely admit that tools like AI chatbots are most effective when they have access to large amounts of data. The more data there is on each individual user, the more companies can piece together the information that will help teachers personalize it all. That makes sense– you can’t personalize something without having personal information.
But that personal information includes everything from test scores to behavior records to special education reports. Data makes AI powerful and “smart.” The more data we provide it, the “smarter” it becomes (and this is not the same thing as “intelligence”). But, of course, if this is true, then such companies cannot also claim that “student data privacy is of the utmost importance.”
PowerSchool is by no means the only EdTech company to profit off student data. In this regard, most EdTech companies operate very much like Meta and Instagram, which also profit off user data (EdTech is just Big Tech in a sweater vest, as I say). The difference is that EdTech’s business model, cloaked in altruism, profits off of non-consenting minors rather than (theoretically) consenting adults.
The fundamental problem with EdTech platforms (outside of their business models) is their commitment to false assumptions that generate fear, prey on schools and children, and invalidate and insult the expertise, wisdom, and necessity of actual real-life educators.
I am not anti-technology. Indeed, I know there is a role for it, but what is currently being offered as a solution is absolutely not what is good for children, teaching, and learning. So what is the solution?
The answer to the challenges facing schools today is not more technology. It’s about being tech-intentional™.
It means schools adopting the Four Norms of EdTech and becoming tech-intentional certified schools.
It means using technology in alignment with what we know about child development.
It means intentionally integrating technology in a way that builds on the skills children need first (like executive function) before offering them digital tools.
It means emphasizing play as a critical form of learning and ensuring that children get access to pencil and paper before tablets and tech.
It means resisting, delaying, and limiting any form of tech that interferes with healthy mental, physical, cognitive, or emotional development.
It means countering the assumptions above by implementing the evidence-based solutions we know do work for improving education: reduced class sizes, decreased emphasis on standardized tests, decreased time on screens, more time in play and real-life relationships with children and adults, better pay for teachers, more school counselors, and far fewer contractual relationships with EdTech companies that mine children’s data.
It’s an unpopular stance to take in the face of so much pressure from such powerful companies with huge marketing budgets.
But it’s the right thing to do.
We have recently launched a membership platform to support parents, educators, and school administrators in building tech-intentional™ communities. You can learn more here.