Nov 6, 2024
In February of 2024, PowerSchool, the parent company of the hugely popular learning management system Schoology (among others), hosted a webinar about a new classroom tool they are rolling out called PowerBuddy. PowerSchool describes “PowerBuddy for Learning” as:
“...A personal assistant for teaching and learning. PowerBuddy makes educators’ lives easier by helping them easily create high-quality assignments and instructional content. Students benefit from an always-available personalized assistant to support them in the way they choose to learn.”
Forget that there are thousands of teacher resources already available online (and offline), that a regular old Google search can often turn up what teachers might quickly need, or that smaller class sizes might be a more effective way to help teachers develop rich and varied content, because according to Shivani Stumpf, Chief Product and Innovation Officer of PowerSchool, it is “impossible” for teachers to differentiate without AI.
Of course, PowerSchool and a myriad of other EdTech companies have a vested interest in AI. The global EdTech market was projected to surpass $400 billion annually by 2024, and 36 EdTech unicorns already exist. AI is the next frontier of digitizing everything we do, and with all the EdTech tools currently in use at schools, classrooms are the next proving ground for AI, even when it is far from ready for primetime.
Children, though naturally curious, need foundational skills to build knowledge. AI assisting them in “the way they choose to learn” identifies one of the key problems with AI for children. It is the role of a trained educator to determine not only content and curriculum, but also pedagogy– “how” content should be taught. A child might have preferences or ideas, but they do not have a fundamental understanding of brain development or skill acquisition. An expert teacher does.
In my view, AI has no business being in the classroom in its current form, in part because AI builds on data that already exists– it scans our feeds, our internet search histories, or entire digital texts to offer us “answers.” This can certainly be entertaining, but AI is trained on data sets that include biases, misinformation, and inaccuracies, which means it can produce the “wrong” answers too. A child lacking foundational knowledge or higher order thinking skills will not know the difference between fact and fiction.
I believe the issue for schools and education today is less about AI in particular and more about the flood of EdTech in general, of which AI is a growing piece.
Because for the most part, the news is not good. (This video by my colleague, Dr. Jared Cooney Horvath, is an excellent look at why the EdTech Revolution Has Failed.) Setting aside the industry-funded research behind industry-funded products claiming that “EdTech is good for learning” (it’s not), we can also just listen to the teachers who express concern about the creep of technology into the classroom. I hear over and over again that technology has ruined teaching and learning, and that the “skills” children are acquiring are not the ones needed for success in adulthood or in future decision making.
What do parents, teachers, and concerned citizens need to know about AI and education?
First, we must be deeply skeptical of any argument that suggests that AI can fix what’s wrong with education or can individualize learning to such an extent that teachers can be relegated to “guide on the side.” We’ve been experimenting with technology in the classroom for a long time, investing billions of dollars a year in software and hardware, and yet the educational returns we might expect from such a financial investment simply aren’t there.
We have to start by asking: What makes an experience “stick”? In other words, in what ways do children retain the skills and knowledge they are taught in a classroom, then access those skills or that knowledge later to make meaning from their experiences?
While technology can provide us with information and facts, AI and other forms of educational technology have yet to replace the single most important component of learning: the interpersonal relationship between teacher and student.
What matters most is not the information being imparted, nor the medium in which it is imparted, but the connection between the one imparting the knowledge and the one receiving it.
Artificial intelligence does not yet possess the capability to offer the most human of experiences– empathy– nor the nuance to connect with a child in such a way that the magic “aha moment” of learning occurs. AI can mimic human conversation, but it cannot convey higher order cognitive skills (like critical thinking) because those are rooted in foundational knowledge. When children feel connected, supported, or inspired, knowledge is not only imparted but also reinforced and held in long-term memory– in other words, they have learned something.
What to say to schools when they announce AI is coming or will be used
First, it’s important to ask whether students are going to be taught about AI or use AI. Those are two very different things. I do believe there is an argument to be made to teach children about digital technologies like AI, but in order to do that, ironically, children do not need digital tech. Technology education (TechEd) is not educational technology (EdTech). We can and should talk about the risks and harms associated with a nascent technology that is far from ready for use by children with underdeveloped brains; we do not need to use AI to teach that.
Second, as my tech-intentional framework emphasizes, we must ensure that schools are imparting "skills before screens." What does a teacher hope the learning outcomes of teaching about (or on) AI will be? Is there prior knowledge required for children to better understand why misinformation via AI is dangerous (yes) and how to discern what’s true from what’s false (yes)?
Third, the argument that “children must learn AI now to be prepared for the future” is a falsehood. Not only is technology changing so fast that what kids learn today will be irrelevant in a year or so, but context matters and learning is interpersonal. Information (what AI can impart) is not the same thing as knowledge (where meaning is made). It is in relationships with human adults that meaning about information becomes knowledge. AI may be information-heavy, but it is far from knowledge-rich.
Fourth, AI is not safe for kids. Period. This story is making the rounds right now, and while it is one tragic extreme, this is an area where we must tread very, very carefully with underdeveloped brains and hearts. PowerSchool’s "AI chatbot PowerBuddy" is coming to schools soon so students can "have a personalized conversational AI guide to assist them as they learn!" But none of this has been tested, even on adults, and certainly not on still-developing children. We don’t need more tech; we need more teachers.
Finally, any "research" about the benefits of AI specifically and EdTech broadly must be independently funded. Reports and studies that tout the benefits of AI are likely funded by companies with a vested interest in its success, and are very unlikely to be connected to what is actually good for children or learning.
All of this culminates in my belief that schools must continue to be a place where children go to develop contextual knowledge under the tutelage of a caring adult professional. Higher order cognitive skills, such as creativity, critical thinking, and empathy, come from layers of relational experience, and those skills are imperative to the future functioning of our society.
It is time to shift the burden of proof back onto the technology companies. Rather than adopting tech in the hopes that it might support learning, schools should only adopt tech when independent experts can prove that its use has an unequivocally positive impact on learning, protects students’ data and privacy, and aligns with child development.
At the end of the day, the most important thing teachers can do for students around AI is to provide a learning environment where they build the higher order cognitive skills that lead them to make connections about the potential harms that can come from tools like AI– misinformation, hate speech, and the like– harms that deeply impact structural institutions like democracy. It may also mean resisting or refusing the excess of EdTech tools and platforms that currently exist and demanding a return to foundational teaching and learning that, paradoxically, will allow children to think critically about the tech they will encounter in the future– to understand how to use it, not be used by it.
Getting kids to think critically about AI is the most useful form of AI in the classroom.
P.S.
These blog posts I wrote might be helpful too!
The Problem With AI in Education Isn’t What You Think
What’s Getting Lost in the AI Shuffle: The Case for Teachers. Real Ones.