
May 14, 2024

What’s Getting Lost in the AI Shuffle

The Case for Teachers. Real Ones.


If you feel like you’ve been hearing a lot about new AI technologies lately, you’re not alone. Social media and news feeds are filled with breathtaking claims, quippy marketing language, and eye-catching videos–all in an effort to get us on board with the latest and greatest technological marvel to arrive on the scene.


The new AI scenarios being demoed focus on tangible, real-world situations, which has long been the goal of advanced technologies, especially AI: finding specific ways to use generative AI to perform tasks we find annoying.


For example, Google is introducing a virtual assistant that can scan your email account, find the messages with receipts, extract them, create a spreadsheet, and categorize the totals. It’s easy to see the time-saving benefits of such a tool, assuming it doesn’t make errors (that’s a different essay). This example, and the others floating around the internet right now highlighting the benefits of evolving generative AI, have something in common: the goal of completing a task with a clearly defined beginning, middle, and end. In other words, “Perform X job.” Cool and convenient, yes, but very defined.


Just as with every other new technology introduced throughout history, there will be hand-wringers and tin-hatters, naysayers and doomsayers today, too.


In fact, it’s a sign of our humanity that we’re seeing varied reactions. 


People aren’t all the same. We have different ideas about how the world should operate and about which technologies can help us do the things that are important to us. Creating new technologies to support our efforts isn’t unprecedented—decades ago, the calculator took over complex manual calculations and freed up human minds to do other things. This is not a new pattern—we’ve been doing it all along.


But when we apply generative AI and advancing technologies to education as a whole, we encounter a much more complex problem. 


Up until now, the fundamental strategy for humans and tech has been to “program tech to do things we already know how to do, but do them faster and/or more reliably.” But the human brain cannot (at this point in history, at least) work “faster” – our biology is our limitation.


With today’s AI technologies, we still have to identify what we want the new tech or AI to do and tell it to do that thing. To have a Google AI Assistant categorize receipts, we first have to know how to run a business, how to balance the books, how to understand our finances, what our budgeting goals are, and how to ask for help. Then, we have to know what to ask the AI to do.

Unsurprisingly, tech companies are dreaming up ways to use AI to “teach” kids. Just this week we’ve seen videos from OpenAI and Google showing AI “assistants” explaining math concepts, responding to questions, and summarizing lectures. School districts are investing in chatbot “buddies” to help students with their schoolwork.


These might seem like tremendous advancements for learning, but I fear they come with serious risks that technologists do not see, or are not willing to. (It’s a lucrative industry; tech companies are profit-driven. That is a math equation we can do without a calculator.)


And it is no secret that education in the United States is in crisis. Public schools face falling enrollment, record budget deficits, and worsening outcomes. As a result, teachers are overwhelmed (and underpaid, IMHO). 


Understandably, even if short-sightedly, schools are looking for quick fixes to these problems. When we wave the pixie dust of “AI” and “EdTech” in front of them, many districts conclude that they can make their shortages a thing of the past simply by hiring “AI assistants” to help kids learn. Such technological support will let districts hire fewer teachers (thus “saving” money), cut operating costs, and magically (and “quickly”!) improve education for kids.


We cannot allow our schools to go down this path.


“Education” is not a technical problem that can be solved with a technical solution. There are aspects of education where a technical solution might make things easier (like creating student schedules for a school of 1,000 kids, for example). But the act of learning isn’t something we can make better or more convenient simply by speeding up the process. We can’t just displace teachers and hope children “absorb” information digitally instead. 


Districts that abandon teachers in favor of tech will still pay millions of dollars in renewal fees and licensing. The argument that AI can save schools money is an illusion. 


A deepfake, if you will. 


The problem with applying AI and technology solutions to education broadly is that technologists don’t mind the idea of a teacher being replaced by technology (it’s no secret that a mantra in EdTech for years has been to cast the teacher as a “guide on the side” of the digital classroom). It is easy to survey all the problems in education (and there are many – underfunded schools, staffing cuts, overworked and burned-out teachers, too-large class sizes, student mental health, and much more) and argue that technological solutions can “fix” everything.

A technologist can view this as a technical problem, see AI as the solution, and jump to the devastating conclusion that we just don’t need as many teachers.

Of course, anyone who works in education or with children knows that this is a deeply wrong conclusion on so many levels. 

I’m not just saying this because I was a teacher, I love teachers, and we need teachers. 

I’m saying this because we need teachers now more than ever. 

Why? Because the type of learning that eventually takes us to places where we can invent things like AI and other technological revolutions requires several things to happen first:

  1. Technology claims to make our lives easier, but learning happens in moments of friction. We don’t learn new things because they are easy; we learn new things when we struggle, persevere, and move through moments of difficulty to a new concept or solution. Technology in classrooms paradoxically makes learning “too easy,” and intellectual laziness serves no one.

  2. Technology might help us explain complex concepts, but making real-world connections about our humanity requires the human mind. One of the most vital aspects of learning is the ability to make connections across disciplines: to read a philosophy text and connect it to a math assignment, an ancient poem, or a current news story. AI can tell us what these connections are, but knowing what they are is not the same as learning and understanding their meaning and significance for ourselves. Information is different from knowledge. True learning occurs when our minds open up possibilities and new ways of thinking or seeing the world. Intellectual curiosity and connection lead to new ideas. New neural pathways are formed. The complexity of what it means to be human is grappled with. AI can superficially mimic this, but mimicry is not understanding.

  3. Learning happens in the context of relationships. We all have stories of a teacher who changed our lives in some way. Learning in a relationship with another person is imperative because humans are inherently social creatures, and we depend on one another for survival in more ways than one. When we try to learn “alone,” we are not challenged to think more broadly or to question our own assumptions. AI won’t solve this – it will only draw conclusions that have already been drawn – because it looks at things real humans have already created, written, or said and uses that to make an inference about what you are trying to learn. It is not a new way of thinking and connecting ideas. And of course, as anyone who works with children knows, children are not standardized. A human teacher learns the quirks and personalities of their students, knows what strategies and words will best help them, and understands how hard to push and when to pull back. AI may offer this superficially, but superficiality is not the foundation of a healthy relationship.


Phew. This is a lot. Heady stuff. But I won’t ever present something this big without offering you some solutions. 

What can we do?

  1. Fight for the Teaching Profession. More than ever, we need to stand up for the teaching profession. Yes, we still need to fight for smaller class sizes. Yes, we need to advocate for much better pay. But to that list, we also need to advocate for teachers themselves–for their skills as humans who know how to work with, guide, mentor, and teach children. 

  2. Say No to EdTech and AI in schools. I’ve been beating this drum steadily for a while, and I believe more now than ever in the importance of parents speaking up and saying no to all the EdTech. As one of many concerning data points, schools use an average of 150 unique EdTech platforms. Even more startling? A staggering 96% of those applications sell our children’s data to third parties. Without consent. And relying on 150 distinct platforms to “learn” is not developmentally necessary for anyone. Children need 150 classmates and teachers. Not apps. How can parents do this? By opting out of the tech for school. Your child was likely opted in without your permission and certainly without a full understanding of what was being consented to. Now it’s time to say, “No thanks. We’re opting out.” I have a letter you can use as a template here.

  3. Be Skeptical, But Curious. Not all tech is evil. There are likely some transformational ways that AI can revolutionize our world. But a healthy dose of skepticism must accompany the current hype – once the pixie dust settles, the problems will emerge. Technology, like humans, after all, is constantly evolving.

  4. Return to What We Know Works: Paper, Pencil, People, Nature. Decades of research on child development and learning have established best practices. Tactile, hands-on, three-dimensional tools, time outside, and free play with friends – these are the things children need to learn, grow, and develop into competent, thriving humans. And yes, eventually, into the humans who will develop more complex technologies someday. Technologists send their own children to Waldorf and nature-based schools. We should all demand that for all our children.

All technologists today—yes, even the Zuckerbergs and Gates of the world—had teachers who guided, inspired, and supported them. Bill Gates didn’t sit in his parents' garage tinkering with computer parts just because someone told him to. He did it because he was interested, curious, and wondering. So he tinkered.

While everything seems to have changed in the world today, in terms of how children learn, nothing has changed. Children today are still naturally curious. They wonder about the world. They ask questions constantly. They learn by getting messy and dirty and in their interactions with others, both peers and mentors.

And the people who nurture and shape those curious children are their teachers. 

We don’t learn because something is easy. We learn because something is sticky, tricky, or challenging. We learn in moments of friction – when we have to struggle with an idea or a concept. We learn in the context of a relationship with peers, teachers, and parents. And in moments of struggle in a classroom, we need more than a bot: we need someone to sit beside us, put a hand on our shoulder, and say, “Hey! I know this is hard! I know you can do it because I know you as a person. I know what you’re capable of.”


Technology may erode friction, but in doing so it also erodes the opportunity for learning. A teacher’s role is far more than that of an “imparter of knowledge.”


We need dreamers, voyagers, and discoverers. We also need nurturers, painters, and poets. And yes, we need mathematicians, scientists, and technologists. 

And all these people, all these humans with their varied experiences and nuances and complexities, come to be through the deep relationships they form with great teachers.

Think what you will about the good or the evils of tech (and there are many of both). But don’t forget about the teachers.


Emily Cherkin’s mission is to empower parents to better understand and balance family screentime by building a Tech-Intentional™ movement.

Copyright © 2024 The Screentime Consultant, LLC | All Rights Reserved. | Tech-Intentional™ and The Screentime Consultant, LLC™ are registered trademarks.
