
AI Trailblazer Google Doesn’t Want Schools to ‘Bypass the Human’


In an Attempt to Combat Cheating and Plagiarism, the Search Giant Is Stressing the Importance of Teachers


Eamonn Fitzmaurice/The 74, Jacques-Louis David

This article was originally published by The 74, a nonprofit news organization covering education in America. It is republished here with permission. The original article by Greg Toppo can be found here: https://www.the74million.org/article/ai-trailblazer-google-doesnt-want-schools-to-bypass-the-human/


The Role of Teachers in Digital Learning


In 1999, Indian computer scientist and educational theorist Sugata Mitra launched an audacious learning experiment. He and his colleagues at the National Institute of Information Technology cut a hole in a street-level wall of their New Delhi office building. They mounted an Internet-connected personal computer, accessible to anyone passing by. There were no instructions, suggestions, or lesson plans—just access.


Within hours, children from a nearby slum appeared and glued themselves to the computer. They learned how to use the mouse, download games and music, play videos, and surf the web—all by teaching themselves.


Mitra called this “minimally invasive education.” The experiment was replicated worldwide and became hugely influential in the ed tech world, suggesting that children need only access to the right tools to teach themselves.


Dr Sugata Mitra in front of his ‘hole in the wall’ experiment.

But don’t mention Mitra too enthusiastically to Ben Gomes, the computer scientist who co-leads Google’s education efforts. While the “hole in the wall” experiment is a hopeful story, he believes it misses a key element: teachers.



"People are fundamental in the learning process. People learn from other people, and people learn because of other people."

Ben Gomes, Google



The Importance of Pedagogy


“We are paying attention to pedagogy, and we’re working with the teachers,” Gomes said. “We’re not saying we just want a thousand flowers to bloom randomly.”


As AI becomes more common in schools, Gomes insists that Google has a duty to train teachers. It’s not just about using products; it’s about helping students transition from taking shortcuts to using AI for deeper, independent learning.


This strategy could address longstanding complaints that ed tech is focused on replacing teachers with tools that don’t measure up. “It’s a belief backed by science that people are fundamental in the learning process,” Gomes said. “People learn from other people, and people learn because of other people.”


Children can learn independently, but deep understanding and literacy require guidance. This is especially true now, nearly three decades after Mitra’s experiment, as many developers seek to replace teachers with AI.


“Teachers are critical in this process,” Gomes emphasized. “We don’t want to bypass the human.”


AI as a Thought Partner


In a recent white paper, Gomes and colleagues explored how AI could help reverse global declines in learning, proposing that it support teachers and enhance personalization. In mid-January, Google announced it was doubling down on AI in the classroom, offering its AI-driven Gemini app free to more educators and students. The offering includes tools like full-length practice SATs and a partnership with Khan Academy to power a writing-coach tool.


The search giant has appointed Julia Wilkowski, a neuroscientist and former NASA trainer who has also taught sixth-grade math and science, to lead much of this effort. She began her career at an outdoor environmental school, where she recalls hiking trips on which she would ask students to figure out the velocity of a stream using only an orange, a length of string, and a stopwatch.


Wilkowski now spends “pretty much 100% of my time” ensuring that Google’s AI for students is based on sound learning science.


In interviews, Gomes and Wilkowski discussed their work, often admitting that much of it is about helping teachers find ways to get students to stop outsourcing their thinking.


“Teachers have the opportunity to teach their students how to use these tools ethically and effectively that don’t bypass those critical thinking skills,” Wilkowski said.


For example, she has worked with English teachers to help them instruct students on using AI as “a thought partner” in essay writing, not as the writer itself. These teachers have succeeded by breaking down essay writing into its components and openly discussing its goals. They use AI to help students brainstorm topics, refine thesis statements, generate first drafts, and offer feedback. This gives students “guidance and guardrails” without allowing them to submit AI-written essays.


The work, which started a year and a half ago, “has really informed my optimism about how AI can be used successfully,” she said.


Guided Learning


Both Wilkowski and Gomes frequently mention “guided learning.” They believe students learn best when they move beyond simple answers to develop their own ideas and think critically. To achieve this, teachers must guide them with carefully designed questions.



"There's no published research showing that GenAI chatbots have the pedagogical content knowledge to be effective Socratic tutors."

Amanda Bickerstaff, AI for Education




The Challenge of AI in Education


Perhaps unsurprisingly, Google has an app for guided learning. A section of Gemini acts like a private tutor or guide, offering students a taste of “productive struggle”: it engages and challenges them without providing immediate answers, instead steering them toward solutions through a series of questions.


Gomes explained that the principle is being integrated into most of Google’s AI products, including a newer one called Learn Your Way. This tool helps students learn topics interactively and in more appealing ways than traditional textbooks. It offers text with quizzes, narrated slideshows, audio lessons, and “mind maps” that connect related ideas graphically.


At its core, Gomes believes the dilemma over AI and cheating stems from motivation. “If I look back at my own childhood, there were times I was just interested in getting something done for tomorrow,” he reflected. “But there were also times when I was curious and wanted to read more.”


How much time students spend in one state versus the other varies, he said. The goal, as he sees it, is to get more people into the motivated state.


However, Amanda Bickerstaff, co-founder and CEO of AI for Education, argues that the reasons students turn to AI are “far more complicated than lack of motivation.”


Students face challenges like “perfectionism, high-stakes assessments that prioritize grades, and skill and language gaps.” She believes framing this primarily as a motivation issue oversimplifies what’s happening in classrooms.


Bickerstaff pointed out that Google’s shift toward Socratic reasoning “sounds promising,” but there’s a fundamental problem: “There’s no published research showing that GenAI chatbots have the pedagogical content knowledge to be effective Socratic tutors.”


She noted that chatbots are “sycophantic by nature,” often providing answers and completing tasks even when not explicitly asked. “That’s the opposite of productive struggle.”


Most young people also lack sufficient AI literacy to use these tools strategically. “Without that foundation, chatbots become an ‘easy’ button for schoolwork rather than a learning tool. You can’t solve that problem through interface design alone.”


The Need for Better Feedback


Wilkowski believes much of the struggle over AI comes down to feedback: How much should students receive, how often, and what should it look like?


She shared a personal story about her daughter, who was required to write an essay for a final exam in December. When Wilkowski spoke to The 74 in early January, the essay still hadn’t been graded.


“I would rather have AI-generated feedback,” she said. “Give the first draft, and then the teacher can review it before giving it to the students.”



"Teachers have the opportunity to teach their students how to use these tools ethically and effectively that don't bypass those critical thinking skills."

Julia Wilkowski, Google




Rethinking Assessments


More broadly, Wilkowski believes AI could soon change how students are assessed altogether. It could help teachers move away from multiple-choice tests, which have known issues. While they are easy to create, administer, and grade, they allow students to guess rather than demonstrate understanding. They encourage rote memorization rather than deeper engagement with material.


Multiple-choice tests also fail to evaluate higher-order thinking skills, creativity, student writing, or the ability to construct arguments. If AI can simplify grading essays or long-form questions, wouldn’t that render multiple-choice tests obsolete?


“Imagine you’re in physics class studying acceleration versus time graphs and you ride your bike home,” Wilkowski said. “An AI tool might pop up and say, ‘Hey, here’s your acceleration versus time graph of your bike ride home. What did you notice about your velocity? How did it change as you changed acceleration? Was there a hill you had to overcome?’”


More relevant assignments and assessments could encourage students to think critically, integrating school into their real lives in deeper ways. “It goes back to what excited me as a teacher: those engaging, hands-on lessons. I see a way that AI can facilitate those in the future.”


Bickerstaff from AI for Education finds it encouraging to see Google working on more “fit-for-purpose tools” for students.


“The education sector desperately needs companies to move beyond general-purpose chatbots and build tools that actually support cognitive work rather than replace it,” she said. “But there’s still a lot of work to do—and a lot of research that needs to happen—before we can know if these tools are effective learning guides.”




At Cyber Civics, we believe students deserve guidance—not just access—when it comes to technology. Our research-based curriculum helps schools teach critical thinking, empathy, and responsible decision-making in digital spaces.


Learn more about Cyber Civics here: https://www.cybercivics.com/
