AI Literacy Is Just Digital and Media Literacy in Disguise
- Diana Graber
- Oct 9
- 4 min read

Key points
An executive order, Advancing Artificial Intelligence Education for American Youth, calls for AI literacy.
AI literacy is suddenly an educational priority in K-12 schools.
Media and digital literacy lay the foundation for AI literacy.
Critical thinking, ethical awareness, and responsible participation are essential to using AI wisely.
There’s a lot of excitement surrounding AI literacy right now, and it’s (finally) giving digital and media literacy their long-overdue moment in the sun.
In April 2025, President Trump signed an executive order, Advancing Artificial Intelligence Education for American Youth. The initiative is designed to promote AI literacy and proficiency among K-12 students and teachers, ensuring that America’s youth gain early exposure to AI and positioning the nation to remain a global leader in this transformative technology.
In other words, AI literacy has suddenly become a national education priority, and schools are scrambling to determine how to implement it.
But here’s the irony: We’ve never had an executive order for digital literacy (the safe and responsible use of technology) or media literacy (the ability to access, analyze, evaluate, create, and act using all forms of communication), even though AI literacy is essentially an extension of these literacies. In fact, you really can’t teach AI literacy without digital and media literacy as its foundation. The core competencies they develop—critical thinking, ethical awareness, and responsible participation—are essential to understanding AI and using it wisely.
It took AI—with its billions in investments, immense potential, and profound risks—to make policymakers finally pay attention to what has been right in front of us all along.
Unmistakable Parallels
While AI literacy feels urgent and innovative, it simply amplifies the same digital and media topics that should have already been national educational priorities:
1. Screen Time and Well-being
Teaching students how and why to balance their time with technology has always been a core component of digital literacy. Even as phones are being banned in schools across the U.S., teaching self-regulation and digital well-being is more critical than ever. New AI-powered tools are creating increasingly irresistible distractions, waiting for students when they get home from school.
In just the last few weeks, OpenAI released Sora 2, or what NPR calls “Deepfake TikTok,” while Meta released Vibes, an AI video feed where users can browse, remix, and share synthetic videos directly to social media. Both are poised to flood the Internet with even more addictive content. These, and whatever AI tools come next, demand the same psychological resilience and self-awareness that comprehensive digital literacy education develops.
2. Misinformation
Misinformation is nothing new, but generative AI has certainly supercharged it. AI-generated stories, videos, and images, quick and easy to produce, are now everywhere online. Students with a strong foundation in media literacy already know how to evaluate such content by questioning sources, verifying authenticity, and recognizing emotional triggers. These skills become increasingly essential in an AI-fueled information landscape.
3. Digital Citizenship
Students today have unprecedented power, and with that power comes responsibility. Digital citizenship teaches them to think critically about what they post, make, and share, and to understand that their digital actions have a lasting impact on themselves and others. Unfortunately, the examples they see, even from the highest levels of government and public life, often model misinformation, bullying, and impulsive behavior. That’s why it’s more urgent than ever for students to learn how to use new technologies thoughtfully, truthfully, and compassionately.
4. Cyberbullying
AI tools are enabling new and increasingly harmful forms of cyberbullying. From using AI to make someone appear nude (known as a “deepnude”) to sextortion schemes that clone voices and manipulate images to extort unsuspecting victims, the risks are severe and psychologically damaging. Teaching empathy, respect, kindness, and responsibility, the cornerstones of digital citizenship, can help curb new forms of cyberbullying and make students vigilant in protecting themselves and their peers.
5. Privacy
For years, digital and media literacy curricula have taught students how and why social media platforms and websites collect and use personal data and why protecting their privacy matters. Those same lessons apply directly to AI. Students accustomed to questioning the motives behind data collection are better prepared to apply the same critical thinking to the AI systems they encounter.
6. Online Safety
The fundamentals of online safety haven’t changed: staying alert to scams, protecting personal data, avoiding harmful interactions, thinking before sharing, and understanding the permanence of one’s online actions. AI has simply raised the stakes, with chatbots, synthetic influencers, algorithmically generated “friends,” and more now populating students’ digital world.
7. Visual Literacy
Examining visual clues to determine whether an image or video is real or not has always been core to media literacy. Today, visual literacy is a frontline defense when it comes to AI. Teaching students how images and video can be generated or manipulated to persuade, deceive, or evoke emotion is no longer optional; it’s a civic necessity.
This list just scratches the surface, leaving out equally important topics like copyright, bias, equity, and emotional well-being. All of these are integral, and now impossible to ignore, and students should be exploring them in school.
Nothing New Under the Sun
The sudden spotlight on AI literacy may feel new and urgent, but it really shouldn’t. It simply amplifies the urgency of teaching what we should have been teaching all along—how to think critically, act ethically and safely, and participate responsibly in a digital world, with an added layer of understanding how AI systems work.
When we hand young people increasingly powerful tools without this education, they pay the price. We’ve already seen it with dangerous and sexually inappropriate interactions with chatbots, the spread of harmful AI-generated misinformation, and the rise of nudification and deepfake apps, just for starters.
If AI is what it takes to wake us up to the need for digital and media literacy, that’s okay—but let’s not wait for the next new tool to remind us again.