What's Going On With Roblox?
- Soni Albright

Age-Gating, History, and a Guide for Parents & Educators

What Is Roblox?
Roblox is one of the world’s largest online gaming platforms, with more than 200 million monthly users, many of them children under 13. Unlike a traditional game studio, Roblox Corporation doesn’t create most of the games itself; instead, it provides the platform and tools for users (including minors) to build “experiences” that other players join (user-generated content). This structure allows for enormous creativity, but it also creates inconsistent levels of safety, moderation, and design oversight, especially in the popular social and chat features that connect players of all ages.
Age-Gating in Roblox
Roblox’s upcoming (slated for a January roll-out) “age-gated chat” system marks a major shift in how young users access social features on the platform. To use chat, kids will soon need to verify their age, either by scanning their face through the Roblox app or uploading a government-issued ID. Roblox says these images are processed by a third-party vendor and deleted immediately after verification.
On paper, this sounds promising. In practice, we have seen this story before.
Side note: In Australia’s recent ban on social media for users under 16, Roblox was surprisingly exempt. Regulators stated they were satisfied—for now—with Roblox’s upcoming age-verification changes and also didn’t include gaming platforms in the ban, noting they could revisit the decision later. Given Roblox’s shaky safety record, I have questions.
Roblox’s Track Record: Why Some Parents and Safety Advocates Are Wary
Roblox’s safety record has become a significant point of concern for parents, child safety experts, and some tech-savvy families. In 2025 alone, nearly 80 lawsuits alleging that Roblox’s design enabled child sexual exploitation and grooming were centralized in federal court, with plaintiffs claiming the platform failed to protect young users or warn parents of risks despite marketing itself as family-friendly.
Multiple state attorneys general, including Texas and Louisiana, have sued Roblox, accusing the company of ignoring safety laws and allowing predators and harmful content to proliferate in environments frequented by children.
Families in several jurisdictions have reported cases where predators allegedly used Roblox to contact or groom minors, sometimes moving conversations off-platform to apps like Discord before abuse escalated. In addition, controversies have arisen around how Roblox responds to independent efforts to expose predatory behavior on the platform, such as the removal of a YouTuber known for documenting such incidents, which drew criticism and public backlash.
Though Roblox states that it uses filters, moderation, and law-enforcement collaboration to keep users safe, critics argue that current measures are insufficient given the frequency and severity of documented allegations, leading many informed parents to approach the platform with caution or to seek alternatives for younger children.
Biometrics: A Step Forward or a Familiar Risk?
Age-gating technology does keep getting better, and, if implemented well, it could help reduce the number of adults interacting with young children online.
However, accuracy remains a question, especially across different age groups, skin tones, and devices. System spoofing will also remain a challenge. And even if the technology works flawlessly, biometric systems carry one unavoidable truth:
You can get a new credit card, but you cannot get new biometric markers.
We don’t have to look far for a cautionary tale. In 2025, Discord rolled out age verification that required users, including teens, to upload IDs despite repeated assurances that all images would be “immediately deleted.” A third-party vendor then suffered a major breach, and those IDs, including children’s IDs, were exposed. Those families now live with that risk indefinitely.
Roblox is making similar promises today that their age-verification process will be fast, secure, deleted immediately, and never sold.
This may be true. But parents should know that the industry’s track record (including, but not limited to, Roblox) is uneven at best (source, source).
The Hard Fork Interview: A Lesson in How NOT to Instill Confidence
The recent New York Times Hard Fork podcast interview with Roblox CEO David Baszucki made these concerns even more urgent. When asked about Roblox’s long-standing child-safety issues, especially the risks of open chat for very young users, the CEO repeatedly dodged questions, minimized responsibility, and insisted the company was “managing things well.” He framed safety failures as a byproduct of scale, pointing to other tech companies’ struggles as though this absolved Roblox from proactive responsibility.
He also refused to acknowledge that open chat may simply be inappropriate for children still developing emotional regulation, social-cue recognition, and online boundaries.
But the moment that shocked many listeners, including me, was his openness to introducing prediction-market elements (i.e., gambling mechanics) into Roblox’s ecosystem. For a platform primarily used by 6–12-year-olds, this suggestion revealed a profound misalignment between corporate priorities and the realities of child development.
Instead of inspiring confidence, the interview raised even more questions about Roblox’s judgment.
So what should parents do with this information?
I’m a parent. I have four teenagers who see their online life as synonymous with their offline one. The struggle is real, and no matter how much I think I know, there are always new considerations in this complicated landscape of internet safety for young people.
This is where it gets personal, and families will make different choices:
For me, biometric verification is where we draw the line. If a platform requires my child to hand over their face or ID to access chat, that is the moment we look for other options.
For other families, especially those navigating older tweens and teens who are increasingly social online, the calculation may be different. There isn’t one “correct” approach, just informed choices.
What Parents Can Do Instead
Parental controls shouldn’t require a degree in IT. Between Roblox’s confusing settings (they are not alone in confounding parental controls) and the fragmentation of safety tools across different platforms, managing a child’s digital life is a massive time commitment that many families simply cannot afford. Dig into the controls if you can, and don’t beat yourself up if you can’t.
Here are some realistic, nonjudgmental alternatives that have worked in my own home or for families I support.
1. Use Your Router as the Primary Parental Controls
For families who want strong safeguards with minimal day-to-day hassle, your router is one of the most effective tools you already own. Modern routers allow you to manage internet access for every device on your home network from one place. Through the router’s admin panel or companion app, you can set internet “curfews,” block adult content, enforce SafeSearch, limit access to certain websites or apps, and monitor which devices are online. This creates a centralized layer of protection that applies across gaming consoles, tablets, phones, smart TVs - literally everything connected in your home.
If your child uses Roblox, you’ll still need to manage in-game settings like chat or purchases yourself in the Roblox parental controls - your router can’t control those. But a router can block Roblox entirely, set time limits, or restrict when devices can go online, giving you an additional layer of control.
It’s not perfect, but learning your router settings offers one of the highest-return safety steps you can take, making your home’s digital environment safer and more predictable for kids. And for those saying, “But kids can get around parental controls…” yep, I know. Jump ahead to #5.
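To make this concrete: many routers (and network-wide filters like Pi-hole) use the dnsmasq DNS server under the hood, and a whole-network block for a platform can be just a couple of configuration lines. This is only a sketch, assuming a router that exposes its dnsmasq configuration (OpenWrt, for example); most consumer routers offer the same result through a “block site” menu in their app instead. The `rbxcdn.com` entry is included on the assumption that it is Roblox’s content-delivery domain, as commonly reported.

```ini
# dnsmasq configuration (e.g., /etc/dnsmasq.conf, or a file in /etc/dnsmasq.d/)
# Answer lookups for roblox.com and all of its subdomains with an
# unroutable address, which blocks the platform for every device on the network.
address=/roblox.com/0.0.0.0
address=/rbxcdn.com/0.0.0.0
```

Keep in mind that a DNS block is coarse: it can’t manage per-game settings like chat, and a determined kid with a VPN or a manually configured DNS server can route around it, which is exactly why the media-literacy piece in #5 matters.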
2. Use Safer Platforms or Private Spaces When Possible
One thing that worked well for us was setting up a whitelisted, private Minecraft server.
It costs $8/month (though there are more expensive options) and allows my child to invite only their friends. Roblox has this option, too, as do other similar social games.
There are limits, of course, especially as kids get older and want to participate in larger multiplayer worlds. But for the elementary years, this option was a (ahem) game-changer.
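For families who prefer to self-host rather than rent a server, the whitelist is just two settings plus a console command on a stock Minecraft Java Edition server. A minimal sketch (the player name is a placeholder):

```ini
# server.properties — restrict a self-hosted Java server to invited players only
white-list=true          # only players on the whitelist may join
enforce-whitelist=true   # also removes online players who are taken off the list

# Then, in the server console, add each invited friend by username:
#   whitelist add AlexExample
#   whitelist reload
```

The practical effect is the same as the paid hosted option: nobody your child hasn’t explicitly invited can connect, no matter who finds the server address.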
3. Keep Devices in Shared Areas
When my kids play online games with open chat features (Minecraft Realms, older-kid group play, MMOGs, etc.), we keep the computer facing the living room and kitchen, where life is happening. We also restrict certain platforms (Steam, for example) to the family-area computer.
Not in a “hovering over them” way, just integrated into our home ecosystem.
This does two important things:
It increases natural supervision and encourages appropriate online behavior.
It gives us opportunities for casual conversation: “Oh wow, what are you building?” or “Who are you playing with?”
Kids talk more when they feel invited, not interrogated. I also learn a lot!
4. Play the Game With Them (Or Ask Someone Who Can!)
If you’re unsure about Roblox or any game, play it yourself or, better yet, with your child. Seeing the gameplay helps you understand what the controls do and how changes affect your child’s experience.
And if you don’t have the time or desire? That’s okay. Ask someone in your circle, such as a young adult, an older sibling, a babysitter, or a family friend, to play a session with your child and report back.
I did this once and was surprised by how much the young person noticed right away, including that my child had accidentally shared personal information in the chat by answering a question about where players were located.
Not every parent has time to comb through settings, but many have someone in their life who can help and would appreciate the opportunity to introduce someone to a game they love!
5. Media Literacy Education
Even the strongest parental controls can’t stop everything all the time. What will help is giving kids the language and tools they need to understand what’s happening. We know that media literacy education works (source, source, source), that not enough U.S. kids (only about 34% in 2023–2024) receive regular instruction in it, and that kids actually want more!
Here are the core media literacy skills that directly reduce online risk and can be incorporated into everyday conversations with the kids in your life:
Teach kids what personally identifiable information is.
The basic things that should not be shared online, under any circumstances, are your full name, home address, school name, phone number, passwords, and any photos or details that reveal where you live or spend time. Kids need clear examples so they can recognize when a seemingly harmless question is actually asking for personal information.
Discuss grooming behaviors openly.
Grooming is a deliberate process that typically follows recognizable stages: building trust, isolating the child, escalating boundaries, and creating secrecy or dependency. Kids don’t need graphic detail, but they do need to know that these patterns exist so they can spot them early and come to a trusted adult right away.
Explain sextortion. Not to scare them, but to inoculate them.
Sextortion among youth, especially preteens, is rising at an alarming rate, and some cases have resulted in devastating, even fatal outcomes. Kids need to hear clearly and often that they will never be in trouble for telling a trusted adult if someone pressures or threatens them online.
Talk about “people you or your family know in real life”; avoid the word “strangers.”
Use language like “people you or your family know in real life” instead of “strangers.” Once kids interact with someone online, they often stop thinking of that person as a stranger—something predators rely on. Talk openly about how adults and older teens can pretend to be any age online, and that anyone behind a screen can misrepresent who they are.
Use real news stories to build context.
Share age-appropriate news about platforms they actually use, like Roblox, Snapchat, Discord, and Instagram, and talk about what happened and why. Real examples help kids recognize similar red flags in their own online experiences.
Tell them: “If something happens online and you come to me for help, you will not be in trouble.”
Kids need to know they are safe coming forward. You can still teach and guide them later, but the first priority is keeping the lines of communication open because shame keeps kids silent, and silence keeps them unsafe.
Media literacy doesn’t eliminate risk, but it reduces vulnerability, and it empowers kids to recognize danger before it escalates. If your child’s school is not teaching media literacy as a regular part of the curriculum, talk to your teachers and administrators about implementing this crucial education! There are several options out there, but one clearly stands out as the best. Cyber Civics has (sometimes many) standalone lessons for each of the above topics…and then some.
Media literacy is most effective when it’s taught consistently throughout childhood and adolescence, both before and during a young person’s access to online life. Its impact is even stronger when paired with policies that put children’s needs first and require platforms to create safer, more developmentally appropriate environments.
6. Teach Kids to Expect More
One of the most powerful choices families can make is to remember that we don’t have to accept a platform’s design as inevitable. If a company’s safety practices or data policies feel out of alignment with your values, you are fully empowered to move your family’s time, attention, and money elsewhere. And make no mistake: if enough families walk away, companies like Roblox will respond. Nothing motivates a platform to prioritize safety faster than declining engagement.
Right now, Roblox has little incentive to change. The company is accountable to shareholders, not to the children and families who make up most of its user base. But it shouldn’t be a given that participating in digital life means sacrificing privacy or assuming unnecessary risk. The responsibility for creating ethical, protective, privacy-forward environments should rest primarily on the platforms, not on parents scrambling to retrofit safety after the fact.
This is something we can model for and teach our kids: that technology can and should be safer, and that they deserve platforms designed with their well-being at the center.
As a parent and media literacy educator, I have additional questions about Roblox’s practices and plan to explore them further in a future post, including its predatory monetization model and the potential expansion of prediction-market (gambling) components. Until then, stay tuned!

Soni Albright is a teacher, parent educator, curriculum specialist, researcher, and writer for Cyber Civics with nearly 24 years of experience in education. She has taught the Cyber Civics curriculum for 14 years and currently works directly with students while also supporting families and educators. Her experience spans a wide range of school settings—including Waldorf, Montessori, public, charter, and homeschool co-ops. Soni regularly leads professional development workshops and is passionate about helping schools build thoughtful, age-appropriate digital literacy programs. Please visit: https://www.cybercivics.com/parent-presentations



