Roblox’s Age Group System & The Death of Useful Chat
This site documents how Roblox’s new age-check + age-group chat rules are breaking communication for players, developers, and roleplay communities — and places that in the context of lawsuits, safety failures, and the Schlep controversy.
Roblox now requires facial age estimation or ID for chat. Players are split into age brackets (like 13–15, 16–17, 18–20, 21+). Chat is limited to “same or similar” age groups.
- Unverified? You can lose access to chat entirely.
- Roleplay games? Many rely on cross-age text chat to function.
- Privacy? Critics worry about handing selfies/ID to third parties.
What actually changed?
Roblox’s New Age Groups & Chat Lockdown
Roblox is rolling out mandatory age-checks (facial age estimation or ID) tied to new age groups:
How it works (officially)
- Complete an age-check → your account is placed in a group.
- You can chat only with people in your own or "similar" age groups (see the sketch below).
- In the early rollout (Australia, NZ, Netherlands), the rules apply "within these regions" first, then globally.
The idea is to reduce direct contact between adults and minors.
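To make the band logic concrete, here is a minimal sketch of what a "same or similar age group" check could look like. This is not Roblox's implementation: the band boundaries, the adjacency rule, and every name below (`AGE_BANDS`, `canChat`) are assumptions based only on the public description above.

```typescript
// Hypothetical "same or similar age group" chat check.
// Bands and the adjacency rule are assumptions, not Roblox's real logic.
const AGE_BANDS = [
  { name: "13-15", min: 13, max: 15 },
  { name: "16-17", min: 16, max: 17 },
  { name: "18-20", min: 18, max: 20 },
  { name: "21+",   min: 21, max: Infinity },
];

// Map a verified age to its band index; null means no chat
// (unverified, or an age outside the bands modelled here).
function bandIndex(age: number | null): number | null {
  if (age === null) return null;
  const i = AGE_BANDS.findIndex((b) => age >= b.min && age <= b.max);
  return i === -1 ? null : i;
}

// Same band always chats; "similar" is read here as one band apart.
function canChat(ageA: number | null, ageB: number | null, allowAdjacent = true): boolean {
  const a = bandIndex(ageA);
  const b = bandIndex(ageB);
  if (a === null || b === null) return false;
  return Math.abs(a - b) <= (allowAdjacent ? 1 : 0);
}

console.log(canChat(15, 17, false)); // false: strict same-band-only splits the siblings
console.log(canChat(15, 17, true));  // true:  an adjacent-band reading lets them talk
console.log(canChat(null, 17));      // false: unverified users lose chat entirely
console.log(canChat(14, 21));        // false either way: 13-15 and 21+ are far apart
```

Even in this toy version the tension is visible: whether a 15-year-old and a 17-year-old sibling can talk depends entirely on where the boundaries fall and on how generously "similar" is interpreted.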
What players actually feel
- Chat is more restricted, even in normal games.
- People who refuse to verify (privacy or safety reasons) lose the ability to talk.
- Devs running roleplay / social / trading games are left with silent servers.
It addresses some safety problems, but only by making communication itself extremely hard.
Impact on the platform
Why so many players think chat is “useless” now
Since the announcement, developers, roleplay group owners, and regular players have flooded social media with frustrated reactions.
Why it hits big games so hard
- Roleplay and social games depend on free chat.
- Players in neighbouring bands (e.g. a 15-year-old and a 17-year-old sibling) can be blocked from each other by strict age bands, exactly as the sketch above demonstrates.
- Many players use in-game chat instead of external apps — that channel is now heavily filtered or turned off.
Privacy & trust issues
- Not everyone wants to send a selfie/ID to a third party.
- People worry about data leaks, misuse, and face data being stored or reused.
- It feels like Roblox is scanning players just so they can keep talking, rather than fixing moderation.
The result: yes, predators may have a harder time reaching kids via in-game chat — but innocent players, devs, and communities lose a core part of what makes Roblox social.
Context: safety & backlash
The Schlep Controversy
In 2025, Roblox permanently banned Schlep, a Texas-based YouTuber who posed as young players on Roblox to expose alleged predators and worked with law enforcement in sting-style operations. Roblox also sent him a cease-and-desist letter threatening legal action if he continued his activities on the platform.
The company argued that “vigilante” operations:
- Violate its rules and user privacy.
- Can interfere with real police investigations.
- Should be replaced by reporting through official channels.
Many players and creators saw it differently. The ban sparked the #FreeSchlep movement, in-game protests, and widespread criticism that Roblox was targeting someone trying to catch predators while the platform itself still struggled to protect kids.
Why this matters for the age-check debate
The Schlep incident became a symbol of what critics see as a pattern:
- Aggressively controlling image and PR around safety.
- Cracking down on outspoken community members instead of focusing on deeper moderation issues.
- Rolling out heavy systems (like age checks and age bands) that punish regular players while predators still find ways in.
Legal & media fallout
After the ban, Schlep announced plans to take legal action against Roblox, claiming the platform enabled abuse for years and retaliated when he exposed predators. At the same time, multiple lawsuits from U.S. states and families accused Roblox of failing to keep children safe.
This controversy is one of the reasons Roblox is now under intense legal and political pressure — which directly feeds into the push for mandatory age-checks and strict chat limits.
Legal pressure
Lawsuits & governments vs Roblox
Roblox is facing a wave of lawsuits and government actions accusing it of not doing enough to protect children from grooming, sexual content, and exploitation on the platform.
Examples of legal action
- U.S. state attorneys general have sued Roblox, calling it a “perfect place for predators” and claiming its systems failed to prevent exploitation.
- Parents have filed lawsuits after alleged grooming cases involving Roblox chat and experiences with sexual content.
- Internationally, some countries have banned or heavily restricted Roblox over concerns about harmful content.
How Roblox responded
- Publicly promoting new safety features, including facial age checks and narrower age-group chat.
- Talking about hundreds of “safety initiatives” and increased moderation tools.
- Emphasising compliance with local laws and child safety commitments in press statements.
Critics argue these moves are more about legal defense and PR than truly fixing underlying safety and moderation problems.
Leadership & accountability
Roblox Leadership Under Fire Over Safety & Chat
Roblox’s CEO and security leadership have been heavily criticised in interviews, articles and opinion pieces for downplaying predator risks and being slow to fix problems that players, parents, and watchdogs have been shouting about for years.
Some of the main complaints from the community and commentators:
- Public comments that sounded too optimistic about existing safety systems while lawsuits describe serious harm.
- Decisions like banning Schlep, which many saw as attacking a vocal critic instead of embracing external help in catching predators.
- Choosing intrusive solutions (scanning faces, strict age bands) instead of deeply fixing moderation, discovery, and reporting tools.
This site doesn’t claim to know the intentions of any Roblox executive — but based on public reporting, many people believe that corporate image and investor comfort are being prioritised over practical, player-friendly safety tools.
For your own receipts
Screenshots, Clips & Evidence Hub
Screenshot: Roblox announcing the new age verification method.
Suggestions
How Roblox Could Fix This Without Killing Chat
A few concrete suggestions for how this could work:
- Allow limited in-game text chat for unverified users (no DMs, no links, stronger filters) so games can still function.
- Give developers tools and options instead of forcing a single strict chat model on every experience (see the sketch after this list).
- Offer better parental controls and transparency so families can decide, instead of scanning every player’s face to talk.
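As a thought experiment, here is what "developer tools and options" might look like: each experience declares a chat policy, and the platform clamps it to a non-negotiable safety floor. Every name, field, and default below (`ExperienceChatPolicy`, `applyPlatformFloor`, and so on) is invented for illustration; Roblox exposes no such API.

```typescript
// Hypothetical per-experience chat policy; all fields are invented.
type UnverifiedChatMode = "off" | "filteredTextOnly" | "full";

interface ExperienceChatPolicy {
  unverifiedChat: UnverifiedChatMode; // what players who refuse age checks get
  allowDirectMessages: boolean;       // private DMs are riskier than public chat
  allowLinks: boolean;                // links can move conversations off-platform
  maxBandGap: number;                 // how many age bands apart can still chat
}

// Platform-enforced floor: any experience that lets unverified users talk
// gets filtered text only, with DMs and links forced off.
function applyPlatformFloor(p: ExperienceChatPolicy): ExperienceChatPolicy {
  if (p.unverifiedChat === "off") return p;
  return {
    ...p,
    unverifiedChat: "filteredTextOnly",
    allowDirectMessages: false,
    allowLinks: false,
  };
}

// A roleplay game asks for broad public chat; the floor clamps it.
const roleplayPolicy = applyPlatformFloor({
  unverifiedChat: "full",  // clamped to "filteredTextOnly" by the floor
  allowDirectMessages: false,
  allowLinks: false,
  maxBandGap: 2,           // siblings and mixed-age friend groups can talk
});

console.log(roleplayPolicy);
```

The point is the split of responsibility: developers choose how social their experience is, the platform enforces the floor, and nobody has to hand over a face scan just to type "hello" in a public server.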
Right now, it feels like Roblox is trying to solve real problems with a system that makes the platform less fun and less social, and that pushes conversations off-platform instead of making them safer.