Roblox now requires age checks for all users to access chat worldwide

Users must complete facial age checks or ID verification before using chat on Roblox

Nathaniel Lacsina, Senior Web Editor
New policy gates chat behind age verification to boost safety and prevent adult-minor communication.

Roblox, the immersive online gaming and creation platform with tens of millions of users worldwide, has introduced a global policy requiring all users to complete an age check before they can access chat features on the service. The change, which began rolling out in early January 2026, marks the first time a major gaming platform has tied basic social interaction tools directly to verified age information.

The requirement applies wherever Roblox’s chat system is available and follows phased testing in countries including Australia, New Zealand and the Netherlands beginning in December 2025. Users must complete the age check — typically via a facial age estimation process conducted through the Roblox app — before chat features such as text or voice communication become accessible.

Under the system, users are placed into age groups once verified: under 9, 9–12, 13–15, 16–17, 18–20 and 21+. These groupings determine who users can message; for example, younger players are limited to chatting with age-appropriate peers. By default, chat remains off for children under nine unless a parent provides consent after verification.

Roblox says its age check combines on-device facial estimation with other options such as government ID verification for users aged 13 and over, and that images and videos used in age checks are deleted immediately after processing to protect privacy. Users who dispute their estimated age can appeal or choose alternative verification methods, including parental controls.

The policy update arrives amid heightened scrutiny of children’s safety on online platforms. Roblox has previously faced criticism from parents and legal complaints over the risks of unfiltered contact between adults and minors on its network. Requiring age checks is part of a broader safety initiative that includes stricter communication rules, enhanced content moderation and expanded parental tools.

For users and developers, the shift could have lasting implications for how social features are used and moderated across the platform, especially in communities with mixed age ranges.
