Roblox’s mandatory chat verification exposes technical flaws as GCC scrutiny of child safety grows

Dubai: When Roblox began requiring all users worldwide to verify their age before accessing chat features earlier this year, the change was framed as a decisive step to strengthen child safety on one of the world’s most popular gaming platforms.
For parents in the UAE and across the Gulf, the announcement appeared to align with regional efforts to tighten online protections for children. Instead, the rollout has exposed technical flaws, raised privacy concerns and drawn attention to gaps that critics say could undermine the very safeguards Roblox set out to build.
The controversy has renewed scrutiny of Roblox at a time when the company is already facing lawsuits, regulatory investigations and a growing list of countries willing to block or restrict access to the platform over child safety concerns.
Roblox’s move toward mandatory age verification did not come in isolation. The company has spent years under mounting pressure from regulators, lawmakers and courts, particularly in the United States and Europe, over allegations that it failed to adequately protect minors from grooming and sexual exploitation.
That pressure has been especially visible in the Middle East. Since late 2025, Roblox has restricted features across several countries, including the UAE and Saudi Arabia, where in-game chat was temporarily suspended while moderation systems and content controls were reviewed in coordination with authorities.
Other countries in the region — including Qatar, Kuwait, Oman, Jordan and Turkey — have imposed partial or full bans at various points, citing concerns that the platform exposed children to harmful or inappropriate material.
Beyond the Middle East, governments elsewhere have taken similarly firm action. Russia shut down access to Roblox in December 2025, with communications regulator Roskomnadzor saying the platform contained content that could negatively affect the “spiritual and moral development” of children. Russian state media reported that the regulator had repeatedly warned Roblox since 2019 to restrict access to prohibited material. The block cut off millions of users in a market where Roblox had previously reported more than two million daily active users.
Most recently, Egypt joined the list of countries blocking Roblox, with the Supreme Council for Media Regulation saying the move was aimed at limiting children’s exposure to harmful online material. Roblox said it had reached out to Egyptian authorities to seek dialogue and restore access, adding that it has worked with regulators in other countries to adapt safety features to local values.
For families in the UAE, these developments underscore that Roblox’s global safety overhaul is unfolding against a backdrop of increasing scepticism from governments worldwide.
Under Roblox’s new rules, users must complete an age check before accessing text or voice chat. The platform relies primarily on facial age estimation technology that analyses a short selfie video to estimate a user’s age and place them into age brackets that determine who they can communicate with.
Users aged 13 and above can instead verify their age by uploading government-issued identification, while children under nine cannot use chat unless a parent explicitly enables it after verification.
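The bracket-based gating described above can be sketched as a simple rule table. This is an illustrative reconstruction only: the bracket boundaries, names and function signatures below are assumptions for the sake of the example, not Roblox’s actual, non-public implementation.

```python
# Illustrative sketch of age-bracket chat gating as described in reporting
# on Roblox's system. Bracket boundaries and the parental-consent rule are
# assumptions for illustration; the real implementation is not public.

def chat_bracket(estimated_age: int) -> str:
    """Map an estimated age to a coarse communication bracket."""
    if estimated_age < 9:
        return "under-9"   # chat disabled unless a parent enables it
    if estimated_age < 13:
        return "9-12"
    if estimated_age < 16:
        return "13-15"
    if estimated_age < 18:
        return "16-17"
    return "18+"


def can_chat(estimated_age: int, parent_enabled: bool = False) -> bool:
    """Under-9 users need explicit parental consent after verification;
    all other brackets may chat, but only within age-appropriate groups."""
    if chat_bracket(estimated_age) == "under-9":
        return parent_enabled
    return True
```

The misclassifications Wired documented matter precisely because everything downstream keys off that single bracket assignment: an adult estimated at 16 to 17, or a child estimated as over 18, is silently routed into the wrong communication group.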
Roblox told TechCrunch that the system is intended to create age-appropriate communication spaces while preserving social interaction, adding that tens of millions of users had already completed age checks during earlier pilot programmes.
Independent reporting suggests the system’s execution has been uneven.
In a detailed investigation, Wired documented multiple cases in which Roblox’s AI incorrectly estimated users’ ages, sometimes placing adults into teenage categories and, in other instances, classifying children as adults.
One adult user told Wired that the system identified him as being between 16 and 17 years old. “I don’t want to be chatting with children,” he said. Another user reported being labelled 13 to 15 years old despite being an adult.
Wired also cited instances in which children were placed into over-18 brackets, undermining the very purpose of age-based safeguards. In testing, some users were able to pass age checks using drawings or non-human images, further raising doubts about the reliability of the technology.
Perhaps most concerning for parents, the investigation found that age-verified Roblox accounts were being sold online, including accounts categorised as belonging to children. Although those listings were removed after being flagged, the episode highlighted how easily the safeguards could be bypassed.
Roblox executives have acknowledged that the system is not flawless.
Matt Kaufman, Roblox’s chief safety officer, told Wired that building age-verification technology at global scale presents significant challenges. “You can’t flip a switch while building something that hasn’t existed before,” he said, adding that the company expected the system to evolve over time rather than work perfectly from launch.
Kaufman said Roblox believes most users value safer communication tools, though critics argue that persistent misclassification risks eroding trust among parents and regulators alike.
Even where the technology works as intended, privacy remains a concern for many families in the UAE and across the GCC.
Roblox says facial images are processed by a third-party provider and deleted immediately after analysis. Still, some parents remain uneasy about biometric data being collected as a condition for accessing social features on a platform used largely by children.
These concerns reflect broader regional debates around digital safety, data protection and the responsibilities of global technology companies operating in local markets.
Some online safety experts say age estimation tools can play a role if implemented carefully. Stephen Balkam, chief executive of the Family Online Safety Institute, has described proactive age-checking technologies as an important step toward safer online environments for children and teenagers.
Legal advocates, however, remain cautious. Lawyers involved in child-safety litigation against Roblox have characterised the age-verification rollout as the company’s most aggressive shift so far, while questioning whether it meaningfully addresses the underlying risks rather than responding to regulatory pressure.
Roblox continues to attract millions of young users worldwide, including in the UAE, where it remains popular for its blend of gaming, creativity and social interaction. Yet its growing list of restrictions, bans and technical shortcomings complicates the decision-making process for families.
For parents, the issue is no longer simply whether Roblox offers parental controls, but whether the systems meant to enforce them are reliable enough to justify confidence in the platform.
As regulators across the Middle East and beyond continue to scrutinise digital platforms, Roblox’s experience highlights a broader challenge facing the industry: safety technologies may signal intent, but when they fall short in practice, they risk deepening the very concerns they were designed to resolve.
For now, UAE parents are left navigating a platform caught between global expansion and unresolved questions about whether artificial intelligence can reliably distinguish adults from children — and whether that distinction alone is enough to keep young users safe online.