Roblox Enhances Child Safety with Facial Verification and Age-Based Chats

This article was generated by AI and cites original sources.

Roblox, the popular online gaming platform, recently introduced mandatory facial age verification for chat access; by January 31, 45% of daily active users had completed age checks. The move responds to child-safety concerns, including grooming and exposure to explicit content.

To complete the age check, users open the Roblox app, grant camera access, and follow on-screen instructions for facial verification. The process is managed by a third-party vendor, Persona, which deletes the captured images and videos after verification to protect user privacy.

Once verified, users gain access to age-based chat, which limits communication to users in the same or adjacent age groups. Roblox defines six age categories, ranging from under 9 to 21-plus, with the goal of fostering a safer online environment.
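The adjacency rule described above can be sketched in a few lines. This is an illustrative model only, not Roblox's actual implementation: the article confirms six bands bounded by "under 9" and "21-plus", but the intermediate band labels here are hypothetical placeholders.

```python
# Illustrative sketch of adjacency-based chat eligibility.
# Only the endpoints ("under 9" and "21+") come from the article;
# the four intermediate band labels are hypothetical placeholders.
AGE_BANDS = ["under_9", "9_12", "13_15", "16_17", "18_20", "21_plus"]

def can_chat(band_a: str, band_b: str) -> bool:
    """Return True if two age bands are the same or adjacent."""
    i, j = AGE_BANDS.index(band_a), AGE_BANDS.index(band_b)
    return abs(i - j) <= 1
```

Under this sketch, a user in the youngest band could chat with the next band up but not with anyone two or more bands away.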

Roblox’s Q4 2025 earnings report revealed that age-checked users skew younger than self-reported data had suggested: 35% are under 13, 38% are aged 13-17, and 27% are over 18.

Users raised concerns about the new process during its rollout, prompting the platform to offer avenues for appealing age-check results, including alternative verification methods and parental controls.

Source: TechCrunch