Roblox has announced sweeping new safety measures aimed at limiting contact between adults and children on its platform by the end of 2025. The gaming site, which attracts more than 110 million daily users worldwide, will begin requiring stricter age verification for anyone using its chat and communication tools.
Currently, Roblox users over the age of 13 can freely communicate with others. Under the new rules, all players must complete an age check or estimation process to use text and voice chat. Those unable to verify they are 18 or older will only be able to interact with users they already know in real life.
Matt Kaufman, Chief Safety Officer, said the initiative targets “bad actors” who try to take advantage of younger players by pretending to be children themselves. “This process gives us more accurate information about how old a user actually is, beyond what they report at sign-up,” Kaufman explained. “It helps ensure young users are protected and limits adult strangers from contacting them.”
The measures come in response to heightened scrutiny of Roblox’s record on child safety. The company submitted over 24,000 reports to the US National Center for Missing and Exploited Children last year, while investigations have highlighted the risks of children encountering predators and inappropriate content. In one case, vigilantes attempting to catch suspected offenders were themselves banned from the platform.
Roblox emphasized its commitment to safety in a company blog post, noting that parents, creators, and legislators all share the goal of making the platform safer. As the $92 billion company continues to grow and partner with entertainment brands such as Netflix and Sega, it is positioning the new system as a core step in protecting younger players.

