Roblox has announced that it will block users under the age of 13 from sending messages on its platform as part of new efforts to safeguard children.
Starting on Monday, child users will by default be unable to send direct messages within games unless a verified parent or guardian grants them permission. The changes are set to be fully implemented by the end of March 2025.
Under the new rules, young children will still be able to access public conversations seen by everyone in games, allowing them to communicate with their friends publicly. However, they will not be able to engage in private conversations without parental consent.
Parents will have the ability to view and manage their child's account, including overseeing their list of online friends and setting daily limits on playtime.
Roblox's chief safety officer, Matt Kaufman, said the platform is used by 88 million people each day, and that more than 10% of the company's employees work on its safety features.
"As our platform has grown in scale, we have always recognized that our approach to safety must evolve with it," Kaufman said.
In addition to blocking direct messages for children across the platform, Roblox will give parents more tools to manage their child's activity.
Parents and guardians must verify their identity and age using a government-issued ID or a credit card to access these parental permissions via their own linked account.
Kaufman acknowledged that identity verification is a challenge faced by many tech companies and urged parents to ensure their children use their correct age when creating accounts.
"Our goal is to keep all users safe, no matter what age they are," he said. "We encourage parents to work with their kids to create accounts and ensure that they are using their accurate age when they sign up."
Richard Collard, associate head of policy for child safety online at UK children's charity the NSPCC, called the changes "a positive step in the right direction."
However, he emphasized the need for effective age verification methods to translate these changes into safer experiences for children. "Roblox must make this a priority to robustly tackle the harm taking place on their site and protect young children," Collard added.
Roblox also announced plans to simplify content descriptions on the platform by replacing age recommendations for certain games with "content labels" that outline the nature of the game.
This change will allow parents to make decisions based on their child's maturity rather than their age. The content labels range from "minimal," which might include occasional mild violence or fear, to "restricted," which could contain more mature content such as strong violence, language, or realistic blood.
By default, Roblox users under the age of nine will only be able to access "minimal" or "mild" experiences. However, parents can allow them to play "moderate" games by providing consent. Users will not be able to access "restricted" games until they are at least 17 years old and have verified their age using the platform's tools.
This follows an announcement in November that Roblox would bar users under 13 from "social hangouts," where players can communicate using text or voice messages, starting from Monday.
Additionally, Roblox game creators will be required to specify whether their games are suitable for children; from December 3, users under 13 will be blocked from games that do not provide this information.