Video game maker Roblox tightens messaging rules
Roblox Introduces New Safety Measures to Protect Young Users
In a move to enhance child safety on its platform, popular video game maker Roblox has announced the rollout of new safety measures that will prevent users under the age of 13 from directly messaging others without parental consent.
With around 89 million users last quarter, Roblox is taking proactive steps to give parents and caregivers more control over their child’s Roblox account, including the ability to remotely monitor friend lists, set spending controls, and manage screen time.
The decision to implement these safety measures comes in response to concerns surrounding child abuse on the platform. In August, access to Roblox was blocked in Turkey following a court order due to concerns about user-generated content potentially leading to abuse.
Additionally, a lawsuit filed in San Francisco in 2022 alleged that Roblox played a role in the sexual and financial exploitation of a California girl by adult men. The lawsuit claimed that the platform encouraged the girl to engage in harmful behaviors and share explicit content.
To address these issues, Roblox has introduced a new setting that limits users under 13 to public broadcast messages within games or experiences. The platform will also replace age-based content labels with content descriptors such as “Minimal” and “Restricted” to indicate the type of content users can expect.
Users under the age of nine will, by default, only be able to access games labeled “Minimal” or “Mild”. Users under 13 will also be barred from searching for, discovering, or playing unlabeled experiences on the platform.
Restricted content will remain inaccessible until a user turns 17 and verifies their age. These new safety measures aim to create a safer environment for young users on Roblox.
(Reporting by Rishi Kant in Bengaluru; Editing by Tasim Zahid)