Roblox to ban children under 13 from sending private messages

Roblox, one of the world’s most popular online gaming platforms, has announced a new policy to block users under 13 from sending direct messages without parental consent. The changes, aimed at improving child safety, will be fully rolled out by March 2025.

Enhanced parental controls and safety measures

Roblox’s new update will prevent children under 13 from sending private messages across the platform by default. Parents or guardians who want to enable messaging for their child must verify their own identity using a government-issued ID or credit card.

The platform boasts 88 million daily active users and has faced increasing pressure to implement stricter safety measures. Research from Ofcom shows Roblox is particularly popular among children aged 8 to 12 in the UK, making enhanced child protection a critical focus.

“Safety has always been our priority as the platform continues to grow,” said Matt Kaufman, Roblox’s chief safety officer. “We want to ensure a safe and enjoyable experience for all users, regardless of age.”

New tools for parents

The platform is introducing new features that allow parents to monitor and manage their child’s activity with greater ease. The updated parental dashboard will display information such as screen time, friend lists, and content settings. Parents can also set daily playtime limits and restrict access to certain games based on content maturity.

In addition, Roblox will replace age-based game recommendations with content labels. This shift aims to give parents more nuanced information about the nature of each game, helping them decide what is appropriate for their child’s maturity level. Content labels will range from “minimal,” which may include mild violence, to “restricted,” which features strong violence or mature themes.

Under the new guidelines, users under nine will only have access to “minimal” or “mild” content by default, while parental consent is required for children to play games labeled as “moderate.” “Restricted” games will remain inaccessible to anyone under 17 unless they verify their age through the platform’s tools.

New rules for game developers

Starting December 3, Roblox developers will be required to specify whether their games are suitable for children. Games that do not provide this information will be automatically blocked for users under 13. The policy change follows a recent decision to bar under-13s from participating in “social hangouts,” which often involve real-time text or voice chat communication.

Compliance with the UK’s Online Safety Act

The update comes as tech platforms prepare to meet new regulations under the UK’s Online Safety Act, designed to protect children from illegal and harmful content. Ofcom, the regulatory body overseeing the act, has warned companies that they could face penalties if they fail to implement adequate child safety measures. Roblox’s recent changes reflect its effort to comply with these upcoming standards.

Kaufman emphasized the importance of parents’ active role in their child’s online experience. “We encourage parents to work with their kids to set up accounts accurately, ensuring the correct age is entered during sign-up,” he said. “Our goal is to keep all users safe and provide a platform where children can have fun without unnecessary risks.”

Simplified maturity guidelines

To further assist parents, Roblox will simplify its maturity guidelines. Instead of age-based recommendations, the platform will use content labels to indicate the level of maturity. These labels will help parents decide based on their child’s readiness rather than a specific age threshold.

The categories include:

- “Minimal” and “mild”: accessible by default, including to users under nine
- “Moderate”: requires parental consent for younger children
- “Restricted”: features strong violence or mature themes, and remains limited to users aged 17 and over who verify their age

This change aims to make content ratings more transparent and easier for parents to understand.

Looking ahead

As the platform gears up to implement these changes, it remains to be seen how the new policies will affect user engagement, especially among younger players. However, Roblox’s commitment to enhancing safety signals a proactive approach to adapting to the evolving regulatory landscape and growing concerns about online child safety.

The company hopes these measures, combined with active parental involvement, will create a safer environment for its youngest users.
