International Finance Magazine | Technology

Is Roblox safe enough for young players?

Roblox has already established itself as one of the largest video games in the world, averaging over 80 million players per day in 2024

Roblox, an online game platform and game creation system developed by Roblox Corporation, allows users to program and play games created by themselves or others. The platform is now rolling out new features aimed at making it safer for minors, including a revamped friend system, privacy tools, and age verification that users can complete by submitting a video selfie.

In Roblox’s old friend system, the platform made no distinction between people players knew casually or only online and those they considered close friends. The new tiered system introduces “Connections” and “Trusted Connections,” the latter specifically for people players know and trust. To access “Trusted Connections” and its benefits, users must first complete age verification by submitting a video selfie. Once submitted, the video is compared against an AI-driven “diverse dataset” to estimate the user’s age. If the user appears to be under 13, they automatically lose access to any features not deemed age-appropriate.
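The flow described above — estimate an age from the selfie, restrict apparent under-13s, and fall back to ID checks when the estimate is uncertain — can be sketched as a simple decision function. This is purely illustrative: the function name, confidence threshold, and return labels are assumptions, not Roblox’s actual system.

```python
# Hypothetical sketch of the age-gating logic described in the article.
# Threshold and labels are illustrative assumptions, not Roblox's implementation.

def gate_features(estimated_age: int, confidence: float,
                  high_confidence: float = 0.9) -> str:
    """Decide a user's verification path after the AI age estimate."""
    if confidence < high_confidence:
        # Age could not be determined with high confidence:
        # the user must fall back to government-ID verification.
        return "id_verification_required"
    if estimated_age < 13:
        # Apparent under-13s automatically lose access to features
        # not deemed age-appropriate.
        return "restricted_features"
    # Confidently estimated 13+ users may proceed to Trusted Connections.
    return "trusted_connections_eligible"
```

The key design point the article describes is that a low-confidence estimate never grants access by default; it always escalates to a stronger check.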

Teen users who pass the age check will be able to use the “Trusted Connections” feature to add anyone aged 13 to 17. Anyone 18 or older must be added either via an in-person QR code scan or by phone number. With “Trusted Connections,” Roblox has removed the filters that block inappropriate language and personally identifiable information on party voice and text chats for users 13 and up.
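Put as code, the connection rules above amount to a small policy check: verified teens can add other teens directly, while adults must be added via an in-person QR scan or a phone number. The function and method labels here are illustrative assumptions, not Roblox’s API.

```python
# Hypothetical policy check for a verified teen adding a Trusted Connection,
# following the rules described in the article. Names are illustrative.

def can_add_trusted_connection(requester_age: int, target_age: int,
                               method: str) -> bool:
    """Return True if a verified teen may add the target as a Trusted Connection."""
    if not (13 <= requester_age <= 17):
        # These rules cover age-verified teen requesters only.
        return False
    if 13 <= target_age <= 17:
        # Teens can add other teens directly.
        return True
    # Adults (18+) require an in-person QR scan or a phone number.
    return method in ("qr_scan", "phone_number")
```

The stricter path for adults reflects the design intent the article reports: an adult-teen link should require evidence of a real-world relationship, not just an online friend request.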

These communications remain subject to Roblox’s community standards and moderation, but the company hopes removing filters will keep users on its platform instead of moving to spaces like Discord. By keeping players within Roblox, the company can monitor their activity. Users whose ages cannot be determined with “high confidence” will remain unverified and will need to use ID verification to proceed.

The company says it will allow for parental consent in the future; biometric data is deleted after 30 days, except where required by a warrant or subpoena. When WIRED raised the issue of 13-year-olds not having government-issued IDs with chief safety officer Matt Kaufman, he replied, “That is a problem. In North America, or maybe the United States in particular, that’s not common. In other parts of the world, it is much more common to have a photo ID.”

If a child is unable to obtain verification due to a lack of ID, they can get verified on the gaming platform through their parents. If their parents are unable to do so for any reason, kids won’t be able to use “Trusted Connections.”

The latest move comes amid increasing state-level demands in the US for age verification on online platforms; identity verification company Persona provides the age-screening technology. Several American states, including Utah, have recently implemented laws requiring app store owners like Apple and Google to verify the age of their users. Social media companies such as Meta, X (formerly Twitter), and Snap have argued that app store operators should be responsible for age verification. However, Apple and Google have expressed differing views on this issue.

Reform comes after controversy

Until now, private messages on Roblox were filtered to block profanity and other flagged content. According to Kaufman, age-screened access to unfiltered chats could allow teens and adults to continue using the platform for open communication rather than turning to external apps. Roblox will still monitor conversations for potential risks.

Users who have already verified their age using ID will not need to complete the facial analysis process. The company said the age-estimation software is “optional,” and teens can alternatively verify their age using a government-issued ID or, in the future, parental consent.

Roblox also introduced new tools focused on privacy and well-being, including daily time limits, online status controls, a “do not disturb” mode, and a summary of time spent on the platform. With teen permission, parents can link accounts to view activity, trusted connections, and spending insights.

The reform comes after a row, as an undercover investigation into the gaming platform exposed serious lapses in child safety. The investigation, reported by CCN.com, found that despite parental controls and public reassurances, children as young as five were being exposed to explicit content, interacting with adults, and bypassing weak verification systems.

The report, compiled by the digital behaviour research firm Revealing Reality and published in April this year, found that “despite the safety features in place, adults and children can easily interact in the same virtual spaces, with no effective age verification or separation.”

Roblox has already established itself as one of the largest video games in the world, averaging over 80 million players per day in 2024, with roughly 40% of them being under the age of 13. As part of their investigation, Revealing Reality researchers created multiple fake Roblox accounts registered as ages five, nine, ten, and thirteen, as well as over forty. They found that the account registered as a 10-year-old could “freely access highly suggestive environments.”

These included virtual hotels with private rooms where characters wore sexually suggestive outfits. Within these spaces, children could engage in conversations “that often strayed into adult themes.” A video posted by Revealing Reality showed a character encountering avatars making indecent noises and gestures. The 10-year-old account also entered a virtual dance club, where avatars were seen in scenarios unsuitable for children.

However, Kaufman, responding to the report, said, “At Roblox, trust and safety are at the core of everything we do. We continually evolve our policies, technologies, and moderation efforts to protect our community, especially young people. In 2024 alone, we added more than 40 new safety enhancements, and we remain fully committed to going further to make Roblox a safe and civil place for everyone.”

Another Roblox spokesperson told CCN that Revealing Reality’s investigation “omits important contextual facts that are essential to an accurate understanding of safety on our platform.”

However, this year, Florida Attorney General James Uthmeier issued a subpoena to Roblox regarding its marketing to children. He stated that his office had received numerous reports about children being exposed to “graphic or harmful material on the gaming platform, as well as predatory adults being able to message minors freely.”

“As a father and Attorney General, children’s safety and protection are a top priority. There are concerning reports that this gaming platform, which is popular among children, is exposing them to harmful content and bad actors. We are issuing a subpoena to Roblox to uncover how this platform is marketing to children and to see what policies they are implementing, if any, to avoid interactions with predators,” he stated.

Roblox is not the only platform facing such claims. Former Meta employee Kelly Stonelake has alleged that the tech giant knowingly allowed children under 13 access to its virtual reality platform, Horizon Worlds. Stonelake, who submitted her whistleblower statement through a complaint filed by the non-profit Fairplay, alleged that Meta permitted underage users to register using adult accounts, enabling the company to collect data on them without proper parental consent.

In response, Meta directed the media to its policy, under which parents are expected to manage accounts registered by children aged 10 to 12 who use the Quest headset to access Horizon Worlds. The same policy provides for users suspected of being underage to be reported, with their accounts removed if they are confirmed to belong to pre-teens.

Roblox has announced new safety features for its underage users and given parents more control over what their children can access. Even so, Damon De Ionno, Research Director at Revealing Reality, said, “The new safety features announced by Roblox don’t go far enough. Children can still chat with strangers who aren’t on their friends list. With six million experiences on the platform, many with inaccurate descriptions or ratings, how can parents realistically moderate what their kids are doing?”

Kaufman highlighted that Roblox boasts approximately 98 million users across 180 countries and is especially popular among teenagers, with over 60% of its users over the age of 13. However, the company has faced significant challenges regarding the safety of minors and issues related to predators on the platform.

According to a 2024 Bloomberg report, police have arrested at least two dozen people who’ve used Roblox as a platform for grooming, abuse, or abduction. Roblox has also been the subject of several lawsuits. These include a class-action lawsuit alleging the company harvests user data, including that of minors, and a federal lawsuit alleging a 13-year-old girl was exploited and sexually groomed via Roblox and Discord.

Credibility still an issue

Kaufman called Roblox “one of the safest places online for people to come together and spend time with their friends and their family.”

However, Kirra Pendergast, founder and CEO of Safe on Social, an online safety organisation operating worldwide, said that Roblox’s latest safety measures are largely opt-in, placing “responsibility on minors to identify and manage risks, something that contradicts everything we know about grooming dynamics.”

“Features like machine-learning age-estimation tools, for example, can incorrectly categorise users as older or younger, and the system assumes that in-person QR code scanning is inherently safe,” she said.

“Predators frequently use real-world grooming tactics. A QR scan doesn’t verify a trusted relationship. A predator could build trust online, then manipulate the child into scanning a QR code offline, thus validating a ‘Trusted Connection’ in Roblox’s system. Real protection would require guardian co-verification of any connections, not child-initiated permissions,” Pendergast added.

Furthermore, “Trusted Connections” applies only to chat, which leaves “large surface areas exposed, making it a brittle barrier at best.”

When asked how an in-person QR code keeps minors safe from real-world grooming tactics, Kaufman echoed a press briefing comment that there is no “silver bullet.”

He continued, “It’s many systems working together. Those systems begin with our policies, our community standards.”

“It’s our product, which does automated monitoring of things; it’s our partnerships; it’s people behind the scenes. So, we have a whole suite of things in place to keep people safe. It is not just a QR code, or it is not just age estimation, it’s all of these things acting in concert,” he added.

According to Kaufman, Roblox is “going farther” than other platforms by not allowing kids aged 13 to 17 to have unfiltered communication without going through Trusted Connections.

“We feel that we’re really setting the standard for the world in what it means to have safe, open communication for a teen audience,” he noted.

According to Roblox’s briefing, the updates are part of Roblox’s typical development process and haven’t been “influenced by any particular event” or feedback.

Commitment to online safety

Kaufman, while informing WIRED that the heightened scrutiny and discussion of the game so far didn’t have a dramatic impact on the company’s plans, remarked, “What we’re doing with this announcement is also trying to set the bar for what we think is appropriate for kids.”

Looking at technology like generative AI, he said, “The technology may have changed, but the principles are still the same. We also look at AI as a real opportunity to scale some of the things we do in safety, especially moderation. AI is central to that.”

He also said Roblox believes it’s important for parents and guardians to “build a dialogue” with their kids about online safety. “It’s about having discussions about where they’re spending time online, who their friends are, and what they’re doing. That discussion is probably the most important thing to do to keep people safe, and we’re investing in the tools to make that happen,” Kaufman said.

Kaufman, while adding that Roblox is aware that families have different expectations of what’s appropriate online behaviour, also noted, “If parents decide what’s appropriate for their kid, it may not match the decisions that we might make or I might make for my own family. But that’s OK, and we respect that difference.”

Dina Lamdany, who leads product for user settings and parental controls, said in that briefing that as teenagers experiment with their independence, “It’s really a moment where they need to learn the skills they need to be safe online. Teen users can grant dashboard access to their parents, which gives parents the ability to see who their child’s trusted connections are. We won’t be notifying parents proactively right now.”

Online safety, especially for minors, is an evolving problem in game spaces. Nintendo recently introduced GameChat with the Switch 2, a social feature that allows players to connect with friends without leaving the platform. For younger users, it relies heavily on parental controls, while adults are expected to be proactive about who they chat with.

So how does GameChat work? Users can communicate with friends and family while playing a game via a microphone built into the console itself, while a separate camera also allows for video chat.

On Nintendo’s official GameChat page, an instruction reads, “Mobile phone number registration required to use GameChat.” On the same page, the company states that, as an “additional security measure,” text message verification is required to set up GameChat.

It further states that children wanting to make the most of the Switch 2’s GameChat feature will need to complete text message verification using a number registered to a parent or guardian’s account.

The system’s privacy policy further warns that video and audio interactions between users may be monitored, which has led some to ask whether such monitoring amounts to surveillance. Kaufman, for his part, said that Roblox takes privacy seriously.

Pendergast, however, seemed unimpressed with Roblox’s claims, saying that if the online gaming portal wants to lead the way in safety, it has to take harder stances.

“It must stop relying on children and parents to manage their own protection and start building environments where trust is not optional but is instead engineered as safety by design. Age estimation, parental dashboards, and AI behavioural monitoring must be default, not optional, creating a baseline of systemic defence, not user-managed or user-guardian-managed risk. Otherwise, parents and children are left to do the heavy lifting, often without the digital and online safety literacy required,” she concluded.
