
As sad as it may sound, there is often a real disconnect between certain companies, their workers, and sometimes even their customer base regarding what is actually “going on” at the company. That shouldn’t be surprising on some level, as the companies in question will often try to present themselves as “close to perfect.” The same applies to video games, as we’ve seen plenty of reports of companies offering less than stellar experiences. Even Roblox isn’t immune to this kind of thing.
In fact, many would point to Roblox as a place where “it may look nice, but there are a lot of bad things within,” and that point keeps getting brought up. You may recall the recent comments of the game’s CEO, in which he admitted that there were issues within the game but suggested that if parents didn’t want their kids to see them, they should “just not let them play the game.” That sparked backlash, as kids aren’t idiots and can get around parental controls and barriers if they’re clever enough, especially since they often know technology better than their parents do.
Now, though, things have gotten even worse for the gaming giant, as RevealingReality has published a new research paper that highlights something deeply disturbing … a troubling disconnect between Roblox’s child-friendly appearance and the reality of what children could potentially experience on the platform.
Key to this test was the team at RevealingReality creating all-new avatars that were “registered” as 13 or below and seeing how well the safeguards the dev team has been putting into the game over the last year held up. In short, they didn’t. The researchers found it incredibly easy not just to fool the system on their end, but to interact with adults even when the game’s “safety controls” were supposed to prevent that for underage players.
That was still only part of the “experience” they had. Using their avatars, they were able to access areas that only adult players should be able to reach due to their “visual content” and “suggestive natures,” and when using voice chat, which is supposed to be moderated by an AI system, they heard all manner of suggestive things from other adult players.
While the dev team has promised to make the game a “safe place” for all, this research shows the platform still has a long way to go to live up to that promise.