Roblox CEO Faces Criticism Over Handling of Child Predator Concerns
Published: November 25, 2025
Roblox Corporation has rolled out a new face-scanning feature aimed at strengthening child safety protections on its popular game creation platform. The technology, which estimates a user’s age through facial analysis, is intended to support age-gating and moderation efforts. Roblox CEO David Baszucki has been highly vocal about the initiative, offering a series of intense and, at times, controversial remarks during a recent podcast appearance.
Baszucki appeared on a November 21 episode of the Hard Fork podcast, where he spoke with journalists Casey Newton and Kevin Roose about the company’s evolving approach to safety. The conversation began with discussion of the new face-scanning system, expanded into debates over data use, and eventually escalated into disagreements about Roblox’s growing reliance on artificial intelligence for moderation.
When asked about the presence of predators on the platform, Baszucki framed the issue as both a challenge and an opportunity. He emphasized Roblox’s long-term vision of enabling young people to communicate, build, and socialize online, arguing that moderation has been a core concern since the company’s earliest days, when the small founding team handled moderation manually.
As Roblox has grown to massive scale, Baszucki explained, the company has increasingly turned to advanced technology to support safety systems. He highlighted AI-driven tools such as text filters, noting that users often attempt to bypass moderation through creative workarounds. According to Baszucki, these systems have improved significantly and are now better at detecting attempts to share personal identifying information.
The discussion grew more heated as the hosts pressed Baszucki on criticism and lawsuits alleging that Roblox has failed to adequately protect children. Baszucki strongly rejected claims that the platform is commonly used by predators to target kids, asserting that Roblox is innovating faster and more responsibly than many other online social platforms.
He went on to describe what he sees as the positive social impact of Roblox, stating that many children find meaningful communities on the platform, particularly those who feel isolated offline. At the same time, he acknowledged the responsibility of designing safety systems not just for highly engaged parents, but for all families, regardless of their level of technical involvement.
Tension rose further when the conversation turned to reports accusing Roblox of prioritizing profits over child safety. Baszucki dismissed these claims and pivoted to a broader defense of AI-driven moderation, arguing that automated systems reviewing every image and message may ultimately be more effective than human-only moderation at scale.
At one point, Baszucki’s tone shifted noticeably as he joked with the hosts and suggested they were secretly supportive of Roblox’s strategy. He repeatedly emphasized that he was not frustrated, but appeared visibly energized as the discussion continued.
The conversation took an unexpected turn when a hypothetical idea was raised about creating a prediction-style game within Roblox using in-game currency. While the hosts expressed skepticism, Baszucki described the concept as potentially brilliant if implemented in a legal and educational manner, stressing that it would not involve real-world gambling.
Ultimately, Baszucki remained firm in his belief that Roblox’s moderation tools are continuously improving. He pointed to the platform’s evolution from basic text filters to advanced facial age estimation as evidence of the company’s commitment to user safety.
Despite these assurances, Roblox has faced mounting scrutiny in recent months. Investigations and lawsuits from multiple regions have accused the company of failing to do enough to prevent child exploitation and grooming on the platform. At the same time, law enforcement agencies have warned more broadly that online spaces are increasingly being used to manipulate and exploit young users.
As debates around online safety intensify, Roblox’s growing reliance on AI-driven moderation places the company at the center of a larger conversation about technology, responsibility, and the risks of massive digital communities built around children.