Building a Safe and Civil Community

August 19, 2020

by David Baszucki


Community

Dear Roblox Community –

Roblox was designed for kids and teens, and we have a responsibility to make sure our players can learn, create, and play safely. Safety continues to be our top priority, and that will never change. It is not a retrofit; it has been in our DNA since day one.

We have come a long way together since the very early days when Erik, Matt, John, and I built the first version of Roblox’s moderation system, and we had at most 40 players on Roblox at any one time. We recognized very early on that building a safe and civil community was essential to our vision of connecting the world through play. Starting in those early days, we each spent many shifts acting as the first moderators on Roblox, and this first-hand experience stays with us today.

We have no tolerance for content or behavior that violates our rules, and we work tirelessly to create a safe, civil, and diverse community. That’s why we have a stringent safety system – one of the most rigorous of any platform, going well beyond regulatory requirements:

  • We have a team of 1,600 people protecting our users and monitoring for inappropriate content 24/7, using a combination of machine scanning and human moderation. We take swift action (typically within minutes) to address any content or any developer that violates our terms of use.
  • We filter all text chat on the platform to block inappropriate content, including questions about personal information and instructions on how to connect on other, less restrictive third-party chat apps.
  • We conduct a safety review of all images, audio, and video files, through a combination of human review and machine detection, before they become available on our platform.
  • We work closely and transparently with regulators, authorities, and safety groups in every country where we operate, and we promptly report any suspected child exploitation, abuse materials, or online grooming to the relevant authorities, such as the National Crime Agency and Child Exploitation and Online Protection Command in the UK, and the National Center for Missing & Exploited Children (NCMEC) in the United States.
  • We have partnerships with over 20 leading global organizations that focus on child safety and internet safety including the WePROTECT Global Alliance, the Internet Watch Foundation (IWF), the UK Safer Internet Centre, Fair Play Alliance, Family Online Safety Institute (FOSI), Connect Safely, and kidSAFE among others.
  • We are a member of industry organizations, such as UKIE and The Technology Coalition, whose goals include cross-industry collaboration and the exchange of knowledge and technology in the areas of user safety and child safety. For example, we worked with Microsoft on a cross-industry grooming text filter project to provide better tools for detecting grooming in chat.
  • As a member of the Technology Coalition, we are committed to the Voluntary Principles, including transparency on our efforts to combat online child sexual exploitation and abuse.
  • We work diligently with other chat, social media, and UGC (User Generated Content) platforms to report bad actors and content, so they can also take appropriate action on their platforms.

We recognize there will always be individuals who deliberately try to break our rules, and we continue to dedicate time and resources to finding new ways to stop them. Our goal is for these continued efforts to make things increasingly difficult for this extremely small subset of users. That’s why, in addition to continuing to invest in our extensive safety measures, we’ve been actively working to:

  • Continue to expand on our tools and reputation framework to better support the overwhelming majority of our players and creators who are building a positive community, while making it increasingly difficult for bad actors on our platform.
  • Deploy additional advanced detection algorithms designed to block problematic content from being published and to flag other suspect content for rapid review.

Our ongoing quest is to build the best content moderation system possible to protect our community. Ultimately, it’s our responsibility to keep our players safe and to support the creativity of our developers. We have a stringent safety system and strong policies, but we recognize our work is never done. We continuously evolve our defenses to combat the bad actors who attempt to undermine our efforts to connect millions of people from around the world to learn, play, work, and create together.

We believe platforms like Roblox are key in helping people learn and practice civility, and we will not waver in our mission to build a safe, civil, and diverse community.

David Baszucki

CEO and Co-Founder