
Building an Inclusive Community

November 18, 2020

by Roblox



At Roblox, we continually improve our systems, policies, and moderation efforts to prevent, detect, and block content or behavior that violates our community guidelines and terms of use. We believe it’s our responsibility to keep our players and creators safe, which is why we are transparent about the actions we are taking to foster positive experiences on our platform. Here’s some of the work we’re currently doing:

  • We honor diversity, and we’ve further strengthened and expanded our Community Rules against discriminatory speech, content, or actions, including content used to condone or encourage inflammatory actions, hate speech, or any other form of discrimination.
  • We are auditing and enhancing our proprietary text filtering technology and customized rules on what to block, particularly with regard to underrepresented and historically marginalized groups.
  • We are improving our safety systems to better detect context, so that we support community members who identify as part of underrepresented and historically marginalized groups, while further strengthening our algorithms and filters to block terms used in discriminatory or harmful ways.
  • In consultation with experts, academics, and safety partners from across the globe, we continually evaluate our moderation policies and audit our platform to protect our community against emerging extremist or discriminatory terms, memes, and symbols.
  • Roblox is committed to the Five Pillars of Action, which include transparency in annual reporting (Pillar 5). As an active member of the Tech Coalition, we will be contributing to this key area of focus in the coming year.
  • We are expanding our tools and reputation framework to make it even more difficult to publish problematic content, and improving our algorithms to limit the discoverability of questionable content so we can take swift action.
  • Further, we’re undertaking a thorough review to ensure that our current stringent safety systems also apply to content that predates the significant advancements we’ve implemented over the last few years.

Our ongoing quest is to build the best content moderation system possible to protect our community. This is why we have dedicated team members across Product, Engineering, and other key areas whose sole focus is to design and improve the safety features on our platform. We have a Trust & Safety team of over 1,700 people protecting our users and monitoring for safety 24/7, using a combination of machine scanning and human review to detect and take action on inappropriate content or behavior. We will continually evolve our preventative measures, strengthen our policy-driven enforcement, and share regular updates as we introduce changes.

We believe platforms like Roblox are key in helping people learn and practice civility, and we will not waver in our mission to build an inclusive community that fosters positive relationships between people around the world.