Every day, people use Facebook to share their experiences, connect with friends and family, and build communities. It’s a service for more than 2 billion people to freely express themselves across countries and cultures and in dozens of languages.
Meta recognizes how important it is for Facebook to be a place where people feel empowered to communicate, and we take our role seriously in keeping abuse off the service. That’s why we developed standards for what is and isn’t allowed on Facebook.
These standards are based on feedback from people and the advice of experts in fields like technology, public safety and human rights. To ensure everyone’s voice is valued, we take great care to create standards that include different views and beliefs, especially from people and communities that might otherwise be overlooked or marginalized.
Please note that the US English version of the Community Standards reflects the most up-to-date set of policies and should be used as the primary document.
Our commitment to voice
The goal of our Community Standards is to create a place for expression and give people a voice. Meta wants people to be able to talk openly about the issues that matter to them, whether through written comments, photos, music, or other artistic mediums, even if some may disagree or find them objectionable. In some cases, we allow content—which would otherwise go against our standards—if it’s newsworthy and in the public interest. We do this only after weighing the public interest value against the risk of harm, and we look to international human rights standards to make these judgments. In other cases, we may remove content that uses ambiguous or implicit language when additional context allows us to reasonably understand that the content goes against our standards.
Our commitment to expression is paramount, but we recognize the internet creates new and increased opportunities for abuse. For these reasons, when we limit expression, we do it in service