Case numbers: 2023-30-FB-UA, 2023-31-FB-UA
Today, the Oversight Board selected a case bundle, appealed by multiple users, concerning two separate Facebook posts about political content in the context of Greek elections.
The first piece of content is a photo of a campaign leaflet for a candidate for Parliament associated with the Spartans party. The leaflet states that “Ilias Kasidiaris supports the party Spartans.” The second piece of content is a photo of the National Party – Greeks’ logo, which includes text that translates to “Spartans.”
Meta took down both pieces of content for violating our policy on Dangerous Organizations and Individuals, as laid out in the Facebook Community Standards.
In accordance with this policy, Meta removes content that praises a designated entity. Meta also does not allow symbols that represent dangerous organizations or individuals unless they are posted with the explicit intention of neutrally discussing or condemning those organizations or individuals or their activities.
We will implement the board’s decision once it has finished deliberating, and we will update this post accordingly. Please see the board’s website for the decision once it is issued.
We welcome the Oversight Board’s response today, March 28, 2024, on this case. The Board upheld Meta's decision to remove both pieces of content from Facebook.
After reviewing the recommendation provided by the Board, we will update this post with our initial response to that recommendation.
To provide greater clarity to users, Meta should clarify the scope of the policy exception under the Dangerous Organizations and Individuals Community Standard, which allows for content “reporting on, neutrally discussing or condemning dangerous organizations and individuals or their activities” to be shared in the context of “social and political discourse.” Specifically, Meta should clarify how this policy exception relates to election-related content. The Board will consider this implemented when Meta makes this clarification change in its Community Standard.
Our commitment: We will update our Dangerous Organizations and Individuals Community Standards to clarify our updated approach to content related to social and political discourse, including examples. We will clarify in our external policy that this updated approach includes allowing discussion related to elections when the content is otherwise non-violating.
Considerations: In December 2023, we introduced a number of updates to our Dangerous Organizations and Individuals policy, both internally and in our Community Standards, as a result of ongoing policy development. As part of that work, we updated our definitions of “Glorification,” “Support,” and “Representation.” The update also included adopting an allowance for social and political discourse about DOIs, as we recognized that our former approach may have captured some content that did not actually praise, represent, or materially support a DOI. We apply this allowance in certain contexts, such as elections, peace agreements, human rights-related issues, news reporting, and academic, neutral and condemning discussion. Alongside it, we made enforcement changes to ensure users are not unduly penalized for sharing content that discusses a DOI, for example in an election, without intending to glorify, represent, or materially support that entity.
As we noted in our announcement about these changes, this allowance was prompted by discussion with and feedback from internal and external stakeholders. While we have made this update internally, we will share further details about our social and political discourse allowance, including examples related to elections, in the Community Standards.
In response to this recommendation, we will add these updates to our Dangerous Organizations and Individuals policy in our Community Standards to more clearly explain when our policy allows elections-related discussion.