2024-002-FB-UA, 2024-003-FB-UA
Today, February 8, 2024, the Oversight Board selected a case bundle appealed by Facebook users concerning two posts regarding the electoral process in Australia during the 2023 Australian Indigenous Voice referendum.
The first piece of content contains a screenshot of tweets from the Australian Electoral Commission which discuss the issue of individuals voting more than once. The accompanying caption stated, “So it is official. Go out, vote early, vote often, and vote NO.” The second piece of content shared a screenshot of one of the same tweets from the Australian Electoral Commission with text overlay that stated “[t]hey are setting us up for a ‘Rigging’.. smash the voting centres people it’s a NO, NO, NO, NO, NO.”
Meta removed both pieces of content for violating our policy on Coordinating Harm and Promoting Crime, as laid out in the Facebook Community Standards. In the second piece of content, the phrase “smash the voting centres” violated that policy because it advocates flooding the election with duplicate votes. The same phrase can also be read as a call to physically destroy voting center buildings, which violates our policy on Violence and Incitement, as laid out in the Facebook Community Standards.
In accordance with our policy on Coordinating Harm and Promoting Crime, Meta prohibits “facilitating, organizing, or admitting to certain criminal or harmful activities,” including “advocating, providing instructions for, or demonstrating explicit intent to illegally participate in the voting process.” Additionally, Meta does not allow threats of violence against a place if they could “lead to death or serious injury of any person that could be present at the targeted place.”
We will implement the Board’s decision once it has finished deliberating and will update this post accordingly. Please see the Board’s website for the decision once it is issued.
We welcome the Oversight Board’s decision today, May 9, 2024, on this case bundle. The Board upheld Meta's decision to remove both pieces of content from Facebook.
Once we have reviewed the recommendation provided by the Board, we will update this post with our initial response to it.
To ensure users are fully informed about the types of content prohibited under the “Voter and/or census fraud” section of the Coordinating Harm and Promoting Crime Community Standard, Meta should incorporate its definition of the term “illegal voting” into the public-facing language of the policy prohibiting: “advocating, providing instructions for, or demonstrating explicit intent to illegally participate in a voting or census process, except if shared in a condemning, awareness raising, news reporting, or humorous or satirical contexts.”
The Board will consider this recommendation implemented when Meta updates its public-facing Coordinating Harm and Promoting Crime Community Standard to reflect the change.
Our commitment: We will incorporate our definition of “illegal voting” into the Coordinating Harm and Promoting Crime Community Standard. We will also provide additional detail about how we assess content calling for or encouraging illegal participation in a census.
Considerations: As the Board outlines in its decision, our Coordinating Harm and Promoting Crime policy does not allow content calling for illegal participation in a voting process. While we note this in the Community Standards, we do not currently include additional detail about how “illegal voting” is defined.
In the coming months, we will update our external Community Standards with examples to clarify and define what we may consider “illegal voting” content. We also intend to share details about what constitutes advocating, providing instructions for, or demonstrating explicit intent to illegally participate in a census process.
We will provide an update on this recommendation in the next public Oversight Board Report.