2025-015-IG-MR, 2025-016-IG-MR, 2025-017-IG-MR
Today, February 13, 2025, the Oversight Board selected a case bundle referred by Meta regarding three pieces of content posted to Instagram, all involving symbols that are often used by hate groups but can also have other uses.
The first piece of content was an image of a woman with part of her face covered by a scarf. The words “Slavic Army” and a kolovrat symbol, a type of swastika used both by neo-Nazis and by some pagans without apparent extremist intent, were superimposed on the scarf. The image was accompanied by a caption expressing the user’s pride in being Slavic and stating that the kolovrat is a symbol of faith, war, peace, hate, and love.
The second piece of content was a carousel of images depicting a woman in various poses wearing an Iron Cross necklace and a t-shirt printed with an AK-47 assault rifle and the words “Defend Europe.” The Fraktur typeface on the t-shirt and the Odal (or Othala) rune in the caption, a symbol from the runic alphabet used in Europe prior to its replacement by the Latin alphabet, are both associated with Nazis and neo-Nazis. The caption also contained the hashtag #DefendEurope, a slogan used by white supremacists and other extremist organizations opposed to immigration.
The third piece of content was also a carousel of images: drawings of an Odal rune wrapped around a sword, accompanied by a quote about blood and fate from a German author and soldier who fought in the First and Second World Wars. The caption shared a selective early history of the rune, omitting its Nazi and neo-Nazi appropriation, and concluded that the rune is about “heritage, homeland, and family.” The caption also stated that prints of the image are for sale.
Meta determined that the first two pieces of content violated our Dangerous Organizations and Individuals policy, as laid out in the Instagram Community Guidelines and Facebook Community Standards. Meta determined that the third piece of content did not violate our policies and left the content up.
Meta referred this case to the Board because we found it significant and difficult, as it creates tension between our values of safety and voice.
While these symbols and others like them may be used to promote dangerous organizations and individuals, to identify members of these groups, or to show support for these groups’ objectives, prohibiting them entirely could limit discussions of history, linguistics, and art.
We will implement the Board’s decision once it has finished deliberating, and we will update this post accordingly. Please see the Board’s website for the decision once it is issued.