Explicit AI Images Bundle

UPDATED

JUL 25, 2024

2024-007-IG-UA, 2024-008-FB-UA

Today, April 16, 2024, the Oversight Board selected a case bundle concerning two pieces of content containing AI-generated explicit images, which were appealed by Facebook and Instagram users.

Meta took down both pieces of content for violating our policy on Bullying and Harassment, which prohibits “derogatory sexualized photoshops or drawings,” as laid out in the Facebook Community Standards and Instagram Community Guidelines. For one piece of content, Meta also determined that it violated our Adult Nudity and Sexual Activity policy, as laid out in our Facebook Community Standards.

We will implement the Board’s decision once it has finished deliberating and will update this post accordingly. Please see the Board’s website for the decision once it is issued.

Case decision

We welcome the Oversight Board’s decision today, July 25, 2024, on this case bundle. The Board overturned Meta’s original decision to leave up the content in the first case and upheld Meta’s decision to take down the content in the second case. Since Meta had already removed the content in both cases, we will take no further action on this bundle or the content.

When it is technically and operationally possible to do so, we will also take action on content that is identical to, and in the same context as, the content in the first case. For more information, please see our Newsroom post about how we implement the Board’s decisions.

After reviewing the recommendations provided by the Board, we will update this page.