Meta Oversight Board launches 'expedited review' of posts showing Gaza conflict scenes

Meta said Thursday its oversight board is using expedited review for the first time to decide two Israel-Hamas war content cases on Instagram and Facebook. The graphic content was removed, then restored. The board will make the final decision within 30 days on whether the content is allowed to remain. Photo by Terry Schmitt/UPI

Dec. 7 (UPI) -- Meta's Oversight Board said Thursday it is reviewing a pair of posts on Instagram and Facebook depicting scenes from the Israel-Hamas war that were removed and later restored.

Both posts violated rules on content allowed on Meta's platforms but were later restored with screens warning of the graphic nature of the content as the company revised its policy guidance in response to the conflict in Gaza.

Meta said it was implementing its expedited review process for the first time, meaning decisions in both cases will be published within 30 days.

"The Oversight Board's bylaws provide for expedited review in 'exceptional circumstances, including when content could result in urgent real-world consequences,'" Meta said in a statement. "The Board believes that the situation in Israel and Gaza reaches this threshold. This will be the first time the Board has used the expedited review process."

The first case arises from a user appealing Meta's removal of content from Instagram that showed what appears to be the aftermath of a military strike on Al-Shifa Hospital in Gaza City.

The November post included a video depicting injured and dead people including children lying on the ground, some crying.

A caption in both English and Arabic below the video said the hospital had been targeted by the "usurping occupation," a reference to the Israeli army, and tagged human rights and news organizations.

Meta removed the post citing a violation of the company's violent and graphic content policy, but reversed the original decision after deciding to take up the case. The content was restored with a "mark as disturbing" warning.

The second case was a user appeal to restore a Facebook video showing a woman on a motorbike begging abductors not to kill her as she is taken hostage.

A man is shown being led away by captors. A caption said the captors were Hamas and urged people to watch the video to gain "a deeper understanding" of what Israel experienced after the Oct. 7 attacks.

Meta removed that content under its Dangerous Organizations and Individuals policy. Meta has designated Hamas as a dangerous organization and Oct. 7 as a terrorist attack.

"Meta's Dangerous Organizations and Individuals policy categorically prohibits third-party imagery depicting the moment of designated terror attacks on visible victims," Meta said in a statement. "However, in the weeks following the October attacks, Meta revised its policy guidance in response to trends in how hostage kidnapping videos were being shared and reported on. This resulted in Meta reversing its original decision in this case, restoring the content with a warning screen."

Both cases will be reviewed by Meta's Oversight Board, which will deliberate and decide whether the content in question should be allowed. The decision will be binding on the company.

According to Meta, the board operates independently, evaluating the company's decisions to remove or leave up Facebook and Instagram content.