From Majdal Shams to Rafah: How AI airbrushed Israel's genocide

Our use of AI-generated images to depict Israeli war crimes is normalising the genocide in Gaza and undermining our resolve for change, writes Sarah Amr.
29 Jul, 2024
In Gaza, AI-generated images both sanitise and distort what's happening on the ground, writes Sarah Amr [photo credit: Lucie Wimetz/The New Arab]

Our reliance on AI-generated images to depict, and distract from, Israel's assault on Gaza speaks volumes about our unwillingness to face the reality of genocide.

Over the weekend, Israel's official X account tweeted out an AI-generated image of a lush, impossibly green football pitch surrounded by a ghetto of one-storey, khaki-brown dwellings. On the grass lay a group of lifeless bodies. The image's caption: "All Eyes on Majdal Shams."

For the Israeli hasbara machine, the deaths of 12 Druze youths in the Israeli-occupied Golan Heights are nothing more than an opportunity for a PR stunt: a chance to capitalise on tragedy, sow division, and present Israel as a protector.

Thankfully, their propaganda was soon exposed: the far-right Israeli politician Bezalel Smotrich was thrown out of Majdal Shams by Druze elders, and the community quickly took to social media to remind us that they weren't the docile, pro-Israel community their occupiers hoped they'd be.

The "All Eyes on Majdal Shams" stunt is nonetheless revealing. It shows how low Israel will stoop, the extent to which AI has become normalised in the optics of warfare, and how compulsively Israel steals and co-opts Palestinian symbols. The pro-Israel "All Eyes on Majdal Shams" is a clear response to the pro-Palestinian "All Eyes on Rafah" graphic that went viral two months ago.

The problem with AI-generated images is that they sanitise and, in Israel's hands, distort what's happening on the ground. In Gaza, they make Palestinian suffering more palatable and shift our focus towards 'beating the algorithm' rather than mobilising for change.

For instance, when Israel bombed displaced Palestinian civilians sheltering in tents in Rafah on 26 May, killing 45 and injuring 246, the scenes were horrific. Footage of beheaded children and burned corpses flooded social media; survivors' testimonies were equally traumatising.

But Palestinian and pro-Palestinian artists turned to digital art and AI to soften these brutal scenes: beheaded babies were depicted with flowers replacing their heads, and AI-generated images like the viral "All Eyes on Rafah" were entirely devoid of both the suffering and Israel's genocidal intent.

More recently, AI-generated images were used to depict 70-year-old Dawlat al-Tanani being attacked by an Israeli army dog in her home in Jabalia refugee camp in northern Gaza, and 24-year-old Mohammed Bahar, a Palestinian man with Down syndrome who was left to die after being mauled by another army dog.

In Mohammed's case, no footage was available; the AI image merely signalled that he had Down syndrome, bearing no actual resemblance to him. In both cases, the images circulated widely, detached from the immediate and real horror of the events themselves.

By relying on beautified images, we distance ourselves from the truth and diminish the urgency of action. This form of collective gaslighting isn't new; I should know. 

In the West Bank, we've long told ourselves that we live normal lives, clinging to a false sense of security. During the Oslo years, questions like "What did the martyr do to die?" reflected a tendency to rationalise, or even blame, the victims of violence, allowing us to maintain the facade.

By creating and sharing AI-generated images that sanitise our reality, we are choosing an easier path — one that allows us to witness without fully engaging or empathising.

For over 70 years, the Palestinian people have endured the most awful forms of systematic Israeli violence. Children in Gaza carry their siblings' remains in school backpacks, mothers mourn their sons and daughters, and men have been handcuffed and buried alive.

Despite this, we often find ourselves helpless, seeking comfort in illusions.

To live and die under Israel's occupation, we're forced to deceive ourselves. This self-deception perpetuates an internal conflict where our survival instincts clash with our yearning for liberation. 

Our reliance on AI-generated images risks diluting the authenticity and credibility of the Palestinian story. It undermines the visceral impact of archival footage and of the citizen journalism that has sprung up in recent years.

We cannot allow this superficial trend to weaken our position against Israel's propaganda machine.

To support Palestine means constantly engaging with what's happening, however graphic it may be. Images, spectacles and the interpretation of signs are the new determinants of the political economy; we cannot allow AI-generated symbols to saturate our minds.

In Arabic, the word shaheed has a double meaning: to be both martyr and witness. In their last moments, 40,000 Palestinians in Gaza witnessed the horror of unrestrained Israeli violence. To stand with those lost, it's our responsibility to witness their troubled existence in its full form, with no filter or airbrushing.

Sarah Amr is a Palestinian writer interested in media discourse and liberation movements. She holds a Bachelor of Arts in Media and Communications from the University of Sussex.

Have questions or comments? Email us at: editorial-english@newarab.com

Opinions expressed in this article remain those of the author and do not necessarily represent those of The New Arab, its editorial board or staff.