Adobe is marketing AI-generated images of the Israel-Hamas conflict, and some websites are using them without indicating that they are not real.


  • Adobe is selling AI-generated images that depict the Israel-Hamas war in varying degrees of realism.
  • One of the images has been shared online without a clear indication that it's fake.

Adobe is selling AI-generated images depicting the Israel-Hamas war. While some are obviously computer generated, others are more realistic, including one image that has been shared by smaller websites and in social media posts, drawing concerns that the AI-generated content could contribute to misinformation, Australian outlet Crikey and Vice's Motherboard reported.

A search for “Israel Hamas war” on Adobe Stock reveals a slew of images showing war-torn streets, explosions, soldiers, tanks, buildings on fire, and children standing in rubble.

Adobe Stock, which sells images submitted by individual artists, requires that all AI-generated images on the platform be labeled as such. But some of the images for sale are marked as AI-generated only in the fine print, not in their titles.


“Large explosion illuminating the skyline in Palestine,” the title of one AI-generated image reads. “Buildings destroyed war in the Gaza Strip in Israel,” reads another. An image showing a woman in distress is titled “Wounded Israeli woman clings to military man, begging for help.”

Image: One AI-generated image is titled “Wounded Israeli woman clings to military man, begging for help.” (Adobe)

Some of the image titles do mention AI. One AI-generated image, titled “conflict between israel and palestine generative ai,” looks similar to actual images from the war and was shared online.

Image: “conflict between israel and palestine generative ai” looks very similar to actual pictures from the conflict. (Adobe)

A Google reverse-image search for the image reveals instances where it's been used across the internet in posts, videos, and on social media alongside the original Adobe link. The search also surfaces other very similar, presumably real images from the conflict. It's unclear whether those who used the AI-generated image on their websites or in their social media posts were aware that it isn't a real photo.

Image: A Google reverse-image search of the AI-generated image shows several instances where it appears across the internet, alongside very similar, presumably real images of the conflict. (Adobe, Google)

“These specific images were labeled as generative AI when they were both submitted and made available for license in line with these requirements,” an Adobe spokesperson said in a statement to Insider. “We believe it’s important for customers to know what Adobe Stock images were created using generative AI tools.”

“Adobe is committed to fighting misinformation, and via the Content Authenticity Initiative, we are working with publishers, camera manufacturers and other stakeholders to advance the adoption of Content Credentials, including in our own products,” the statement continued. “Content Credentials allows people to see vital context about how a piece of digital content was captured, created or edited including whether AI tools were used in the creation or editing of the digital content.”
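For readers who want to check an image themselves, the sketch below shows one way Content Credentials could be inspected programmatically. It is only an illustration under stated assumptions: it requires the open-source c2patool command-line tool from the Content Authenticity Initiative to be installed, the filename is a placeholder, and the generative-AI marker it scans for is a common manifest convention rather than a guaranteed field.

# Minimal sketch: inspect an image's Content Credentials (C2PA manifest)
# with the open-source c2patool CLI. Assumes c2patool is installed and on
# the PATH; the filename and the marker scanned for below are illustrative
# assumptions, not a definitive verification method.
import json
import subprocess

def read_content_credentials(image_path):
    """Return the image's C2PA manifest store as a dict, or None if
    c2patool reports no embedded credentials."""
    result = subprocess.run(["c2patool", image_path],
                            capture_output=True, text=True)
    if result.returncode != 0:
        return None
    return json.loads(result.stdout)

manifest = read_content_credentials("conflict_image.jpg")  # hypothetical file
if manifest is None:
    print("No Content Credentials found in this image.")
else:
    # Crude check: look for the IPTC digital-source-type value that C2PA
    # manifests commonly use to flag AI-generated media.
    flattened = json.dumps(manifest)
    if "trainedAlgorithmicMedia" in flattened:
        print("Manifest indicates generative AI was used.")
    else:
        print("Credentials present, but no explicit generative-AI marker found.")

Even when credentials are present, their absence or removal proves nothing on its own; the metadata can be stripped when an image is re-saved or screenshotted.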

Misinformation and disinformation about the Israel-Hamas war are already rampant online. Misleading content, old videos and photos from other conflicts in other parts of the world, and even video game footage have been presented as real scenes from the still-unfolding conflict.

And as AI-generated content becomes both more realistic and more widespread, it can be difficult to discern whether an image is real or not, especially when even AI image detectors can be easy to fool.

And it's much harder to police the labeling of an AI-generated image after it's been downloaded by a user. After all, there's no guarantee someone will clearly note the image's origin or that it was made with AI.

Henry Ajder, an AI expert who is on the European advisory council for Meta’s Reality Labs, previously shared a few tips with Insider to help distinguish AI-generated images from real ones. AI images can often look “plasticky” or overly stylized, and might have aesthetic inconsistencies in their lighting, shapes, or other details.

Ajder also suggests asking questions when an image seems a little too sensationalized, like “Who’s shared it?” “Where has it been shared?” and “Can you cross-reference it?” He also recommends doing a reverse-image search to find an image’s context.
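As a rough illustration of that last tip, the sketch below approximates a reverse-image search from code using the Google Cloud Vision API's web detection feature. It assumes a configured Google Cloud project with application-default credentials and the google-cloud-vision client library installed; the filename is a placeholder, and the results are only a starting point for the kind of cross-referencing Ajder describes, not a verdict on authenticity.

# Minimal sketch: approximate a reverse-image search with the Google Cloud
# Vision API's web detection feature. Assumes google-cloud-vision is
# installed and application-default credentials are configured; the
# filename below is a placeholder.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("suspect_image.jpg", "rb") as f:  # hypothetical local file
    image = vision.Image(content=f.read())

web = client.web_detection(image=image).web_detection

# Pages that embed an exact or near-exact copy of the image.
for page in web.pages_with_matching_images:
    print("Appears on:", page.url)

# Visually similar images, useful for spotting the real photos an
# AI-generated picture may be imitating.
for match in web.visually_similar_images:
    print("Similar image:", match.url)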
