I regularly work with Reuters Fact Check (including Reuters Arabic) to assess images and videos that circulate online. Most of these cases involve synthetic media framed as real news. Below is a selection of recent work from 2025.
Convoy image heading to Rafah Crossing

https://www.reuters.com/fact-check/arabic/ICIT4BGBBBPKDMMUVCUOPZLFIU-2025-06-21/
I reviewed an image said to show the “Convoy of Resilience” en route to the Rafah Crossing. Repeated clothing patterns, distorted flags and irregular lighting pointed to diffusion-model generation. The convoy organisers confirmed the image was fabricated.
Giant anaconda in the Amazon River

https://www.reuters.com/ar/fact-check/VTFNSJV2DBKTHB4GURXO5JH72A-2025-06-03/
I analysed a viral clip claiming to show a massive anaconda surfacing beside a boat. At one point the snake appeared with two heads, floating objects lacked weight, and the camera view did not match the on-screen overlay. These details confirmed the video was synthetic.
Gaza newborn video

https://www.reuters.com/fact-check/arabic/MLLYWEB7SBLJFF6IJTH63EJZNU-2025-05-25/
I assessed a video that appeared to show a newborn baby from Gaza smiling and raising a victory sign. The child’s fingers changed in number and length between frames and the facial movements were unnaturally smooth. These are known markers of generated video.
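Cues like fingers changing between frames or unnaturally smooth motion come from frame-by-frame inspection. A minimal, illustrative sketch of the underlying idea (comparing successive frames, not the actual tools used in these checks; the threshold below is an arbitrary assumption):

```python
# Toy frame-by-frame consistency check. Real verification relies on
# manual inspection and specialised tooling; this only illustrates
# the idea of measuring change between successive frames.
import numpy as np

def mean_frame_differences(frames):
    """Mean absolute pixel difference between each pair of successive frames."""
    return [float(np.abs(b.astype(int) - a.astype(int)).mean())
            for a, b in zip(frames, frames[1:])]

def flag_unnaturally_smooth(frames, threshold=1.0):
    """Flag clips whose inter-frame change is suspiciously uniform.

    Generated video often shows very low variance in frame-to-frame
    motion; the threshold here is an illustrative assumption.
    """
    diffs = mean_frame_differences(frames)
    return float(np.var(diffs)) < threshold

# Toy data: three identical frames -> zero, perfectly uniform differences.
rng = np.random.default_rng(0)
base = rng.integers(0, 256, size=(8, 8), dtype=np.uint8)
frames = [base, base.copy(), base.copy()]
print(flag_unnaturally_smooth(frames))  # True: no motion variance at all
```

In practice such a heuristic only narrows down which frames to examine by hand; it cannot by itself prove a clip is generated.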
Cow tied to a missile

https://www.reuters.com/fact-check/arabic/BTC72XP3VZOIHD4SQIYRWSZVV4-2025-05-29/
I reviewed an image that claimed to show a Pakistani missile with a cow tied to it. Texture repetition, inconsistent shadows and distorted insignia pointed to generative image synthesis rather than a real scene.
Deepfake of Avichay Adraee

https://www.reuters.com/fact-check/arabic/535CSPX6GRIOPF6FJGWGUXUQVE-2025-03-11/
I fact-checked a video presented as Avichay Adraee mocking a Syrian official. The lip movements, cloned voice and repeated facial frames all matched common deepfake artefacts.
New York tunnel flooding video

https://www.reuters.com/fact-check/arabic/YU5T5BJW2RNODBZ62XPO2WLC7A-2025-08-07/
I examined a video said to show cars sinking in a New York tunnel during July floods. Repeating car designs, twisted metal rails and people who appeared to merge into the scene all signalled synthetic generation. The clip originated from an account known for AI videos.
Collapsing building during US floods

https://www.reuters.com/fact-check/arabic/76BEGAR3ZNPQVERTGZKPHZ3HQY-2025-08-25/
I looked into a video that seemed to show a building rocking and falling into a river. The motion lacked physical realism, the windows stayed perfectly intact and the riverbank warped between frames. These are common traits of AI-generated disaster footage.
Aquarium tank collapse

https://www.reuters.com/fact-check/arabic/TESY7IW25NNL3MQM35SFOEXPZI-2025-08-26/
I analysed a shopping-centre video that showed a huge aquarium bursting open. Escalators stayed perfectly still while water moved around them, the glass dissolved into liquid and several people blinked in and out of the scene. These cues showed the clip was synthetic.
Qatar vs UAE football match image

https://www.reuters.com/ar/fact-check/WRYD5EZA7RIPNAOHQYOA2ZXKPQ-2025-10-30/
I reviewed an image that claimed to show phones and car keys thrown onto the pitch after a football match. The shadows did not match, several phones appeared where shoes were visible in the original photo, and the watermark had been edited out. The image was manipulated.
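Spliced edits such as a removed watermark can sometimes be surfaced with error level analysis (ELA): recompress the image as JPEG and look at the residual, since regions edited after the original save often recompress differently. A minimal sketch of this general forensic heuristic, assuming Pillow and NumPy are available; it is not the specific method used in the Reuters check:

```python
# Illustrative error level analysis (ELA) sketch.
import io

import numpy as np
from PIL import Image

def error_level(image, quality=90):
    """Recompress an image as JPEG and return the per-pixel residual.

    Areas pasted in after the original JPEG save tend to produce a
    visibly different residual from the untouched background.
    """
    buf = io.BytesIO()
    image.convert("RGB").save(buf, format="JPEG", quality=quality)
    recompressed = Image.open(buf)
    a = np.asarray(image.convert("RGB"), dtype=int)
    b = np.asarray(recompressed, dtype=int)
    return np.abs(a - b)

# Toy example: a flat grey image recompresses almost losslessly,
# so its residual is near zero everywhere.
img = Image.new("RGB", (64, 64), (128, 128, 128))
residual = error_level(img)
print(residual.shape)  # (64, 64, 3)
```

ELA is only a pointer to suspicious regions; genuine JPEG artefacts, resizing and re-saves all affect the residual, so findings still need corroboration.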
Massive protest in Japan

https://www.reuters.com/fact-check/video-does-not-show-massive-pro-palestinian-protest-japan-2025-08-15/
I checked a video that claimed to show a huge pro-Palestinian protest in Japan. Flag motion, crowd behaviour and banner text all showed irregularities consistent with AI-generated footage.
