#FactCheck: Viral AI-generated image shown as Air India Flight AI-171 catching fire after collision
Executive Summary:
A dramatic image circulating online, showing a Boeing 787 of Air India engulfed in flames after crashing into a building in Ahmedabad, is not a genuine photograph from the incident. Our research has confirmed it was created using artificial intelligence.

Claim:
Social media posts and forwarded messages allege that the image shows the actual crash of Air India Flight AI‑171 near Ahmedabad airport on June 12, 2025.

Fact Check:
To verify the authenticity of the viral image, we conducted a reverse image search and analyzed it with AI-detection tools such as Hive Moderation. The image showed clear signs of manipulation, including distorted details and inconsistent lighting, and Hive Moderation flagged it as “Likely AI-generated”, confirming it was synthetically created rather than a real photograph.

In contrast, verified visuals and information about the Air India Flight AI-171 crash have been published by credible news agencies such as The Indian Express and Hindustan Times and confirmed by aviation authorities. Authentic reports include on-ground video footage and official statements, none of which feature the viral image. This confirms that the circulating photo is unrelated to the actual incident.

Conclusion:
The viral photograph is a fabrication, created by AI, not a real depiction of the Ahmedabad crash. It does not represent factual visuals from the tragedy. It’s essential to rely on verified images from credible news agencies and official investigation reports when discussing such sensitive events.
- Claim: An Air India Boeing aircraft crashed into a building near Ahmedabad airport
- Claimed On: Social Media
- Fact Check: False and Misleading

Executive Summary:
A video clip circulating on social media allegedly shows the Hon’ble President of India, Smt. Droupadi Murmu, the TV anchor Anjana Om Kashyap, and the Hon’ble Chief Minister of Uttar Pradesh, Shri Yogi Adityanath, promoting a medicine for diabetes. A thorough investigation by the CyberPeace Research Team found the claim to be untrue. The video was digitally edited: original footage of these public figures was altered to falsely suggest their endorsement of the medication. Discrepancies in the lip movements and in the context of the clips indicated AI manipulation, and the individuals featured were in fact discussing unrelated topics in their original footage. The claim that the video shows them endorsing a diabetes drug is therefore debunked: the video is an AI creation and does not reflect any genuine promotion, a finding further supported by AI voice-detection tools.

Claims:
A video making the rounds on social media purports to show the Hon'ble President of India, Smt. Droupadi Murmu, TV anchor Anjana Om Kashyap, and Hon'ble Chief Minister of Uttar Pradesh Shri Yogi Adityanath endorsing a diabetes medicine.

Fact Check:
Upon receiving the post, we watched the video carefully and found discrepancies between the lip movements and the words we could hear. The voice attributed to the Chief Minister of Uttar Pradesh, Shri Yogi Adityanath, also sounded suspicious, a clear sign of fabrication. In the video, the Hon'ble President of India, Smt. Droupadi Murmu, appears to endorse a medicine that cured her diabetes. We then divided the video into keyframes and reverse-searched one of the frames, which led us to a video uploaded by Aaj Tak on its official YouTube channel.

We found footage closely matching the viral video, with an on-screen courtesy credit reading Sansad TV. Taking a cue from this, we ran keyword searches and found another video uploaded by the official Sansad TV YouTube channel. That video contains no mention of any diabetes medicine; it is actually the swearing-in ceremony of the Hon’ble President of India, Smt. Droupadi Murmu.

In the second part of the viral video, a man introduced as Dr. Abhinash Mishra allegedly invented the medicine that cures diabetes. We reverse-searched his image and landed on a CNBC news page where the same face was identified as Dr. Atul Gawande, a professor at the Harvard School of Public Health. We watched that video and found no sign of him endorsing or discussing any diabetes medicine he had invented.

We also extracted the audio from the viral video and analyzed it using the AI audio detection tool from Eleven Labs, which found a 98% probability that the audio was created with an AI voice-generation tool.
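The audio-extraction step can be sketched as follows. This is a minimal illustration assuming the ffmpeg command-line tool is installed (the file names are placeholders, not details from the investigation); it strips the video stream and re-encodes the audio to a WAV file that can be uploaded to a voice-analysis service:

```python
import shutil
import subprocess

def ffmpeg_extract_cmd(video_path, out_wav):
    # Build the ffmpeg invocation: -vn drops the video stream, and the
    # audio is re-encoded to 16 kHz mono WAV, a common input format for
    # voice-analysis tools.
    return ["ffmpeg", "-y", "-i", video_path,
            "-vn", "-ac", "1", "-ar", "16000", out_wav]

def extract_audio(video_path, out_wav="audio.wav"):
    """Extract the audio track of a video into a mono 16 kHz WAV file."""
    if shutil.which("ffmpeg") is None:
        raise RuntimeError("ffmpeg binary not found on PATH")
    subprocess.run(ffmpeg_extract_cmd(video_path, out_wav),
                   check=True, capture_output=True)
    return out_wav

# Hypothetical usage: extract_audio("viral_clip.mp4")
```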

Hence, the claim made in the viral video is false and misleading. The video was digitally edited from different clips, and the audio was generated with an AI voice-creation tool to mislead netizens. It is worth noting that we have previously debunked similar voice-altered videos carrying bogus claims.
Conclusion:
In conclusion, the viral video claiming that the Hon'ble President of India, Smt. Droupadi Murmu, and the Chief Minister of Uttar Pradesh, Shri Yogi Adityanath, promoted a diabetes medicine is false. A thorough investigation found that the video was digitally stitched together from different clips: the footage of the Hon'ble President was taken from the oath-taking ceremony of the 15th President of India, and the supposed doctor "Abhinash Mishra" appears in a CNBC News video, where he is identified as Dr. Atul Gawande, a professor at the Harvard School of Public Health. Online users must be careful when receiving such posts and should verify them before sharing with others.
Claim: A video is being circulated on social media claiming to show distinguished individuals promoting a particular medicine for diabetes treatment.
Claimed on: Facebook
Fact Check: Fake & Misleading

Executive Summary:
An image of the April 8 solar eclipse circulating on social media is not a real photograph of the astronomical event. Despite claims of its authenticity, CyberPeace's analysis showed that the image was produced with AI image-generation algorithms. The total solar eclipse on April 8 was observable only from locations in North America that lay in the path of totality, with partial visibility possible elsewhere. NASA provided a live broadcast of the eclipse for people outside the path of totality. The spread of false information about rare celestial events underscores the need to rely on trustworthy sources such as NASA for accurate information.
Claims:
An image making the rounds on social networks purports to be a real photograph of the solar eclipse of April 8.

Fact Check:
After receiving the claim, we first ran keyword searches to check whether NASA had posted any similar image, or coverage of a celestial event that could account for the viral photo, on its official social media accounts or website. The total eclipse on April 8 was visible from parts of North America that lay along the eclipse path; the sky above Mazatlan, Mexico, was the first to witness it, and a partial eclipse was visible to those outside the path of totality.
Next, we ran the image through Hive Moderation's AI image detection tool, which found it to be 99.2% AI-generated.

Following that, we applied another AI image detection tool, Isitai, which found the image to be 96.16% AI-generated.

With the help of these AI detection tools, we concluded that the claims made by various social media users are fake and misleading: the viral image is AI-generated, not a real photograph.
Conclusion:
Hence, the image circulating on the internet as a real photo of the April 8 eclipse was generated by AI. Despite claims to the contrary, the analysis showed that it was created with an artificial intelligence algorithm. The total eclipse was not visible everywhere in North America, only along the eclipse path, with partial visibility elsewhere. AI detection tools allowed us to establish definitively that the image is fake. When discussing rare celestial phenomena, it is essential to rely on information from trusted sources such as NASA.
- Claim: A viral image of a solar eclipse claiming to be a real photograph of the celestial event on April 08
- Claimed on: X, Facebook, Instagram, website
- Fact Check: Fake & Misleading

Introduction
In the labyrinthine corridors of the digital age, where information zips across the globe with the ferocity of a tempest, the truth often finds itself ensnared in a web of deception. It is within this intricate tapestry of reality and falsehood that we find ourselves examining two distinct yet equally compelling cases of misinformation, each a testament to the pervasive challenges that beset our interconnected world.
Case 1: The Deceptive Video: Originating in Malaysia, Misattributed to Indian Railway Development
A misleading video claiming to showcase Indian railway construction has been debunked as footage from Malaysia's East Coast Rail Link (ECRL). Fact-checking efforts by India TV traced the video's origin to Malaysia, revealing deceptive captions in Tamil and Hindi. The video was initially posted on Twitter on January 9, 2024, announcing the commencement of track-laying for Malaysia's East Coast Railway. Further investigation reveals the ECRL as a joint venture between Malaysia and China, involving the laying of tracks along the east coast, challenging assertions of Indian railway development. The ECRL's track-laying initiative, initiated in December 2023, is part of China's Belt and Road initiative, covering 665 kilometers across states like Kelantan, Terengganu, Pahang, and Selangor, with a completion target set for 2025.
The video in question, a digital chameleon, had its origins not in the bustling landscapes of India but within the verdant bounds of Malaysia. Specifically, it was a scene captured from the East Coast Rail Link (ECRL) project, a monumental joint venture between Malaysia and China, unfurling across 665 kilometers of Malaysian terrain. This ambitious endeavor, part of the grand Belt and Road initiative, is a testament to the collaborative spirit that defines our era, with tracks stretching from Kelantan to Selangor, and a completion horizon set for the year 2025.
The unveiling of this grand project was graced by none other than Malaysia’s King Sultan Abdullah Sultan Ahmad Shah, in Pahang, underscoring the strategic alliance with China and the infrastructural significance of the ECRL. Yet, despite the clarity of its origins, the video found itself cloaked in a narrative of Indian development, a falsehood that spread like wildfire across the digital savannah.
Through the meticulous application of keyframe analysis and reverse image searches, the truth was laid bare. Reports from reputable sources such as the Associated Press and the Global Times, featuring the very same machinery, corroborated the video's true lineage. This revelation not only highlighted the ECRL's geopolitical import but also served as a clarion call for the critical role of fact-checking in an era where misinformation proliferates with reckless abandon.
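Keyframe comparison of this kind can be partly automated with perceptual hashing. The sketch below is a minimal illustration, not the fact-checkers' actual tooling, and the file names are placeholders: it computes a simple "average hash" with Pillow and NumPy so that visually similar frames from two videos can be matched by Hamming distance.

```python
import numpy as np
from PIL import Image

def average_hash(image_path, hash_size=8):
    # Shrink the image to hash_size x hash_size grayscale, then threshold
    # each pixel against the mean brightness. Visually similar frames
    # produce nearly identical bit patterns.
    img = Image.open(image_path).convert("L").resize((hash_size, hash_size))
    pixels = np.asarray(img, dtype=np.float64)
    return (pixels > pixels.mean()).flatten()

def hamming(h1, h2):
    # Number of differing bits; a small distance suggests matching frames.
    return int(np.count_nonzero(h1 != h2))

# Hypothetical usage: hamming(average_hash("viral_frame.jpg"),
#                             average_hash("ecrl_frame.jpg"))
```

A distance near zero indicates the viral keyframe and the reference footage are effectively the same picture, which is the kind of evidence that ties a recirculated clip back to its true source.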
Case 2: Kerala's Incident: Investigating Fake Narratives
Kerala Chief Minister Pinarayi Vijayan has said that 53 cases have been registered over fake narratives spread on social media to incite communal sentiments following the blasts at a Christian religious gathering in October 2023. Vijayan said cases have been registered against online news portals, editors, and Malayalam television channels. The state police chief has directed officers to monitor social media to stop the spread of fake news and take appropriate action.
In a different corner of the world, the serene backdrop of Kerala was shattered by an event that would ripple through the fabric of its society. The Kalamassery blast, a tragic occurrence at a Christian religious gathering, claimed the lives of eight individuals and left over fifty wounded. In the wake of this calamity, a man named Dominic Martin surrendered, claiming responsibility for the heinous act.
Yet, as the investigation unfolded, a different kind of violence emerged—one that was waged not with explosives but with words. A barrage of fake narratives began to circulate through social media, igniting communal tensions and distorting the narrative of the incident. The Kerala Chief Minister, Pinarayi Vijayan, informed the Assembly that 53 cases had been registered across the state, targeting individuals and entities that had fanned the flames of discord through their digital utterances.
The Kerala police, vigilant guardians of truth, embarked on a digital crusade to quell the spread of these communally instigative messages. With a particular concentration of cases in Malappuram district, the authorities worked tirelessly to dismantle the network of fake profiles that propagated religious hatred. Social media platforms were directed to assist in this endeavor, revealing the IP addresses of the culprits and enabling the cyber cell divisions to take decisive action.
In the aftermath of the blasts, the Chief Minister and the state police chief issued special instructions to monitor social media platforms for content that could spark communal uproar. Cyber patrolling became the order of the day, and a 20-member probe team was constituted to investigate the incident in depth.
Conclusion
These two cases, disparate in their nature and geography, converge on a singular point: the fragility of truth in the digital age. They highlight the imperative for vigilance and the pursuit of accuracy in a world where misinformation can spread like wildfire. As we navigate this intricate cyberscape, it is imperative to be mindful of the power of fact-checking and the importance of media literacy, for they are the light that guides us through the fog of falsehoods to the shores of veracity.
These narratives are not merely stories of deception thwarted; they are a call to action, a reminder of our collective responsibility to safeguard the integrity of our shared reality. Let us, therefore, remain steadfast in our quest for the truth, for it is only through such diligence that we can hope to preserve the sanctity of our discourse and the cohesion of our societies.
References:
- https://www.indiatvnews.com/fact-check/fact-check-misleading-video-claims-malaysian-rail-project-indian-truth-ecrl-india-railway-development-pm-modi-2024-01-29-914282
- https://sahilonline.org/kalamasserry-blast-53-cases-registered-across-kerala-for-spreading-fake-news