Navigating the Path to CyberPeace: Insights and Strategies
Featured #factCheck Blogs

Executive Summary
A video is going viral on social media showing a massive building engulfed in flames and collapsing into debris. It is being widely claimed that Iran launched a powerful attack that destroyed Israel’s army headquarters. However, research by CyberPeace reveals that this claim is misleading. The viral video is AI-generated and has no connection to any real-world event.
Claim
An X (formerly Twitter) user shared the viral video with the caption: “Iran has targeted Israel’s army headquarters. It seems Israel’s dream of becoming ‘Greater Israel’ will remain unfulfilled.”
Post link:
- https://x.com/KAMESHKUMAR96/status/2039009484069368083
Archived version:
- https://archive.ph/HKXkK

Similar videos have also been shared by other users on social media.
Fact Check
To verify the claim, we extracted keyframes from the viral video and conducted a reverse image search. During this process, we found several credible media reports confirming that Iran has carried out drone and missile attacks on Israel and the Gulf regions in recent times. However, none of these reports featured the viral video, indicating that it is not authentic footage.
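For readers who want to reproduce the keyframe-extraction step, the sketch below shows one way to sample frames from a clip so they can be fed into a reverse image search. The file name, sampling interval, and the use of OpenCV are illustrative assumptions, not part of the original workflow.

```python
# Minimal sketch: sample one frame every few seconds from a video clip
# so the frames can be submitted to a reverse image search tool.
import cv2

def extract_keyframes(video_path: str, every_n_seconds: float = 2.0) -> list:
    """Save a frame every `every_n_seconds` and return the file names."""
    cap = cv2.VideoCapture(video_path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 30.0   # fall back if FPS is unreadable
    step = max(1, int(fps * every_n_seconds))
    frames, index = [], 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if index % step == 0:
            out_name = f"keyframe_{index:05d}.jpg"
            cv2.imwrite(out_name, frame)
            frames.append(out_name)
        index += 1
    cap.release()
    return frames

# Example (hypothetical file name): extract_keyframes("viral_clip.mp4")
```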

- https://www.youtube.com/watch?v=fxDBX90bYng

A closer examination of the video revealed multiple visual inconsistencies commonly associated with AI-generated content. For instance, a building on the left side appears to bend and collapse in a rubber-like manner—something that is physically unrealistic for structures made of concrete and steel. Additionally, the smoke and flames appear unnatural and lack realistic dynamics.
To further verify, we analyzed the video using the AI detection tool Hive Moderation, which classified it as 99.9% AI-generated.
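For illustration, submitting a clip to an AI-content classifier generally takes the shape sketched below. The endpoint, header, and response field names are placeholders rather than the documented Hive Moderation API; consult the vendor's documentation for the actual request and response format.

```python
# A hedged sketch of querying an AI-generated-content detection service.
# NOTE: API_URL, the auth header, and the "ai_generated" field are assumptions
# for illustration only, not the real Hive Moderation schema.
import requests

API_URL = "https://api.example-detector.com/v1/classify"  # placeholder endpoint
API_KEY = "YOUR_API_KEY"                                   # placeholder credential

def check_ai_generated(video_path: str) -> float:
    """Upload a video and return the reported probability (0-1) that it is AI-generated."""
    with open(video_path, "rb") as fh:
        resp = requests.post(
            API_URL,
            headers={"Authorization": f"Bearer {API_KEY}"},
            files={"media": fh},
            timeout=120,
        )
    resp.raise_for_status()
    return resp.json().get("ai_generated", 0.0)

# A score of 0.999 would correspond to the "99.9% AI-generated" result cited above.
```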

We also tested the video using the Deepfake-o-Meter platform. The AVSRDD (2025) model detected it as 99.5% AI-generated.

Conclusion
Our research clearly establishes that the viral video claiming Iran destroyed Israel’s army headquarters is false and misleading. The footage does not appear in any credible news coverage of recent attacks, which strongly indicates that it is not real. Moreover, multiple AI detection tools consistently classify the video as artificially generated, with extremely high probability scores. Visual anomalies in the clip further support this finding.

Executive Summary:
A video is rapidly circulating on social media following claims that Iran’s national security chief Ali Larijani was killed in an Israeli airstrike. The viral clip is being shared with the assertion that it shows the moment Israel launched a powerful attack on Iran to eliminate Larijani, with the intensity of the strike allegedly shaking the ground. However, research by CyberPeace has found the claim to be misleading. The viral video is AI-generated and has no connection to real-world events.
Claim:
Social media users have shared the video with alarming captions. One such post by Deepak Sharma reads:
“WAR UPDATE… Iran is in its final phase… Israel is striking selectively… This attack will leave you shocked… Iran’s national security chief Ali Larijani has been killed in this attack… The intensity of the strike shook the Iranian ground.”
Post links:

Similar videos were also shared by other users:
- https://x.com/ibmindia20/status/2038938020154597447
- https://x.com/Saurabh_raj3026/status/2038834832869032026
Fact Check
To verify the claim, we extracted keyframes from the viral video and conducted a reverse image search. During this process, we found the same video on Instagram, uploaded on March 9, 2026, by the account “_iranwar_2026” without any descriptive caption.

According to a BBC report, Ali Larijani died on March 17 in an Israeli strike. This establishes that the viral video predates the reported incident, making the claim factually inconsistent. Further examination of the Instagram account revealed that it frequently shares pro-Iran content, including gaming visuals and AI-generated clips, raising doubts about the authenticity of the video.

To strengthen the verification, we analyzed the viral clip using the AI detection tool “Zhuque AI Detection Assistant.” The results indicated a 91.71% probability that the video is AI-generated, confirming that it is not real footage.

Conclusion
The viral claim linking the video to an Israeli airstrike that allegedly killed Ali Larijani is misleading and factually incorrect. Multiple layers of verification show that the video existed online before the reported incident, ruling out any direct connection. Additionally, AI detection analysis strongly suggests that the video is artificially generated. The source account’s pattern of sharing AI and gaming-related content further weakens the credibility of the claim. There is no verified evidence to support that the viral clip depicts a real attack or any event related to Larijani’s death. Instead, the video appears to be a digitally created visual circulated without context to amplify misinformation.

Executive Summary:
Recently, a viral social media post alleged that the Delhi Metro Rail Corporation Ltd. (DMRC) had increased ticket prices following the BJP’s victory in the Delhi Legislative Assembly elections. After thorough research and verification, we have found this claim to be misleading and entirely baseless. Authorities have clarified that no fare hike has been announced.
Claim:
Viral social media posts have claimed that the Delhi Metro Rail Corporation Ltd. (DMRC) increased metro fares following the BJP's victory in the Delhi Legislative Assembly elections.


Fact Check:
After thorough research, we conclude that the claims regarding a fare hike by the Delhi Metro Rail Corporation Ltd. (DMRC) following the BJP’s victory in the Delhi Legislative Assembly elections are misleading. Our review of DMRC’s official website and social media handles found no mention of any fare increase. Furthermore, the official X (formerly Twitter) handle of DMRC has clarified that no such price hike has been announced. We urge the public to rely on verified sources for accurate information and refrain from spreading misinformation.

Conclusion:
Upon examining the alleged fare hike, it is evident that the increase pertains to Bengaluru, not Delhi. To verify this, we reviewed the official website of Bangalore Metro Rail Corporation Limited (BMRCL) and cross-checked the information with appropriate evidence, including relevant images. Our findings confirm that no fare hike has been announced by the Delhi Metro Rail Corporation Ltd. (DMRC).

- Claim: Delhi Metro fare hike after BJP’s victory in the Delhi Assembly election
- Claimed On: X (Formerly Known As Twitter)
- Fact Check: False and Misleading

Executive Summary:
Recent reports circulating on various social media platforms have falsely claimed that an air taxi prototype is operational and providing services between Amritsar, Chandigarh, Delhi, and Jaipur. These claims, accompanied by images and videos, have been widely shared, attracting significant public attention. However, a thorough examination using reverse image search determined that the information is misleading and inaccurate. These assertions do not reflect the current reality and are not substantiated by credible sources.

Claim:
The claim suggests that an air taxi prototype is already operational, servicing routes between Amritsar, Chandigarh, Delhi, and Jaipur. This assertion is accompanied by images of a futuristic aircraft, implying that such technology is currently being used to transport commercial passengers.

Fact Check:
The claim of an air taxi service operating between Amritsar, Chandigarh, Delhi, and Jaipur is misleading. Neither the Indian government nor the relevant aviation authorities, nor any industry insiders, have announced the launch of such a service. A keyword-based search led us to a news report published in The Times of India on January 20, 2025, which carried an image similar to the one in the viral post. The report stated that Bengaluru-based aerospace startup Sarla Aviation unveiled its prototype air taxi, “Shunya,” at the Bharat Mobility Global Expo, with plans to launch electric flying taxis in Bengaluru by 2028. The viral posts appear to repurpose material from this planned urban air transport programme, which is not yet operational.

Conclusion:
The viral claim that an air taxi service is operating in India between Amritsar, Chandigarh, Delhi, and Jaipur is entirely false. The pictures and information going viral are misleading and do not relate to any actual deployment of air taxi technology in India. To date, there is no official confirmation or credible evidence supporting such a service. Information must be verified from reliable sources before it is believed or shared, in order to prevent the spread of misinformation.
- Claim: A viral post claims an air taxi is operational between Amritsar, Chandigarh, Delhi, and Jaipur.
- Claimed On: Social Media
- Fact Check: False and Misleading

Executive Summary:
Recently, our team encountered a post on X (formerly Twitter) claiming that Chandra Arya, a Member of Parliament in Canada, was speaking in Kannada in a video that surfaced after he filed his nomination for the post of Prime Minister of Canada. The video has gained wide traction and sparked considerable discussion online. In this report, we examine the legitimacy of the claim by analysing the content of the video, its timing, and information from reliable sources.

Claim:
The viral video claims Chandra Arya spoke Kannada after filing his nomination for the Canadian Prime Minister position in 2025, after the resignation of Justin Trudeau.

Fact Check:
Upon receiving the video, we performed a reverse image search of the keyframes extracted from it and found that the video has no connection to any nomination for the Canadian Prime Minister position. Instead, it is an old video of his speech in the Canadian Parliament in 2022. An old post from Mr. Arya’s X (Twitter) handle, published at 12:19 AM on May 20, 2022, confirms that the speech has no link to any Prime Ministerial candidature.
Further, our research led us to a YouTube video posted on the verified channel of Hindustan Times, dated 20 May 2022, with the caption:
“India-born Canadian MP Chandra Arya is winning hearts online after a video of his speech at the Canadian Parliament in Kannada went viral. Arya delivered a speech in his mother tongue - Kannada. Arya, who represents the electoral district of Nepean, Ontario, in the House of Commons, the lower house of Canada, tweeted a video of his address, saying Kannada is a beautiful language spoken by about five crore people. He said that this is the first time when Kannada is spoken in any Parliament outside India. Netizens including politicians have lauded Arya for the video.”

Conclusion:
The viral video claiming that Chandra Arya spoke in Kannada after filing his nomination for the Canadian Prime Minister position in 2025 is completely false. The video, dated May 2022, shows Chandra Arya delivering an address in Kannada in the Canadian Parliament, unrelated to any political nominations or events concerning the Prime Minister's post. This incident highlights the need for thorough fact-checking and verifying information from credible sources before sharing.
- Claim: Misleading Claim About Chandra Arya’s PM Candidacy
- Claimed on: X (Formerly Known As Twitter)
- Fact Check: False and Misleading

Executive Summary:
Recently, our team came across a video on social media that appears to show a saint lying in a fire during the Mahakumbh 2025. The video has been widely viewed and carries captions claiming that it shows a ritual from the ongoing Mahakumbh 2025. After thorough research, we found these claims to be false: the video is unrelated to Mahakumbh 2025 and comes from a different context and location. This is an example of old content being recirculated outside its original context.

Claim:
A video has gone viral on social media, claiming to show a saint lying in fire during Mahakumbh 2025, suggesting that this act is part of the traditional rituals associated with the ongoing festival. This misleading claim falsely implies that the act is a standard part of the sacred ceremonies held during the Mahakumbh event.

Fact Check:
Upon receiving the post, we conducted a reverse image search of the keyframes extracted from the video and traced it to an old article. Further research revealed that the original footage dates from 2009, when Ramababu Swamiji, aged 80, lay down on a burning fire for the benefit of society. The video is not recent; it had already gone viral on social media in November 2009. A closer examination of the scene, crowd, and visuals clearly shows that the video is unrelated to the rituals or context of Mahakumbh 2025. Additionally, our research found that such activities are not part of the Mahakumbh rituals. We also consulted reputable sources to cross-verify this information, effectively debunking the claim and underscoring the importance of verifying facts before believing them.
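As a general technique, one way to confirm that a viral clip re-uses older footage is to compare perceptual hashes of keyframes from the viral video and the earlier upload. The sketch below illustrates this with the imagehash library; the file names and the distance threshold are assumptions for illustration, not part of the original investigation.

```python
# Hedged sketch: check whether two stills are visually near-identical by
# comparing their perceptual hashes (small Hamming distance => likely same frame).
from PIL import Image
import imagehash

def frames_match(frame_a: str, frame_b: str, max_distance: int = 8) -> bool:
    """Return True if the two frames are near-duplicates."""
    hash_a = imagehash.phash(Image.open(frame_a))
    hash_b = imagehash.phash(Image.open(frame_b))
    return (hash_a - hash_b) <= max_distance  # subtraction gives Hamming distance

# Example (hypothetical files): frames_match("viral_keyframe.jpg", "2009_article_still.jpg")
```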


For further clarity, the YouTube video attached below helps resolve any remaining doubt and serves as a reminder to verify such claims before accepting them.

Conclusion:
The viral video claiming to depict a saint lying in fire during Mahakumbh 2025 is entirely misleading. Our thorough fact-checking reveals that the video dates back to 2009 and is unrelated to the current event. Such misinformation highlights the importance of verifying content before sharing or believing it. Always rely on credible sources to ensure the accuracy of claims, especially during significant cultural or religious events like Mahakumbh.
- Claim: A viral video claims to show a saint lying in fire during the Mahakumbh 2025.
- Claimed On: X (Formerly Known As Twitter)
- Fact Check: False and Misleading

Executive Summary:
A viral post on X (formerly Twitter) has been spreading misleading captions about a video that falsely claims to show severe wildfires in Los Angeles, mimicking the real wildfires occurring there. Using AI content detection tools, we confirmed that the footage is entirely AI-generated and not authentic. In this report, we break down the claims, fact-check the information, and provide a clear summary of the misinformation attached to this viral clip.

Claim:
A video shared across social media platforms and messaging apps alleges to show wildfires ravaging Los Angeles, suggesting an ongoing natural disaster.

Fact Check:
After taking a close look at the video, we noticed several discrepancies commonly seen in AI-generated footage: the flames look unnatural, the lighting is off, and there are visible glitches. We then checked the video with the online AI content detection tool Hive Moderation, which classified it as AI-generated, indicating that the video was deliberately created to mislead viewers. It is crucial to stay alert to such deceptions, especially on serious topics like wildfires. Being well-informed allows us to navigate the complex information landscape and distinguish real events from falsehoods.

Conclusion:
The video claiming to show wildfires in Los Angeles is AI-generated. This case again underlines the importance of pausing to check whether information is correct, especially when the matter is serious, such as a natural disaster. By being careful and cross-checking sources, we can minimise the spread of misinformation and ensure that accurate information reaches those who need it most.
- Claim: The video shows real footage of the ongoing wildfires in Los Angeles, California
- Claimed On: X (Formerly Known As Twitter)
- Fact Check: Fake Video

Executive Summary:
A viral post on X (formerly Twitter) gained significant attention by creating a false narrative of recent damage caused by an earthquake in Tibet. Our findings confirm that the clip was not filmed in Tibet; it comes from an earlier earthquake in Japan. This report traces the origin of the claim and presents the analysis and verified evidence that clarify the misinformation surrounding the video.

Claim:
The viral video shows collapsed infrastructure and significant destruction, with captions claiming it is evidence of a recent earthquake in Tibet. Similar claims have been shared in other posts.

Fact Check:
The widely circulated clip, initially claimed to depict the aftermath of the recent earthquake in Tibet, has been rigorously analysed and proven to be misattributed. A reverse image search based on keyframes from the video revealed that the footage originated from a devastating earthquake in Japan. According to an article published by a Japanese news website, the incident occurred in February 2024. The video was authenticated by news agencies, as it accurately depicted the scenes of destruction reported during that event.

Moreover, the same video had already been uploaded to a YouTube channel, which proves that the footage is not recent. The architecture, the signboards written in Japanese script, and the vehicles appearing in the video also confirm that the footage belongs to Japan, not Tibet. The video shows an event that occurred in Japan in the past, proving it was shared in a different context to spread false information.

The video was uploaded on February 2nd, 2024.
Snap from viral video

Snap from YouTube video

Conclusion:
The viral video about the recent earthquake in Tibet is therefore misattributed: it is old footage of an earlier earthquake in Japan. This again highlights the need to verify information before sharing it, so that accurate information circulates and false data does not spread.
- Claim: A viral video claims to show recent earthquake destruction in Tibet.
- Claimed On: X (Formerly Known As Twitter)
- Fact Check: False and Misleading
Executive Summary:
Recently, a viral post on social media claimed that actor Allu Arjun visited a Shiva temple to offer prayers in celebration of the success of his film, Pushpa 2. The post features an image of him at the temple. However, our investigation determined that the photo is from 2017 and has no connection to the film's release.

Claims:
The claim states that Allu Arjun recently visited a Shiva temple to express his thanks for the success of Pushpa 2, featuring a photograph that allegedly captures this moment.

Fact Check:
The image circulating on social media, claiming that Allu Arjun visited a Shiva temple to celebrate the success of Pushpa 2, is misleading.
After conducting a reverse image search, we confirmed that this photograph is from 2017, taken during the actor's visit to the Tirumala Temple for a personal event, well before Pushpa 2 was ever announced. The context has been altered to falsely connect it to the film's success. Additionally, there is no credible evidence or recent reports to support the claim that Allu Arjun visited a temple for this specific reason, making the assertion entirely baseless.

Before sharing viral posts, take a brief moment to verify the facts. Misinformation spreads quickly and it’s far better to rely on trusted fact-checking sources.
Conclusion:
The claim that Allu Arjun visited a Shiva temple to celebrate the success of Pushpa 2 is false. The image circulating is actually from an earlier time. This situation illustrates how misinformation can spread when an old photo is used to construct a misleading story. Before sharing viral posts, take a moment to verify the facts. Misinformation spreads quickly, and it is far better to rely on trusted fact-checking sources.
- Claim: The image claims Allu Arjun visited Shiva temple after Pushpa 2’s success.
- Claimed On: Facebook
- Fact Check: False and Misleading
Executive Summary:
A post on X (formerly Twitter) has gained widespread attention, featuring an image inaccurately asserting that Houthi rebels attacked a power plant in Ashkelon, Israel. This misleading content has circulated widely amid escalating geopolitical tensions. However, investigation shows that the footage actually originates from a prior incident in Saudi Arabia. This situation underscores the significant dangers posed by misinformation during conflicts and highlights the importance of verifying sources before sharing information.

Claims:
The viral video claims to show Houthi rebels attacking Israel's Ashkelon power plant as part of recent escalations in the Middle East conflict.

Fact Check:
Upon receiving the viral posts, we conducted a Google Lens search on the keyframes of the video. The search revealed that the video circulating online does not show an attack on the Ashkelon power plant in Israel; instead, it depicts a 2022 drone strike on a Saudi Aramco facility in Abqaiq. There are no credible reports of Houthi rebels targeting Ashkelon, as their activities are largely confined to Yemen and Saudi Arabia.

This incident highlights the risks associated with misinformation during sensitive geopolitical events. Before sharing viral posts, take a brief moment to verify the facts. Misinformation spreads quickly and it’s far better to rely on trusted fact-checking sources.
Conclusion:
The assertion that Houthi rebels targeted the Ashkelon power plant in Israel is incorrect. The viral video in question has been misrepresented and actually shows a 2022 incident in Saudi Arabia. This underscores the importance of being cautious when sharing unverified media. Before sharing viral posts, take a moment to verify the facts. Misinformation spreads quickly, and it is far better to rely on trusted fact-checking sources.
- Claim: The video shows massive fire at Israel's Ashkelon power plant
- Claimed On: Instagram and X (Formerly Known As Twitter)
- Fact Check: False and Misleading
Executive Summary:
A viral online video claims to show a Syrian prisoner experiencing sunlight for the first time in 13 years. However, the CyberPeace Research Team has confirmed that the video is a deep fake, created using AI technology to manipulate the prisoner’s facial expressions and surroundings. The original footage is unrelated to the claim that the prisoner has been held in solitary confinement for 13 years. The assertion that this video depicts a Syrian prisoner seeing sunlight for the first time is false and misleading.

Claim:
A viral video falsely claims that a Syrian prisoner is seeing sunlight for the first time in 13 years.


Fact Check:
Upon receiving the viral posts, we conducted a Google Lens search on keyframes from the video. The search led us to various legitimate sources featuring real reports about Syrian prisoners, but none of them included any mention of such an incident. The viral video exhibited several signs of digital manipulation, prompting further investigation.

We used AI detection tools, such as TrueMedia, to analyze the video. The analysis confirmed with 97.0% confidence that the video was a deepfake. The tools identified “substantial evidence of manipulation,” particularly in the prisoner’s facial movements and the lighting conditions, both of which appeared artificially generated.


Additionally, a thorough review of news sources and official reports related to Syrian prisoners revealed no evidence of a prisoner being released from solitary confinement after 13 years, or experiencing sunlight for the first time in such a manner. No credible reports supported the viral video’s claim, further confirming its inauthenticity.
Conclusion:
The viral video claiming that a Syrian prisoner is seeing sunlight for the first time in 13 years is a deep fake. Investigations using AI detection tools such as TrueMedia confirm that the video was digitally manipulated using AI technology. Furthermore, there is no supporting information in any reliable sources. The CyberPeace Research Team confirms that the video was fabricated, and the claim is false and misleading.
- Claim: Syrian prisoner sees sunlight for the first time in 13 years, viral on social media.
- Claimed on: Facebook and X(Formerly Twitter)
- Fact Check: False & Misleading

Executive Summary:
A post on X (formerly Twitter) featuring an image has been widely shared with misleading captions claiming to show men riding an elephant next to a tiger in Bihar, India. The post has sparked both fascination and skepticism on social media. However, our investigation has revealed that the image is misleading: it is not a recent photograph but one from an incident in 2011. Always verify claims before sharing.

Claims:
An image purporting to depict men riding an elephant next to a tiger in Bihar has gone viral, implying that this astonishing event truly took place.

Fact Check:
An investigation of the viral image using reverse image search shows that it comes from an older video. The footage shows a tiger that was shot by forest guards after it turned man-eater, killing six people and causing panic in local villages in the Ramnagar division of Uttarakhand in January 2011.

Before sharing viral posts, take a brief moment to verify the facts. Misinformation spreads quickly and it’s far better to rely on trusted fact-checking sources.
Conclusion:
The claim that men rode an elephant alongside a tiger in Bihar is false. The photo presented as recent actually originates from the past and does not depict a current event. Social media users should exercise caution and verify sensational claims before sharing them.
- Claim: The video shows people casually interacting with a tiger in Bihar
- Claimed On: Instagram and X (Formerly Known As Twitter)
- Fact Check: False and Misleading

Executive Summary:
A viral social media post claims to show a mosque being set on fire in India, contributing to growing communal tensions and misinformation. However, a detailed fact-check has revealed that the footage actually comes from Indonesia. The spread of such misleading content can dangerously escalate social unrest, making it crucial to rely on verified facts to prevent further division and harm.

Claim:
The viral video claims to show a mosque being set on fire in India, suggesting it is linked to communal violence.

Fact Check
The investigation revealed that the video was originally posted on 8th December 2024. A reverse image search allowed us to trace the source and confirm that the footage is not linked to any recent incidents. The original post, written in Indonesian, explained that the fire took place at the Central Market in Luwuk, Banggai, Indonesia, not in India.

Conclusion:
The viral claim that a mosque was set on fire in India is not true. The video is actually from Indonesia and has been intentionally misrepresented to circulate false information. This event underscores the need to verify information before spreading it. Misinformation can spread quickly and cause harm. By taking the time to check facts and rely on credible sources, we can prevent false information from escalating and protect harmony in our communities.
- Claim: The video shows a mosque set on fire in India
- Claimed On: Social Media
- Fact Check: False and Misleading

Executive Summary:
A viral online image claims to show Arvind Kejriwal, Chief Minister of Delhi, welcoming Elon Musk during his visit to India to discuss Delhi’s administrative policies. However, the CyberPeace Research Team has confirmed that the image is a deep fake, created using AI technology. The assertion that Elon Musk visited India to discuss Delhi’s administrative policies is false and misleading.


Claim
A viral image claims that Arvind Kejriwal welcomed Elon Musk during his visit to India to discuss Delhi’s administrative policies.


Fact Check:
Upon receiving the viral posts, we conducted a reverse image search using the InVID reverse image search tool. The search traced the image back to various unrelated sources featuring Arvind Kejriwal and Elon Musk separately, but none depicted them together or referred to any such event. The viral image displayed visible inconsistencies, such as lighting disparities and unnatural blending, which prompted further investigation.
Using advanced AI detection tools like TrueMedia.org and Hive AI Detection tool, we analyzed the image. The analysis confirmed with 97.5% confidence that the image was a deepfake. The tools identified “substantial evidence of manipulation,” particularly in the merging of facial features and the alignment of clothes and background, which were artificially generated.




Moreover, a review of official statements and credible reports revealed no record of Elon Musk visiting India to discuss Delhi’s administrative policies. Neither Arvind Kejriwal’s office nor Tesla or SpaceX made any announcement regarding such an event, further debunking the viral claim.
Conclusion:
The viral image claiming that Arvind Kejriwal welcomed Elon Musk during his visit to India to discuss Delhi’s administrative policies is a deep fake. Tools like Reverse Image search and AI detection confirm the image’s manipulation through AI technology. Additionally, there is no supporting evidence from any credible sources. The CyberPeace Research Team confirms the claim is false and misleading.
- Claim: Arvind Kejriwal welcomed Elon Musk to India to discuss Delhi’s administrative policies, viral on social media.
- Claimed on: Facebook and X(Formerly Twitter)
- Fact Check: False & Misleading