Navigating the Path to CyberPeace: Insights and Strategies
Featured #factCheck Blogs

Executive Summary
An image circulating on social media claims to show Suryakumar Yadav, captain of the Indian cricket team, extending his hand to greet Pakistan’s skipper Salman Ali Agha, who allegedly refused the gesture during the India–Pakistan T20 World Cup match held on February 15. Users shared the image as evidence of a real incident from the high-profile clash. However, research by CyberPeace found that the image is AI-generated and was circulated to mislead viewers.
Claim
On February 15, an X account named “@iffiViews,” reportedly operated from Pakistan, shared the image claiming it was taken during the India–Pakistan T20 World Cup match at the R. Premadasa Stadium in Colombo. The viral image appeared to show Yadav attempting to shake hands with Agha, who seemed to decline the gesture. The post quickly gained significant traction online, attracting around one million views at the time of reporting. Here are the link and archived version of the post, along with a screenshot.
- https://x.com/iffiViews/status/2023024665770484206?s=20
- https://archive.ph/xvtBs

Fact Check:
To verify the authenticity of the image, researchers closely examined the visual and identified a watermark associated with an AI image-generation tool. This strongly indicated that the image was digitally created and did not depict an actual event.
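Beyond a visible watermark, embedded metadata can sometimes point to the tool that produced a file. Below is a minimal sketch of that kind of check, assuming Python with the Pillow library; the file name is hypothetical, and many generators strip or never write such metadata, so an empty result proves nothing on its own.

```python
# Minimal sketch: list an image's EXIF metadata and look for generator
# traces (e.g. a 'Software' tag). A clean result is not proof of
# authenticity; absence of metadata is common in genuine files too.
from PIL import Image
from PIL.ExifTags import TAGS

def inspect_metadata(path: str) -> None:
    img = Image.open(path)
    exif = img.getexif()
    if not exif:
        print("No EXIF metadata found.")
        return
    for tag_id, value in exif.items():
        tag = TAGS.get(tag_id, tag_id)
        # The 'Software' tag sometimes names the tool that wrote the file.
        print(f"{tag}: {value}")

inspect_metadata("viral_handshake.jpg")  # hypothetical file name
```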

The image was further analysed using an AI detection tool, which indicated a 99.9 percent probability that the content was artificially generated or manipulated.

Researchers also conducted keyword searches to check whether the two captains had exchanged a handshake during the match. The search revealed media reports confirming that the traditional handshake between players has been discontinued since the Asia Cup 2025 in both men’s and women’s cricket. A report published by The Times of India on February 15 confirmed that no such customary exchange took place during the match between the two teams in Colombo.

Conclusion
The viral image claiming to show Suryakumar Yadav attempting to shake hands with Salman Ali Agha is not authentic. The visual is AI-generated and has been shared online with misleading claims.

Executive Summary
A video showing a flyover collapse is going viral on social media. The clip shows a flyover and a road passing beneath it, with vehicles moving normally. Suddenly, a portion of the flyover appears to collapse and fall onto the road below, with some vehicles seemingly coming under its impact. The video has been widely shared by users online. However, research by CyberPeace found the viral claim to be false. The probe revealed that the video is not real but has been created using artificial intelligence.
Claim:
On X (formerly Twitter), a user shared the viral video on February 13, 2026, claiming it showed the reality of India’s infrastructure development and criticizing ongoing projects. The post quickly gained traction, with several users sharing it as a real incident. Similarly, another user shared the same video on Facebook on February 13, 2026, making a similar claim.

Fact Check:
To verify the claim, key frames from the viral video were extracted and searched using Google Lens. During the search, the video was traced to an account named “sphereofai” on Instagram, where it had been posted on February 9. The post included hashtags such as “AI Creator” and “AI Generated,” clearly indicating that the video was created using AI. Further examination of the account showed that the user identifies themselves as an AI content creator.
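For readers who want to reproduce the keyframe-extraction step, here is a minimal sketch using Python and OpenCV that samples roughly one frame per second so each frame can be fed to a reverse image search such as Google Lens. The file name and the sampling rate are illustrative assumptions, not details from the original research.

```python
# Sample roughly one frame per second from a video and save the frames
# as PNG files for manual reverse image searching.
import cv2

def extract_keyframes(video_path: str, out_prefix: str = "frame") -> int:
    cap = cv2.VideoCapture(video_path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 30  # fall back if FPS is unreadable
    saved, index = 0, 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if index % int(fps) == 0:  # roughly one frame per second
            cv2.imwrite(f"{out_prefix}_{saved:04d}.png", frame)
            saved += 1
        index += 1
    cap.release()
    return saved

print(extract_keyframes("viral_flyover.mp4"))  # hypothetical file name
```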


To confirm the findings, the viral video was also analysed using Hive Moderation. The tool’s analysis suggested a 99 percent probability that the video was AI-generated.
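Hive Moderation also exposes a programmatic interface, so this kind of check can be scripted. The sketch below assumes Hive's synchronous task endpoint and token-based authentication as described in its public documentation; the endpoint, response layout, and file name are assumptions that should be verified against the current API reference before use.

```python
# Hedged sketch: submit a media file to Hive Moderation's synchronous
# endpoint. Endpoint URL, auth scheme, and response shape are assumptions
# based on Hive's public docs and may have changed.
import requests

HIVE_API_KEY = "YOUR_API_KEY"  # placeholder credential

def check_with_hive(path: str) -> dict:
    with open(path, "rb") as f:
        resp = requests.post(
            "https://api.thehive.ai/api/v2/task/sync",
            headers={"Authorization": f"Token {HIVE_API_KEY}"},
            files={"media": f},
            timeout=120,
        )
    resp.raise_for_status()
    return resp.json()

result = check_with_hive("viral_flyover.mp4")  # hypothetical file name
print(result)  # inspect the AI-generated/deepfake class scores in the output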

Conclusion:
The research established that the viral flyover collapse video is not authentic. It is an AI-generated clip being circulated online with misleading claims.

Executive Summary
A video featuring popular comedian Rajpal Yadav has recently gone viral on social media, claiming that he is currently lodged in Tihar Jail in connection with a loan default and cheque bounce case. In connection with this, another video showing Bollywood superstar Shah Rukh Khan is being widely shared online. In the viral clip, Khan is purportedly seen saying that he would help Rajpal Yadav get out of jail and also offer him a role in his upcoming film. However, research by CyberPeace found the viral video to be fake. The clip is a deepfake, in which the audio has been manipulated using artificial intelligence. In the original video, Shah Rukh Khan is speaking about his life and personal experiences. Although several prominent Bollywood personalities have expressed support for Rajpal Yadav, the claims made in the viral video are misleading.
Claim
An Instagram user named “ayubeditz” shared the viral video on February 11, 2026, with the caption: “Rajpal Yadav bhai, stay strong, we are all with you — Shah Rukh Khan.” The link to the post and its archived version are provided below.

Fact Check
To verify the claim, we extracted key frames from the viral video and conducted a Google reverse image search. This led us to the original video uploaded on a YouTube channel titled “Locarno Film Festival” on August 11, 2024. According to the available information, Shah Rukh Khan was sharing insights about his life and career during a conversation with the festival’s Artistic Director, Giona A. Nazzaro. This raised strong suspicion that the viral video had been edited using AI.

To further examine the authenticity of the audio, we analysed it using the AI detection tool Aurigin.ai, which indicated an 83 percent probability that the voice in the viral clip was AI-generated.
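A common preparatory step for this kind of voice analysis is to separate the audio track from the clip before uploading it to a detector. Here is a minimal sketch using Python's subprocess module and ffmpeg (which must be installed separately); the file names are hypothetical.

```python
# Extract the audio track from a video as 16 kHz mono WAV, a format most
# voice-cloning detectors accept. Requires ffmpeg on PATH.
import subprocess

def extract_audio(video_path: str, wav_path: str) -> None:
    subprocess.run(
        [
            "ffmpeg", "-y",
            "-i", video_path,
            "-vn",                   # drop the video stream
            "-acodec", "pcm_s16le",  # uncompressed 16-bit PCM
            "-ar", "16000",          # 16 kHz sample rate
            "-ac", "1",              # mono
            wav_path,
        ],
        check=True,
    )

extract_audio("viral_srk_clip.mp4", "viral_srk_audio.wav")
```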

Conclusion
CyberPeace’s research confirmed that the claim associated with Shah Rukh Khan’s viral video is false. The video is a deepfake in which the audio has been altered using artificial intelligence. In the original footage, Khan was discussing his life and experiences, and he did not make any statement about helping Rajpal Yadav.

Executive Summary
A video of senior Congress leader Shashi Tharoor is widely circulating on social media, allegedly showing him praising Pakistan’s diplomatic stance over the ICC T20 World Cup issue. Many users are sharing the clip believing it to be genuine. However, research by CyberPeace found the claim to be false. The viral video of Tharoor is a deepfake, and the Congress leader himself has described it as fabricated and fake.
Claim
A Facebook page named “Vok Sports” shared the video on February 11, 2026, claiming that Tharoor praised Pakistan. In the viral clip, he is purportedly heard saying in English that Pakistan’s diplomatic handling of the matter was “brilliant” and that it had outmanoeuvred the Indian cricket board, adding that good diplomacy could make a weak nation appear powerful.
The video was widely shared by social media users as authentic. (Archive links and post details provided.)
Fact Check
To verify the claim, we first scanned Tharoor’s official X (formerly Twitter) handle. We found a post dated February 12 in which he responded to a Pakistani journalist who had shared the video. Tharoor stated that the clip was AI-generated “fake news,” adding that neither the language nor the voice in the video was his.

A reverse image search using Google Lens led the Desk to a video uploaded on February 10, 2026, by India Today on its official YouTube channel. The visuals in this original video exactly matched those seen in the viral clip showing Tharoor speaking to the media. However, in the original footage Tharoor was speaking in Hindi about the controversy surrounding the T20 World Cup. He stated that politics should not be mixed with cricket or sports, that politicians should handle politics, diplomats diplomacy, and cricketers the game, and he expressed hope that the match would go ahead. At no point did he praise Pakistan or the Pakistan Cricket Board. This indicates that the audio in the viral clip had been manipulated and replaced.
- https://www.youtube.com/watch?v=GkA1mLlAT8Q&t=3s
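The visual match between the viral clip and the original broadcast footage can also be verified programmatically. Below is a minimal sketch using perceptual hashing with the Python imagehash package; the frame file names and the distance threshold are illustrative assumptions rather than part of the original fact check.

```python
# Compare perceptual hashes of one frame from each source: frames with
# identical visuals (even with swapped audio) hash very close together.
# Requires the 'Pillow' and 'imagehash' packages.
from PIL import Image
import imagehash

def frames_match(frame_a: str, frame_b: str, threshold: int = 8) -> bool:
    hash_a = imagehash.phash(Image.open(frame_a))
    hash_b = imagehash.phash(Image.open(frame_b))
    distance = hash_a - hash_b  # Hamming distance between the two hashes
    print(f"Hamming distance: {distance}")
    return distance <= threshold

print(frames_match("viral_frame.png", "india_today_frame.png"))
```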

To further verify the authenticity of the video, several AI detection tools were used. Analysis through Aurigin.ai suggested a 78 percent probability that the audio in the viral clip was AI-generated.

Conclusion
CyberPeace’s research confirmed that the viral video is a deepfake. Tharoor did not praise Pakistan’s diplomatic stance during the T20 World Cup controversy, and the circulating clip has been digitally manipulated.

Executive Summary
A dispute recently emerged in Kotdwar, Uttarakhand, over the name of a shop. During the controversy, a local youth, Deepak Kumar, came forward in support of the shopkeeper. The incident subsequently became a subject of discussion on social media, with users expressing varied reactions. Meanwhile, a photo began circulating on social media showing a burqa-clad woman presenting a bouquet to Deepak Kumar. The image is being shared with the claim that All India Majlis-e-Ittehadul Muslimeen (AIMIM)’s women’s president, Rubina, welcomed “Mohammad Deepak Kumar” by presenting him with a bouquet. However, research conducted by CyberPeace found the viral claim to be false. The research revealed that users are sharing an AI-generated image with a misleading claim.
Claim:
On social media platform Instagram, a user shared the viral image claiming that AIMIM’s women’s president Rubina welcomed “Mohammad Deepak Kumar” by presenting him with a bouquet. The link to the post, its archived version, and a screenshot are provided below.

Fact Check:
Upon closely examining the viral image, certain inconsistencies raised suspicion that it could be AI-generated. To verify its authenticity, the image was analysed using the AI detection tool Hive Moderation, which indicated a 96 percent probability that the image was AI-generated.

In the next stage of the research, the image was also analysed using another AI detection tool, WasitAI, which likewise identified the image as AI-generated.

Conclusion
The research establishes that users are circulating an AI-generated image with a misleading claim linking it to the Kotdwar controversy.

Executive Summary
A picture circulating on social media allegedly shows Reliance Industries chairman Mukesh Ambani and Nita Ambani presenting a luxury car to India’s T20 team captain Suryakumar Yadav. The image is being widely shared with the claim that the Ambani family gifted the cricketer a luxury car in recognition of his outstanding performance. However, research conducted by CyberPeace found the viral claim to be false. The research revealed that the image being circulated online is not authentic but was generated using artificial intelligence (AI).
Claim
On February 8, 2025, a Facebook user shared the viral image claiming that Mukesh Ambani and Nita Ambani gifted a luxury car to Suryakumar Yadav following his brilliant innings. The post has been widely circulated across social media platforms. In another instance, a user shared a collage in which one image shows Suryakumar Yadav receiving an award, while another depicts him with Nita Ambani, further amplifying the claim.
- https://www.facebook.com/61559815349585/posts/122207061746327178/?rdid=0MukeT6c7WK1uB8m#
- https://archive.ph/wip/UH9Xh

Fact Check:
Upon closely examining the viral image, certain visual inconsistencies raised suspicion that it might be AI-generated. To verify its authenticity, the image was analysed using the AI detection tool Hive Moderation, which indicated a 99 percent probability that the image was AI-generated.

In the next step of the research, the image was also analysed using another AI detection tool, Sightengine, which found a 98 percent likelihood that the image was created using artificial intelligence.
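Sightengine likewise offers an API for this kind of check. The following sketch assumes its check.json endpoint and a "genai" model for AI-generated image detection, as described in its public documentation; the endpoint, model name, response fields, and credentials are all assumptions to verify against the current API reference before relying on them.

```python
# Hedged sketch: query Sightengine's AI-generated-image detector.
# Endpoint, model name, and response fields are assumptions based on
# Sightengine's public docs and may have changed.
import requests

def check_with_sightengine(path: str) -> float:
    with open(path, "rb") as f:
        resp = requests.post(
            "https://api.sightengine.com/1.0/check.json",
            files={"media": f},
            data={
                "models": "genai",
                "api_user": "YOUR_API_USER",      # placeholder credentials
                "api_secret": "YOUR_API_SECRET",
            },
            timeout=60,
        )
    resp.raise_for_status()
    output = resp.json()
    # Expected (per the docs): a 0-1 score for AI-generated likelihood.
    return output.get("type", {}).get("ai_generated", 0.0)

print(check_with_sightengine("viral_ambani_car.jpg"))  # hypothetical file
```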

Conclusion
The research clearly establishes that the viral image claiming Mukesh Ambani and Nita Ambani gifted a luxury car to Suryakumar Yadav is misleading. The picture is not real and has been generated using AI.

Executive Summary
A shocking video claiming to show snakes raining down from the sky is going viral on social media. The clip shows what appear to be cobras and pythons falling in large numbers instead of rain, while people are seen running in panic through a marketplace. The video is being shared with the claim that it is the result of “tampering with nature” and that sudden snake rainfall occurred in an unidentified country. (Links and archived versions provided)

CyberPeace researched the viral claim and found it to be false. The video does not depict a real incident. Instead, it has been generated using artificial intelligence (AI).
Fact Check
To verify the authenticity of the video, we extracted keyframes and conducted a reverse image search using Google Lens. However, we did not find any credible media report linked to the viral footage. We also searched relevant keywords on Google but found no reliable national or international news coverage supporting the claim. If snakes had genuinely rained from the sky in any country, the incident would have received widespread media attention globally. A frame-by-frame analysis of the video revealed multiple inconsistencies and visual anomalies:
- In the first two seconds, a massive snake appears to fall onto electric wires, yet its body passes unrealistically through the wires, which is physically impossible.
- The snakes falling from the sky and crawling on the ground move in an unnatural manner; instead of falling under gravity, they appear to float mid-air.
- Around the 9–10 second mark, a person lying on the ground has a visibly distorted hand structure, a common artifact seen in AI-generated videos.
Such irregularities are typical indicators of AI-generated content. The viral video was further analyzed using the AI detection tool Hive Moderation, which indicated a 96.5% probability that the video was AI-generated.

Additionally, image detection tool WasitAI also classified the visuals in the viral clip as highly likely to be AI-generated.

Conclusion
CyberPeace’s research confirms that the viral video claiming to show snakes raining from the sky is not authentic. The footage has been created using artificial intelligence and does not depict a real event.

Executive Summary:
A video of Prime Minister Narendra Modi is going viral across multiple social media platforms. In the clip, PM Modi is purportedly heard praising Christianity and stating that only Jesus Christ can lead people to heaven. Several users are sharing and commenting on the video, believing it to be genuine. CyberPeace researched the viral claim and found it to be false. The circulating video has been created using artificial intelligence (AI).
Claim:
On January 29, 2026, a Facebook user named ‘Khaju Damor’ posted the viral video of PM Modi. The post gained traction, with many users sharing and commenting on it as if it were authentic. (Links and archived versions provided)

Fact Check:
As part of our research, we first closely examined the viral video. Upon careful observation, several inconsistencies were noticed. The Prime Minister’s facial expressions and hand movements appeared unnatural. The lip-sync and overall visual presentation also raised suspicions about the clip being digitally manipulated. To verify this further, we analyzed the video using the AI detection tool Hive Moderation. The tool’s analysis indicated a 99% probability that the video was AI-generated.

To independently confirm the findings, we also ran the clip through another detection platform, Undetectable.ai. Its analysis likewise indicated a very high likelihood that the video was created using artificial intelligence.

Conclusion:
Our research confirms that the viral video of Prime Minister Narendra Modi praising Christianity and making the alleged statement about heaven is fake. The clip has been generated using AI tools and does not depict a real statement made by the Prime Minister.

Executive Summary
A video is being shared on social media claiming to show an avalanche in Kashmir. The caption of the post alleges that the incident occurred on February 6. Several users sharing the video are also urging people to avoid unnecessary travel to hilly regions. CyberPeace’s research found that the video being shared as footage of a Kashmir avalanche is not real. The research revealed that the viral video is AI-generated.
Claim
The video is circulating widely on social media platforms, particularly Instagram, with users claiming it shows an avalanche in Kashmir on February 6. Similar posts were also found online. (Links and archived versions provided)

Fact Check:
To verify the claim, we searched relevant keywords on Google. During this process, we found a video posted on the official Instagram account of the BBC. The BBC post reported that an avalanche occurred near a resort in Sonamarg, Kashmir, on January 27. However, the BBC post does not contain the viral video that is being shared on social media, indicating that the circulating clip is unrelated to the real incident.

A close examination of the viral video revealed several inconsistencies. For instance, during the alleged avalanche, people present at the site are not seen panicking, running for cover, or moving toward safer locations. Additionally, the movement and flow of the falling snow appear unnatural. Such visual anomalies are commonly observed in videos generated using artificial intelligence. As part of the research, the video was analyzed using the AI detection tool Hive Moderation. The tool indicated a 99.9% probability that the video was AI-generated.

Conclusion
Based on the evidence gathered during our research, it is clear that the video being shared as footage of a Kashmir avalanche is not genuine. The clip is AI-generated and misleading. The viral claim is therefore false.

Executive Summary
A news graphic is being shared on social media claiming that Uttar Pradesh Chief Minister Yogi Adityanath said, “Those who practice casteism and discrimination are the ones opposing UGC. If you do not indulge in caste-based discrimination, what is there to fear?” CyberPeace’s research found the viral claim circulating on social media to be false. Our research revealed that Chief Minister Yogi Adityanath never made such a statement. It was also established that the viral news graphic has been digitally edited.
Claim
On February 8, a user on social media platform X (formerly Twitter) shared a news graphic bearing the logo of Navbharat Times, attributing the above statement to CM Yogi Adityanath. (Links and screenshots provided)

Fact Check:
To verify the authenticity of the claim, we conducted a keyword-based search on Google. However, we did not find any credible or reliable media report supporting the viral statement. We further examined the official social media accounts of Chief Minister Yogi Adityanath, including his Facebook and Instagram handles. Our review found no post, speech, or statement resembling the claim made in the viral graphic.
Continuing the research, we examined the official social media accounts of Navbharat Times. During this process, we found the original graphic published on the Navbharat Times Facebook page on January 26, 2026. The caption of the original graphic read: “On the occasion of Republic Day 2026, Uttar Pradesh Chief Minister Yogi Adityanath said, ‘No one is above the Constitution.’”
This clearly differs from the claim made in the viral graphic, indicating that the latter was altered.

Conclusion
Our research confirms that Uttar Pradesh Chief Minister Yogi Adityanath did not make the statement being attributed to him on social media. The viral news graphic is digitally edited and misleading. The claim, therefore, is false.

Executive Summary
A digitally manipulated image of World Bank President Ajay Banga has been circulating on social media, falsely portraying him as holding a Khalistani flag. The image was shared by a Pakistan-based X (formerly Twitter) user, who also incorrectly identified Banga as the President of the International Monetary Fund (IMF), thereby fuelling misleading speculation that he supports the Khalistani movement against India.
The Claim
On February 5, an X user with the handle @syedAnas0101010 posted an image allegedly showing Ajay Banga holding a Khalistani flag. The user misidentified him as the IMF President and captioned the post, “IMF president sending signals to INDIA.” The post quickly gained traction, amplifying false narratives and political speculation. (Link, archived version, and screenshot provided.)
Fact Check:
To verify the authenticity of the image, the CyberPeace Fact Check Desk conducted detailed research. The image was first subjected to a reverse image search using Google Lens, which led to a Reuters news report published on June 13, 2023. The original photograph, captured by Reuters photojournalist Jonathan Ernst, showed Ajay Banga arriving at the World Bank headquarters in Washington, D.C., on June 2, 2023, marking his first day in office. In the authentic image, Banga is seen holding a coffee cup, not a flag.
Further analysis confirmed that the viral image had been digitally altered to replace the coffee cup with a Khalistani flag, thereby misrepresenting the context and intent of the original photograph. Here is the link to the report, along with a screenshot.

To strengthen the findings, the altered image was also analysed using the Hive Moderation AI detection tool. The tool’s assessment indicated a high likelihood that the image contained AI-generated or manipulated elements, reinforcing the conclusion that the image was not genuine. Below is a screenshot of the result.

Conclusion
The viral image claiming to show World Bank President Ajay Banga holding a Khalistani flag is fake. The photograph was digitally manipulated to spread misinformation and provoke political speculation. In reality, the original Reuters image from June 2023 shows Banga holding a coffee cup during his arrival at the World Bank headquarters. The claim that he supports the Khalistani movement is false and misleading.
Executive Summary:
A video is being shared on social media showing a man running rapidly in a river with water bottles tied to both his feet. Users are circulating the video claiming that the man is attempting to run on water using the support of the bottles. CyberPeace’s research found the viral claim to be false. Our research revealed that the video being shared on social media is not real but has been generated using artificial intelligence (AI).
Claim:
The claim was shared by a Facebook user on February 5, 2026, who wrote that a man was running on water using water bottles tied to his feet, calling it a unique attempt and questioning whether humans can run on water. Links to the post, its archived version, and screenshots are provided below.

Fact Check:
To verify the claim, we searched relevant keywords on Google but did not find any credible media reports supporting the incident. A closer examination of the viral video revealed several visual irregularities, raising suspicion that it may have been AI-generated. The video was then scanned using the AI detection tool Hive Moderation. According to the tool’s results, the video is 99 percent likely to be AI-generated.

Conclusion:
Our research confirms that the viral video does not depict a real incident; it is an AI-generated clip falsely shared as a genuine attempt to run on water.