#FactCheck

A video circulating widely on social media claims that Defence Minister Rajnath Singh compared the Rashtriya Swayamsevak Sangh (RSS) with the Afghan Taliban. The clip allegedly shows Singh stating that both organisations share a common ideology and belief system and therefore “must walk together.” However, research by CyberPeace found that the video is digitally manipulated and that the audio attributed to Rajnath Singh has been fabricated using artificial intelligence.
Claim
An X user, Aamir Ali Khan (@Aamir_Aali), on January 20 shared a video of Defence Minister Rajnath Singh, claiming that he drew parallels between the Rashtriya Swayamsevak Sangh (RSS) and the Afghan Taliban. The user alleged that Singh stated both organisations follow a similar ideology and belief system and therefore must “walk together.” The post further quoted Singh as allegedly saying: “Indian RSS & Afghan Taliban have one ideology, we have one faith, we have one alliance, our mutual enemy is Pakistan. Israel is a strategic partner of India & Afghan Taliban are Israeli friends. We must join hands to destroy the enemy Pakistan.” Here is the link and archive link to the post, along with a screenshot.

Fact Check:
To verify the claim, CyberPeace conducted a Google Lens search using keyframes extracted from the viral video. This search led to an extended version of the same footage uploaded on the official YouTube channel of Rajnath Singh. The original video was traced back to the inaugural ceremony of the Medium Calibre Ammunition Facility, constructed by Solar Industries in Nagpur. Upon reviewing the complete, unedited speech, we found no instance where Rajnath Singh made any remarks comparing the RSS with the Afghan Taliban or spoke about shared ideology, alliances, or Pakistan in the manner claimed.
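For readers who want to reproduce this kind of step, keyframe sampling can be scripted before running the frames through Google Lens or another reverse image search. The sketch below is illustrative only, not the exact workflow used in this verification: it assumes OpenCV (opencv-python) is installed, and the file name viral_clip.mp4 and the one-frame-per-second sampling rate are placeholders.

```python
# Minimal sketch: sample one frame per second from a video for reverse image search.
# Assumes opencv-python is installed; "viral_clip.mp4" is a placeholder file name.
import cv2

def extract_keyframes(video_path: str, out_prefix: str = "frame", every_n_seconds: float = 1.0) -> int:
    cap = cv2.VideoCapture(video_path)
    if not cap.isOpened():
        raise RuntimeError(f"Could not open {video_path}")
    fps = cap.get(cv2.CAP_PROP_FPS) or 25.0           # fall back to 25 fps if metadata is missing
    step = max(1, int(round(fps * every_n_seconds)))  # number of frames between samples
    saved, index = 0, 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if index % step == 0:
            cv2.imwrite(f"{out_prefix}_{saved:04d}.jpg", frame)
            saved += 1
        index += 1
    cap.release()
    return saved

if __name__ == "__main__":
    print(extract_keyframes("viral_clip.mp4"), "keyframes saved")
```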
In the authentic footage, the Defence Minister spoke about:
" India’s push for Aatmanirbharta (self-reliance) in defence manufacturing
Strengthening domestic ammunition production
Positioning India as a global hub for defence exports "
The statements attributed to him in the viral clip were entirely absent from the original speech.
Here is the link to the original video, along with a screenshot.

In the next stage of the research, the audio track from the viral video was extracted and analysed using the AI voice detection tool Aurigin. The analysis confirmed that the original visuals had been overlaid with a synthetic, AI-generated voice track to create a misleading narrative.
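Isolating the audio track before running it through a voice detection tool is a routine preparatory step; a minimal sketch is shown below. It assumes the ffmpeg binary is installed and on the PATH, and the input and output file names are placeholders rather than details from this verification.

```python
# Minimal sketch: extract a mono 16 kHz WAV track from a video for voice analysis.
# Assumes the ffmpeg binary is installed and available on the PATH.
import subprocess

def extract_audio(video_path: str, wav_path: str = "audio.wav") -> str:
    subprocess.run(
        [
            "ffmpeg", "-y",            # overwrite the output file if it exists
            "-i", video_path,          # input video
            "-vn",                     # drop the video stream
            "-acodec", "pcm_s16le",    # uncompressed 16-bit PCM
            "-ar", "16000",            # 16 kHz sample rate, common for speech tools
            "-ac", "1",                # mono
            wav_path,
        ],
        check=True,
    )
    return wav_path

if __name__ == "__main__":
    print("Saved:", extract_audio("viral_clip.mp4"))
```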

Conclusion
CyberPeace concluded that the viral video claiming Defence Minister Rajnath Singh compared the RSS with the Afghan Taliban is false and misleading. The video has been digitally manipulated, with an AI-generated audio track falsely attributed to Singh. The Defence Minister made no such remarks during the Nagpur event, and the claim circulating online is fabricated.

A video of Bollywood actor Salman Khan is being widely circulated on social media, in which he can allegedly be heard saying that he will soon join Asaduddin Owaisi’s party, the All India Majlis-e-Ittehadul Muslimeen (AIMIM). Along with the video, a purported image of Salman Khan with Asaduddin Owaisi is also being shared. Social media users are claiming that Salman Khan is set to join the AIMIM party.
CyberPeace’s research found the viral claim to be false: Salman Khan has not made any such statement, and both the viral video and the accompanying image are AI-generated.
Claim
Social media users claim that Salman Khan has announced his decision to join AIMIM. On 19 January 2026, a Facebook user shared the viral video with the caption, “What did Salman say about Owaisi?” In the video, Salman Khan can allegedly be heard saying that he is going to join Owaisi’s party. (The link to the post, its archived version, and screenshots are available.)

Fact Check:
To verify the claim, we first searched Google using relevant keywords. However, no credible or reliable media reports were found supporting the claim that Salman Khan is joining AIMIM.

In the next step of verification, we extracted key frames from the viral video and conducted a reverse image search using Google Lens. This led us to a video posted on Salman Khan’s official Instagram account on 21 April 2023. In the original video, Salman Khan is seen talking about an event scheduled to take place in Dubai. A careful review of the full video confirmed that no statement related to AIMIM or Asaduddin Owaisi was made.

Further analysis of the viral clip revealed that Salman Khan’s voice sounds unnatural and robotic. To verify this, we scanned the video using Aurigin AI, an AI-generated content detection tool. According to the tool’s analysis, the viral video was generated using artificial intelligence.

Conclusion
Salman Khan has not announced that he is joining the AIMIM party. The viral video and the image circulating on social media are AI-generated and manipulated.

A news graphic bearing the Navbharat Times logo is being widely circulated on social media. The graphic claims that religious preacher Devkinandan Thakur made an extremely offensive and casteist remark targeting the ‘Shudra’ community. Social media users are sharing the graphic and claiming that the statement was actually made by Devkinandan Thakur. Cyber Peace Foundation’s research and verification found the claim to be misleading: the viral news graphic is completely fake, and Devkinandan Thakur did not make any such casteist statement.
Claim
A viral news graphic claims that Devkinandan Thakur made a derogatory and caste-based statement about Shudras. On 17 January 2026, an Instagram user shared the viral graphic with the caption, “This is probably the formula of Ram Rajya.” The text on the graphic reads: “People of Shudra castes reproduce through sexual intercourse, whereas Brahmins give birth to children after marriage through the power of their mantras, without intercourse.” The graphic also carries Devkinandan Thakur’s photograph and identifies him as a ‘Kathavachak’ (religious storyteller).

Fact Check:
To verify the claim, we first searched for relevant keywords on Google. However, no credible or verified media reports were found supporting the claim. In the next stage of verification, we found a post published by NBT Hindi News (Navbharat Times) on X (formerly Twitter) on 17 January 2026, in which the organisation explicitly debunked the viral graphic. Navbharat Times clarified that the graphic circulating online was fake and also shared the original and authentic post related to the news.

Further research led us to Devkinandan Thakur’s official Facebook account, where he posted a clarification on 17 January 2026. In his post, he stated that anti-social elements are creating fake ‘Sanatani’ profiles and spreading false news, misusing the names of reputed media houses and platforms to mislead and divide people. He described the viral content as part of a deliberate conspiracy and fake agenda aimed at weakening unity. He also warned that AI-generated fake videos and fabricated statements are increasingly being used to create confusion, mistrust and division.
Devkinandan Thakur urged people not to believe or share any post, news or video without verification, and advised checking information through official websites, verified social media accounts or trusted sources.

Conclusion
The viral news graphic attributing a casteist statement to Devkinandan Thakur is completely fake. Devkinandan Thakur did not make the alleged remark, and the graphic circulating with the Navbharat Times logo is fabricated.

An image showing a damaged statue of Mahatma Gandhi, broken into two pieces, is being widely shared on social media. The image shows Gandhi’s statue with its head separated from the body, prompting strong reactions online.
Social media users are claiming that the incident occurred in Bangladesh, alleging that Mahatma Gandhi’s statue was deliberately vandalised there. The image is being described as a recent incident and is being circulated across platforms with provocative and inflammatory captions.
Cyber Peace Foundation’s research and verification found that the claim being shared online is misleading. Our research revealed that the viral image is not from Bangladesh; it is actually from Chakulia in the Uttar Dinajpur district of West Bengal, India.
Claim:
Social media users claim that Mahatma Gandhi’s statue was vandalised in Bangladesh, and that the viral image shows a recent incident from the country. One Facebook user shared the video on 19 January 2026, making derogatory remarks and falsely linking the incident to Bangladesh. The post has since been widely shared across social media platforms. (Archived links and screenshots are available.)

Fact Check:
To verify the claim, we conducted a reverse image search using Google Lens on key frames from the viral video. This led us to a report published by ABP Live Bangla on 16 January 2026, which featured the same visuals and placed them in Chakulia, Uttar Dinajpur district, West Bengal. (Link and screenshot available.)

According to ABP Live Bangla, the statue of Mahatma Gandhi was damaged during a protest in Chakulia. The statue’s head was found separated from the body. While a portion of the broken statue remained at the site on Thursday night, it was reported missing by Friday morning. The report further stated that extensive damage was observed at BDO Office No. 2 in Golpokhar. Gandhi’s statue, located at the entrance of the administrative building, was found broken, and ashes were discovered near the premises. Government staff were seen clearing scattered debris from the site.
The incident reportedly occurred during a SIR (Special Intensive Revision) hearing at the BDO office, which was disrupted by the vandalism. In connection with the violence and damage to government property, 21 people have been arrested so far. In the next stage of verification, we found the same footage in a 16 January 2026 report by the local Bengali news channel K TV, which also showed clear visuals of the damaged Mahatma Gandhi statue. (Link and screenshot available.)

Conclusion:
The viral image of Mahatma Gandhi’s broken statue does not depict an incident from Bangladesh. The image is from Chakulia in West Bengal’s Uttar Dinajpur district, where the statue was damaged during a protest.

A photo featuring Bollywood actor Abhishek Bachchan and actress Aishwarya Rai is being widely shared on social media. In the image, the Kedarnath Temple is clearly visible in the background. Users are claiming that the couple recently visited the Kedarnath shrine for darshan.
Cyber Peace Foundation’s research found the viral claim to be false: the image of Abhishek Bachchan and Aishwarya Rai is not real but AI-generated, and it is being misleadingly shared as a genuine photograph.
Claim
On January 14, 2026, a user on X (formerly Twitter) shared the viral image with a caption suggesting that all rumours had ended and that the couple had restarted their life together. The post further claimed that both actors were seen smiling after a long time, implying that the image was taken during their visit to Kedarnath Temple.
The post has since been widely circulated on social media platforms.

Fact Check:
To verify the claim, we first conducted a keyword search on Google related to Abhishek Bachchan, Aishwarya Rai, and a Kedarnath visit. However, we did not find any credible media reports confirming such a visit.
A close examination of the viral image revealed several visual inconsistencies that raised suspicion it had been artificially generated. To confirm this, we scanned the image using the AI detection tool Sightengine, whose analysis assigned an 84 percent probability that the image is AI-generated.
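For illustration, a first-pass check of this kind can be scripted against Sightengine’s image-moderation API. The sketch below is a hedged example, not the exact workflow used here: the endpoint, the genai model name, and the type.ai_generated response field reflect our reading of Sightengine’s public documentation and should be confirmed against the current docs, and the API credentials and file name are placeholders.

```python
# Illustrative sketch: query an AI-image-detection API (modelled on Sightengine's
# documented "genai" check). Endpoint, parameters, and response fields are assumptions
# based on public docs; verify against the provider's current documentation.
import requests

API_USER = "your_api_user"      # placeholder credential
API_SECRET = "your_api_secret"  # placeholder credential

def ai_generated_score(image_path: str) -> float:
    with open(image_path, "rb") as f:
        resp = requests.post(
            "https://api.sightengine.com/1.0/check.json",
            files={"media": f},
            data={"models": "genai", "api_user": API_USER, "api_secret": API_SECRET},
            timeout=30,
        )
    resp.raise_for_status()
    payload = resp.json()
    # Expected shape (per docs): {"type": {"ai_generated": 0.0-1.0}, ...}
    return payload.get("type", {}).get("ai_generated", 0.0)

if __name__ == "__main__":
    score = ai_generated_score("viral_image.jpg")  # placeholder file name
    print(f"AI-generated probability: {score:.0%}")
```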

Additionally, we scanned the same image using another AI detection tool, Hive Moderation. Its results were even stronger, indicating a 99 percent likelihood that the image is AI-generated.

Conclusion
Our research confirms that the viral image showing Abhishek Bachchan and Aishwarya Rai at Kedarnath Temple is not authentic. The picture is AI-generated and is being falsely shared on social media to mislead users.

A video circulating widely on social media shows a child throwing stones at a moving train, while a few other children can also be seen climbing onto the engine. The video is being shared with a communal narrative, with claims that the incident took place in India.
Cyber Peace Foundation’s research found the viral claim to be misleading. Our research revealed that the video is not from India, but from Bangladesh, and is being falsely linked to India on social media.
Claim:
On January 15, 2026, a Facebook user shared the viral video claiming it depicted an incident from India. The post carried a provocative caption stating, “We are not afraid of Pakistan outside our borders. We are afraid of the thousands of mini-Pakistans within India.” The post has been widely circulated, amplifying communal sentiments.

Fact Check:
To verify the authenticity of the video, we conducted a reverse image search using Google Lens by extracting keyframes from the viral clip. During this process, we found the same video uploaded on a Bangladeshi Facebook account named AL Amin Babukhali on December 28, 2025. The caption of the original post mentions Kamalapur, which is a well-known railway station in Bangladesh. This strongly indicates that the incident did not occur in India.

Further analysis of the video shows that the train engine carries the marking “BR”, along with text written in the Bengali language. “BR” stands for Bangladesh Railway, confirming the origin of the train. To corroborate this further, we searched for images of Bangladesh Railway locomotives through an open Google search and found multiple images on Getty Images showing engines with the same design and markings as seen in the viral video. The visual match clearly establishes that the train belongs to Bangladesh Railway.

Conclusion
Our research confirms that the viral video is from Bangladesh, not India. It is being shared on social media with a false and misleading claim to give it a communal angle and link it to India.

A video is being widely shared on social media showing devotees seated in a boat appearing stunned as a massive, multi-hooded snake—resembling the mythical Sheshnag—suddenly emerges from the middle of a water body.
The video captures visible panic and astonishment among the devotees. Social media users are sharing the clip claiming that it is from Vrindavan, with some portraying the sight as a divine or supernatural event. However, research conducted by the Cyber Peace Foundation found the viral claim to be false. Our research revealed that the video is not authentic and has been generated using artificial intelligence (AI).
Claim
On January 17, 2026, a user shared the viral video on Instagram with the caption suggesting that God had appeared again in the age of Kalyug. The post claims that a terrifying video from Vrindavan has surfaced in which devotees sitting in a boat were shocked to see a massive multi-hooded snake emerge from the water. The caption further states that devotees are hailing the creature as an incarnation of Sheshnag or Vasuki Nag, raising religious slogans and questioning whether the sight represents a divine sign. (The link to the post, its archive link, and screenshots are available.)
- https://www.instagram.com/reel/DTngN9FkoX0/?igsh=MTZvdTN1enI2NnFydA%3D%3D
- https://archive.ph/UuAqB
Fact Check:
Upon closely examining the viral video, we suspected that it might be AI-generated. To verify this, the video was scanned using the AI detection tool Sightengine, which indicated a 99 per cent probability that the visuals are AI-generated.

In the next step of the research, the video was analysed using another AI detection tool, Hive Moderation, which assessed the video as 62 per cent likely to be AI-generated.

Conclusion
Our research clearly establishes that the viral video claiming to show a multi-hooded snake in Vrindavan is not real. The clip has been created using artificial intelligence and is being falsely shared on social media with religious and sensational claims.

Assembly elections are due to be held in Assam later this year, with polling likely in April or May. Ahead of the elections, a video claiming to be an Aaj Tak news bulletin is being widely circulated on social media.
In the viral video, Aaj Tak anchor Rajiv Dhoundiyal is allegedly seen stating that a leaked intelligence report has issued a warning for the ruling Bharatiya Janata Party (BJP) in Assam. The clip claims that according to this purported report, the BJP may suffer significant losses in the upcoming Assembly elections. Several social media users sharing the video have also claimed that the alleged intelligence report signals the possible removal of Assam Chief Minister Himanta Biswa Sarma from office.
However, an investigation by the Cyber Peace Foundation found the viral claim to be false. Our probe clearly established that no leaked intelligence report related to the Assam Assembly elections exists.
Further, Aaj Tak has neither published nor broadcast any such report on its official television channel, website, or social media platforms. The investigation also revealed that the viral video itself is not authentic and has been created using deepfake technology.
Claim
On social media platform Facebook, a user shared the viral video claiming that the BJP has been pushed on the back foot following organisational changes in the Congress—appointing Priyanka Gandhi Vadra as chairperson of the election screening committee and Gaurav Gogoi as the Assam Pradesh Congress Committee president. The post further claims that an Intelligence Bureau report predicts that the current Assam government will not return to power.
(Link to the post, archive link, and screenshots are available.)
Fact Check:
To verify the claim, we first searched for reports related to any alleged leaked intelligence assessment concerning the Assam Assembly elections using relevant keywords. However, no credible or reliable reports supporting the claim were found. We then reviewed Aaj Tak’s official website, social media pages, and YouTube channel. Our examination confirmed that no such news bulletin has been published or broadcast by the network on any of its official platforms.
- https://www.facebook.com/aajtak/?locale=hi_IN
- https://www.instagram.com/aajtak/
- https://x.com/aajtak
- https://www.youtube.com/channel/UCt4t-jeY85JegMlZ-E5UWtA
To further verify the authenticity of the video, its audio track was scanned using the deepfake voice detection tool Hive Moderation.
The analysis assessed the voice heard in the video as 99 per cent likely to be AI-generated, indicating that the audio is not genuine and was artificially created.

Additionally, the video was analysed using another AI detection tool, Aurigin AI, which also identified the viral clip as AI-generated.

Conclusion:
The investigation clearly establishes that there is no leaked intelligence report predicting BJP’s defeat in the Assam Assembly elections. Aaj Tak has not published or broadcast any such content on its official platforms. The video circulating on social media is not authentic and has been created using deepfake technology to mislead viewers.

A photograph showing a massive crowd on a road is being widely shared on social media. The image is being circulated with the claim that people in the United States are staging large-scale protests against President Donald Trump.
However, CyberPeace Foundation’s research has found this claim to be misleading. Our fact-check reveals that the viral photograph is nearly eight years old and has been falsely linked to recent political developments.
Claim:
Social media users are sharing a photograph and claiming that it shows people protesting against US President Donald Trump. An X (formerly Twitter) user, Salman Khan Gauri (@khansalman88177), shared the image with the caption: “Today, a massive protest is taking place in America against Donald Trump.”
The post can be viewed here, and its archived version is available here.

Fact Check:
To verify the claim, we conducted a reverse image search of the viral photograph using Google. This led us to a report published by The Mercury News on April 6, 2018.
The report features the same image and states that the photograph was taken on March 24, 2018, during the ‘March for Our Lives’ rally in Washington, DC. The rally was organized to demand stricter gun control laws in the United States. The image shows a large crowd gathered on Pennsylvania Avenue in support of gun reform.
The report further notes that the Associated Press, on March 30, 2018, debunked false claims circulating online which alleged that liberal billionaire George Soros and his organizations had paid protesters $300 each to participate in the rally.

Further research led us to a report published by The Hindu on March 25, 2018, which also carries the same photograph. According to the report, thousands of Americans across the country participated in ‘March for Our Lives’ rallies following a mass shooting at a school in Florida. The protests were led by survivors and victims, demanding stronger gun laws.
The objective of these demonstrations was to break the legislative deadlock that has long hindered efforts to tighten firearm regulations in a country frequently rocked by mass shootings in schools and colleges.

Conclusion
The viral photograph is nearly eight years old and is unrelated to any recent protests against President Donald Trump. The image actually depicts a gun control protest held in 2018 and is being falsely shared with a misleading political claim. By circulating this outdated image with an incorrect context, social media users are spreading misinformation.

Social media users are widely sharing a video claiming to show an aircraft carrier being destroyed after getting trapped in a massive sea storm. In the viral clip, the aircraft carrier can be seen breaking apart amid violent waves, with users describing the visuals as a “wrath of nature.”
However, CyberPeace Foundation’s research has found this claim to be false. Our fact-check confirms that the viral video does not depict a real incident and has instead been created using Artificial Intelligence (AI).
Claim:
An X (formerly Twitter) user shared the viral video with the caption, “Nature’s wrath captured on camera.” The video shows an aircraft carrier appearing to be devastated by a powerful ocean storm. The post can be viewed here, and its archived version is available here.
https://x.com/Maailah1712/status/2011672435255624090

Fact Check:
At first glance, the visuals shown in the viral video appear highly unrealistic and cinematic, raising suspicion about their authenticity. The exaggerated motion of waves, structural damage to the vessel, and overall animation-like quality suggest that the video may have been digitally generated. To verify this, we analyzed the video using AI detection tools.
The analysis conducted by Hive Moderation, a widely used AI content detection platform, indicates that the video is highly likely to be AI-generated. According to Hive’s assessment, there is nearly a 90 percent probability that the visual content in the video was created using AI.
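For readers who want to automate a similar first-pass check, Hive offers an API for its moderation and detection models. The sketch below is a rough, hedged illustration rather than the workflow used in this fact-check: the endpoint URL, the authorisation header format, and the response layout are assumptions based on our reading of Hive’s public documentation and may differ from the current API, and the API key and file name are placeholders.

```python
# Rough, hedged sketch of submitting media to an AI-content-detection API
# (modelled on Hive's synchronous task endpoint). The URL, header format, and
# response layout are assumptions; confirm them against Hive's current docs.
import requests

HIVE_API_KEY = "your_hive_api_key"  # placeholder credential

def submit_for_detection(media_path: str) -> dict:
    with open(media_path, "rb") as f:
        resp = requests.post(
            "https://api.thehive.ai/api/v2/task/sync",            # assumed endpoint
            headers={"Authorization": f"Token {HIVE_API_KEY}"},   # assumed header format
            files={"media": f},
            timeout=120,
        )
    resp.raise_for_status()
    return resp.json()  # inspect the returned classes/scores manually

if __name__ == "__main__":
    result = submit_for_detection("viral_video.mp4")  # placeholder file name
    print(result)
```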

Conclusion
The viral video claiming to show an aircraft carrier being destroyed in a sea storm is not related to any real incident. It is a computer-generated, AI-created video that is being falsely shared online as a real natural disaster. By circulating such fabricated visuals without verification, social media users are contributing to the spread of misinformation.

Executive Summary:
A video circulating on social media claims to show a live elephant falling from a moving truck due to improper transportation, followed by the animal quickly standing up and reacting on a public road. The content may raise concerns related to animal cruelty, public safety, and improper transport practices. A detailed examination using AI content detection tools and visual anomaly analysis indicates that the video is not authentic and is likely AI-generated or digitally manipulated.
Claim:
The viral video (archive link) shows a disturbing scene where a large elephant is allegedly being transported in an open blue truck with barriers for support. As the truck moves along the road, the elephant shifts its weight and the weak side barrier breaks. This causes the elephant to fall onto the road, where it lands heavily on its side. Shortly after, the animal is seen getting back on its feet and reacting in distress, facing the vehicle that is recording the incident. The footage may raise serious concerns about safety, as elephants are normally transported in reinforced containers, and such an incident on a public road could endanger both the animal and people nearby.

Fact Check:
After receiving the video, we closely examined the visuals and noticed some inconsistencies that raised doubts about its authenticity. In particular, the elephant is seen recovering and standing up unnaturally quickly after a severe fall, which does not align with realistic animal behavior or physical response to such impact.
To further verify our observations, the video was analyzed using the Hive Moderation AI detection tool, which indicated that the content is likely AI-generated or digitally manipulated.

Additionally, no credible news reports or official sources were found to corroborate the incident, reinforcing the conclusion that the video is misleading.
Conclusion:
The claim that the video shows a real elephant transport accident is false and misleading. Based on AI detection results, observable visual anomalies, and the absence of credible reporting, the video is highly likely to be AI-generated or digitally manipulated. Viewers are advised to exercise caution and verify such sensational content through trusted and authoritative sources before sharing.
- Claim: The viral video shows an elephant allegedly being transported, where a barrier breaks as it moves, causing the animal to fall onto the road before quickly getting back on its feet.
- Claimed On: X (formerly Twitter)
- Fact Check: False and Misleading

Executive Summary:
A viral video circulating on social media shows a man attempting to fly using a helicopter-like fan attached to his body, followed by a crash onto a parked car. The clip was widely shared with humorous captions suggesting it depicts a real-life incident. Given the unusual nature of the visuals, the video was subjected to technical verification using AI content detection tools. Analysis using an AI detection platform indicates that the video is AI-generated and does not show a genuine real-world event.
Claim:
A viral video (archive link) claims to show a person attempting to fly using a self-made helicopter-fan mechanism, briefly lifting off before crashing onto a car in a public setting. The video shows a man attempting to fly by strapping a helicopter-like rotating fan to himself, essentially trying to imitate a human helicopter using a DIY mechanism. For a brief moment, it appears as if the device might work, but the attempt quickly fails due to a lack of control, engineering support, and safety measures. Within seconds, the man loses balance and crashes down, landing on top of a parked car. The scene highlights a mix of overconfidence, unregulated experimentation, and risk-taking carried out in a public space, with bystanders watching rather than intervening. The clip is shared humorously with the caption “India is not for beginners”.

Fact Check:
To verify the authenticity of the video, it was analyzed using the Hive Moderation AI detection tool, a widely used platform for identifying synthetic and AI-generated media. The tool flagged the video with a high probability of AI generation, indicating that the visuals were not captured from a real physical event. Additional indicators, such as unrealistic motion physics and inconsistent human–object interaction, further support the conclusion that the clip was artificially generated or heavily manipulated using generative AI techniques. No credible news reports or independent eyewitness sources corroborate the occurrence of such an incident.

Conclusion:
The claim that the video shows an individual attempting and failing to fly using a helicopter-like device is false. Technical analysis confirms that the video is AI-generated, and it should be treated as synthetic or fictional content rather than a record of a real-life incident. This case highlights how AI-generated videos, when shared without context, can mislead audiences and be mistaken for real events, reinforcing the need for verification tools and critical evaluation of viral content.
- Claim: A viral video claims to show a person attempting to fly using a self-made helicopter-fan mechanism, briefly lifting off before crashing onto a car in a public setting.
- Claimed On: X (formerly Twitter)
- Fact Check: False and Misleading