Navigating the Path to CyberPeace: Insights and Strategies
Featured #factCheck Blogs

Executive Summary
A video is going viral on social media showing a massive building engulfed in flames and collapsing into debris. It is being widely claimed that Iran launched a powerful attack that destroyed Israel’s army headquarters. However, research by CyberPeace reveals that this claim is misleading. The viral video is AI-generated and has no connection to any real-world event.
Claim
An X (formerly Twitter) user shared the viral video with the caption: “Iran has targeted Israel’s army headquarters. It seems Israel’s dream of becoming ‘Greater Israel’ will remain unfulfilled.”
Post link:
- https://x.com/KAMESHKUMAR96/status/2039009484069368083
Archived version:
- https://archive.ph/HKXkK

Similar videos have also been shared by other users on social media:
Fact Check
To verify the claim, we extracted keyframes from the viral video and conducted a reverse image search. During this process, we found several credible media reports confirming that Iran has carried out drone and missile attacks on Israel and the Gulf regions in recent times. However, none of these reports featured the viral video, indicating that it is not authentic footage.
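The keyframe-extraction step described above can be sketched in code. This is a simplified, hypothetical illustration of scene-change keyframe selection on tiny synthetic grayscale frames; real fact-checking workflows run tools such as ffmpeg or OpenCV on the actual video file, but the underlying idea is the same: keep frames that differ sharply from their predecessor, then reverse-search those stills.

```python
# Sketch (assumption: frames are small grayscale matrices; real pipelines
# decode video with ffmpeg/OpenCV). A frame is treated as a "keyframe" when
# its mean absolute pixel difference from the previous frame exceeds a
# threshold, i.e. a scene change.

def mean_abs_diff(a, b):
    """Mean absolute difference between two equal-sized grayscale frames."""
    total = sum(abs(p - q) for row_a, row_b in zip(a, b)
                for p, q in zip(row_a, row_b))
    return total / (len(a) * len(a[0]))

def select_keyframes(frames, threshold=30.0):
    """Return indices of frames that differ sharply from their predecessor."""
    keyframes = [0]  # always keep the first frame
    for i in range(1, len(frames)):
        if mean_abs_diff(frames[i - 1], frames[i]) > threshold:
            keyframes.append(i)
    return keyframes

# Three synthetic 2x2 frames: a near-duplicate, then an abrupt scene change.
frames = [
    [[10, 10], [10, 10]],
    [[12, 11], [10, 13]],      # small change: not a keyframe
    [[200, 199], [201, 198]],  # abrupt change: keyframe
]
print(select_keyframes(frames))  # [0, 2]
```

Each selected index corresponds to a still image that can then be fed into a reverse image search such as Google Lens.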

- https://www.youtube.com/watch?v=fxDBX90bYng

A closer examination of the video revealed multiple visual inconsistencies commonly associated with AI-generated content. For instance, a building on the left side appears to bend and collapse in a rubber-like manner—something that is physically unrealistic for structures made of concrete and steel. Additionally, the smoke and flames appear unnatural and lack realistic dynamics.
To further verify, we analyzed the video using the AI detection tool Hive Moderation, which classified it as 99.9% AI-generated.

We also tested the video using the Deepfake-o-Meter platform. The AVSRDD (2025) model detected it as 99.5% AI-generated.

Conclusion
Our research clearly establishes that the viral video claiming Iran destroyed Israel’s army headquarters is false and misleading. The footage does not appear in any credible news coverage of recent attacks, which strongly indicates that it is not real. Moreover, multiple AI detection tools consistently classify the video as artificially generated, with extremely high probability scores. Visual anomalies in the clip further support this finding.

Executive Summary:
A video is rapidly circulating on social media following claims that Iran’s national security chief Ali Larijani was killed in an Israeli airstrike. The viral clip is being shared with the assertion that it shows the moment Israel launched a powerful attack on Iran to eliminate Larijani, allegedly shaking the ground due to the intensity of the strike. However, research by CyberPeace has found the claim to be misleading. The viral video is AI-generated and has no connection to real-world events.
Claim:
Social media users have shared the video with alarming captions. One such post by Deepak Sharma reads:
“WAR UPDATE… Iran is in its final phase… Israel is striking selectively… This attack will leave you shocked… Iran’s national security chief Ali Larijani has been killed in this attack… The intensity of the strike shook the Iranian ground.”
Post links:

Similar videos were also shared by other users:
- https://x.com/ibmindia20/status/2038938020154597447
- https://x.com/Saurabh_raj3026/status/2038834832869032026
Fact Check
To verify the claim, we extracted keyframes from the viral video and conducted a reverse image search. During this process, we found the same video on Instagram, uploaded on March 9, 2026, by the account “_iranwar_2026” without any descriptive caption.

According to a BBC report, Ali Larijani died on March 17 in an Israeli strike. This establishes that the viral video predates the reported incident, making the claim factually inconsistent. Further examination of the Instagram account revealed that it frequently shares pro-Iran content, including gaming visuals and AI-generated clips, raising doubts about the authenticity of the video.

To strengthen the verification, we analyzed the viral clip using the AI detection tool “Zhuque AI Detection Assistant.” The results indicated a 91.71% probability that the video is AI-generated, confirming that it is not real footage.

Conclusion
The viral claim linking the video to an Israeli airstrike that allegedly killed Ali Larijani is misleading and factually incorrect. Multiple layers of verification show that the video existed online before the reported incident, ruling out any direct connection. Additionally, AI detection analysis strongly suggests that the video is artificially generated. The source account’s pattern of sharing AI and gaming-related content further weakens the credibility of the claim. There is no verified evidence to support that the viral clip depicts a real attack or any event related to Larijani’s death. Instead, the video appears to be a digitally created visual circulated without context to amplify misinformation.

Executive Summary
A digitally manipulated image of World Bank President Ajay Banga has been circulating on social media, falsely portraying him as holding a Khalistani flag. The image was shared by a Pakistan-based X (formerly Twitter) user, who also incorrectly identified Banga as the President of the International Monetary Fund (IMF), thereby fuelling misleading speculation that he supports the Khalistani movement against India.
The Claim
On February 5, an X user with the handle @syedAnas0101010 posted an image allegedly showing Ajay Banga holding a Khalistani flag. The user misidentified him as the IMF President and captioned the post, “IMF president sending signals to INDIA.” The post quickly gained traction, amplifying false narratives and political speculation. Here is the link and archive link to the post, along with a screenshot:
Fact Check:
To verify the authenticity of the image, the CyberPeace Fact Check Desk conducted detailed research. The image was first subjected to a reverse image search using Google Lens, which led to a Reuters news report published on June 13, 2023. The original photograph, captured by Reuters photojournalist Jonathan Ernst, showed Ajay Banga arriving at the World Bank headquarters in Washington, D.C., on June 2, 2023, marking his first day in office. In the authentic image, Banga is seen holding a coffee cup, not a flag.
Further analysis confirmed that the viral image had been digitally altered to replace the coffee cup with a Khalistani flag, thereby misrepresenting the context and intent of the original photograph. Here is the link to the report, along with a screenshot.

To strengthen the findings, the altered image was also analysed using the Hive Moderation AI detection tool. The tool’s assessment indicated a high likelihood that the image contained AI-generated or manipulated elements, reinforcing the conclusion that the image was not genuine. Below is a screenshot of the result.

Conclusion
The viral image claiming to show World Bank President Ajay Banga holding a Khalistani flag is fake. The photograph was digitally manipulated to spread misinformation and provoke political speculation. In reality, the original Reuters image from June 2023 shows Banga holding a coffee cup during his arrival at the World Bank headquarters. The claim that he supports the Khalistani movement is false and misleading.
Executive Summary:
A video is being shared on social media showing a man running rapidly in a river with water bottles tied to both his feet. Users are circulating the video claiming that the man is attempting to run on water using the support of the bottles. CyberPeace’s research found the viral claim to be false. Our research revealed that the video being shared on social media is not real but has been generated using artificial intelligence (AI).
Claim:
The claim was shared by a Facebook user on February 5, 2026, who wrote that a man was running on water using water bottles tied to his feet, calling it a unique attempt and questioning whether humans can run on water. Links to the post, its archived version, and screenshots are provided below.

Fact Check:
To verify the claim, we searched relevant keywords on Google but did not find any credible media reports supporting the incident. A closer examination of the viral video revealed several visual irregularities, raising suspicion that it may have been AI-generated. The video was then scanned using the AI detection tool Hive Moderation. According to the tool’s results, the video is 99 percent likely to be AI-generated.

Conclusion:
Our research confirms that the viral video does not depict a real incident and has been falsely shared as a genuine attempt to run on water.

Executive Summary
A video showing thick smoke rising from a building and people running in panic is being shared on social media. The video is being circulated with the claim that it shows Iran launching a missile attack on a US airbase. CyberPeace’s research found the claim to be misleading. Our probe revealed that the video is not related to any recent incident. The viral clip is actually from the September 11, 2001 terrorist attacks on the World Trade Center in the United States and is being falsely shared as footage of an alleged Iranian missile strike.
Claim:
An Instagram user shared the video claiming, “Iran has attacked a US airbase in Qatar. Iran has fired six ballistic missiles at the Al Udeid Airbase in Qatar. Al Udeid Airbase is the largest US military base in West Asia.”
Links to the post and its archived version are provided below.

Fact Check:
To verify the claim, we extracted key frames from the viral video and ran a reverse image search using Google Lens. During the search, we found visuals matching the viral clip in a report published by Wion on September 11, 2021. The report, titled “In pics | A look back at the scenes from the 9/11 attacks,” included an image that closely resembled the visuals seen in the viral video. The caption of the image stated that it was a file photo from September 11, 2001, showing pedestrians running as one of the World Trade Center towers collapsed in New York City.

Further research led us to the same footage on the YouTube channel CBS 8 San Diego. At the 01:11 timestamp of the video, visuals matching the viral clip can be clearly seen.

We also found an Al Jazeera report dated June 23, 2025, which confirmed that Iran had attacked US forces stationed at the Al Udeid airbase in Qatar in retaliation for US strikes on Iran’s nuclear facilities. However, the visuals used in the viral video do not correspond to this incident.

Conclusion
The viral video does not show a recent Iranian attack on a US airbase in Qatar. The clip actually dates back to the September 11, 2001 terrorist attacks on the World Trade Center in the United States. Old 9/11 footage has been falsely shared with a misleading claim linking it to Iran’s alleged missile strike on the US.

Executive Summary:
A purported media release allegedly issued in the name of the International Cricket Council (ICC) is being widely circulated on social media. The release claims that the ICC has decided to impose a one-year ban on Pakistan cricket. CyberPeace’s research found this claim to be false. The research revealed that the media release circulating on social media is fake, and no such letter or official statement has been issued by the ICC.
Claim:
On social media platform X (formerly Twitter), a user shared the viral letter on February 3, 2026, claiming that an ICC meeting was held in which board members voted on issues related to Pakistan. The post alleged that 14 out of 16 votes were cast in favour of the BCCI. The user further claimed that Pakistan’s share of ICC revenue would be reduced and that Pakistan might be asked to compensate for losses incurred by the ICC.
The viral letter, written in English, stated that matters related to Pakistan were discussed in an ICC meeting and that a 14–2 majority vote led to the decision to impose a one-year ban on Pakistan cricket. It further claimed that the Pakistan Super League (PSL) would be suspended for one year, Pakistan’s annual revenue share would be reduced from 5.75 percent to 2.25 percent, and Pakistan would not be allowed to host any ICC tournaments until 2040. The letter also claimed that these decisions were taken to safeguard the integrity and spirit of the game. Links to the viral post, archive link, and screenshots can be seen below.

Fact Check:
To verify the viral claim, CyberPeace conducted a Google search using relevant keywords. However, no credible or reliable media reports supporting the claim were found. In the next step of the research, an official press release uploaded on DD Sports’ Facebook page on February 2, 2026, was found. The press release responded to Pakistan’s decision not to play against India in a Group A match. The DD Sports statement said that the Pakistan Cricket Board should consider the long-term and serious implications of such a decision, as it could impact the global cricket ecosystem—of which Pakistan is itself a member and beneficiary.

Notably, the official press release made no mention of any ban on Pakistan cricket, reduction in revenue share, suspension of the PSL, or restrictions on hosting ICC tournaments, contrary to the claims made in the viral letter. Further, the same official statement was found published on the ICC’s website on February 1, 2026. This release also did not mention any decision related to banning Pakistan cricket or barring the country from hosting ICC tournaments for the next 40 years.

Conclusion
CyberPeace concludes that the media release circulating on social media is fake. The ICC has not issued any official letter or statement announcing a one-year ban on Pakistan cricket, revenue cuts, or restrictions on hosting ICC tournaments.

Executive Summary
Mumbai’s Mira–Bhayandar bridge has recently been in the news due to its unusual design. In this context, a photograph is going viral on social media showing a bus seemingly stuck on the bridge. Some users are also sharing the image while claiming that it is from Sonpur subdivision in Bihar. However, research by CyberPeace has found that the viral image is not real. The bridge shown in the image is indeed the Mira–Bhayandar bridge, which is under discussion because its design causes it to suddenly narrow from four lanes to two lanes. That said, the bridge is not yet operational, and the viral image showing a bus stuck on it has been created using Artificial Intelligence (AI).
Claim
An Instagram user shared the viral image on January 29, 2026, with the caption: “Are Indian taxpayers happy to see that this is funded by their money?” The link, archive link, and screenshot of the post can be seen below.

Fact Check:
To verify the claim, we first conducted a Google Lens reverse image search. This led us to a post shared by X (formerly Twitter) user Manoj Arora on January 29. While the bridge structure in that image matches the viral photo, no bus is visible in the original post. This raised suspicion that the viral image had been digitally manipulated.

We then ran the viral image through the AI detection tool Hive Moderation, which flagged it as over 99% likely to be AI-generated.

Conclusion
The CyberPeace research confirms that while the Mira–Bhayandar bridge is real and has been in the news due to its design, the viral image showing a bus stuck on the bridge has been created using AI tools. Therefore, the image circulating on social media is misleading.

Executive Summary
On January 22, an Indian Army vehicle met with an accident in Jammu and Kashmir’s Doda district, resulting in the death of 10 soldiers, while several others were injured. In connection with this tragic incident, a photograph is now going viral on social media. The viral image shows an Army vehicle that appears to have fallen into a deep gorge, with several soldiers visible around the site. Users sharing the image are claiming that it depicts the actual scene of the Doda accident.
However, research by CyberPeace has found that the viral image is not genuine. The photograph has been generated using Artificial Intelligence (AI) and does not represent the real accident. Hence, the viral post is misleading.
Claim
An Instagram user shared the viral image on January 22, 2026, writing: “Deeply saddened by the tragic accident in Doda, Jammu & Kashmir today, in which 10 brave soldiers lost their lives. My heartfelt tribute to the martyrs who laid down their lives in the line of duty. Sincere condolences to the bereaved families, and prayers for the speedy recovery of the injured soldiers. The nation will forever remember your sacrifice.”
The link and screenshot of the post can be seen below.
- https://www.instagram.com/p/DT0UBIRk_3k/
- https://archive.ph/submit/?url=https%3A%2F%2Fwww.instagram.com%2Fp%2FDT0UBIRk_3k%2F+
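The archived-version link above is a percent-encoded archive.ph submission URL wrapping the Instagram post link. As a small illustration (a sketch, assuming the `/submit/?url=` form visible in the link above), such a URL can be built with Python's standard library:

```python
from urllib.parse import quote

# Sketch: percent-encode a post URL into an archive.ph submission link
# (assumes the /submit/?url= endpoint shown in the link above).
post_url = "https://www.instagram.com/p/DT0UBIRk_3k/"
submit_url = "https://archive.ph/submit/?url=" + quote(post_url, safe="")
print(submit_url)
# https://archive.ph/submit/?url=https%3A%2F%2Fwww.instagram.com%2Fp%2FDT0UBIRk_3k%2F
```

Submitting through this form asks the archiving service to capture a snapshot of the post before it can be deleted.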

Fact Check:
To verify the claim, we first closely examined the viral image. Several visual inconsistencies were observed. The structure of the soldier visible inside the damaged vehicle appears distorted, and the hands and limbs of people involved in the rescue operation look unnatural. These anomalies raised suspicion that the image might be AI-generated. Based on this, we ran the image through the AI detection tool Hive Moderation, which indicated that the image is over 99.9% likely to be AI-generated.

Another AI image detection tool, Sightengine, also flagged the image as 99% AI-generated.

During further research, we found a report published by Navbharat Times on January 22, 2026, which confirmed that an Indian Army vehicle had indeed fallen into a deep gorge in Doda district. According to officials, 10 soldiers were killed and 7 others were injured, and rescue operations were immediately launched.
However, it is important to note that the image circulating on social media is not an actual photograph from the incident.

Conclusion
CyberPeace research confirms that the viral image linked to the Doda Army vehicle accident has been created using Artificial Intelligence. It is not a real photograph from the incident, and therefore, the viral post is misleading.

Executive Summary
A video showing a group of Hindu ascetics (sadhus) allegedly performing intense penance while their bodies appear to be covered in ice is being widely shared on social media. Users are circulating the video as real and claiming that it represents an ancient tradition of Sanatan Dharma. CyberPeace research found the viral claim to be false. The research revealed that the video circulating on social media is not real but has been generated using artificial intelligence (AI).
Claim
On social media platform Facebook, a user shared the viral video on January 16, 2026. The video shows several ascetics engaged in penance, with their bodies seemingly covered in ice. Users shared the video while claiming that it depicts an authentic spiritual practice rooted in Sanatan Dharma.
Links to the post, archive link, and screenshots can be seen below.

Fact Check:
To verify the authenticity of the viral claim, CyberPeace searched relevant keywords on Google. However, no credible or reliable media reports supporting the claim were found. A close examination of the viral video raised suspicion that it may have been AI-generated. To verify this, the video was analysed using the AI detection tool Hive Moderation. According to the results, the video was found to be 99 percent AI-generated.

In the next step of the research, the same video was analysed using another AI detection tool, Sightengine. The results again indicated that the video was 99 percent AI-generated.

Conclusion
CyberPeace concludes that the video circulating on social media is not real. The viral video showing ascetics covered in ice was generated using artificial intelligence and does not depict an actual religious or spiritual practice.

Executive Summary:
A video showing poor runway visibility from inside an aircraft cockpit is being widely shared on social media, linking it to an alleged aircraft accident involving Maharashtra Deputy Chief Minister Ajit Pawar in Baramati on January 28, 2025. Users claim that the footage captured the final moments before the crash, suggesting that the runway visibility disappeared just seconds before landing. However, research conducted by CyberPeace found the viral claim to be misleading. The research revealed that the video has no connection to any aircraft accident involving Deputy Chief Minister Ajit Pawar. In reality, the video dates back to 2013 and shows a pilot attempting to land an aircraft amid heavy rain. During the approach, the runway briefly disappears from the pilot’s view, prompting the pilot to abort the landing and execute a go-around. The aircraft later lands safely after weather conditions improve.
Claim
An Instagram user shared the viral video on January 29, 2026, claiming: “Baramati plane crash: video of the aircraft accident surfaces. Runway disappears just three seconds before landing.” (The link to the post, its archived version, and screenshots are provided below.)

Fact Check
To verify the claim, we extracted keyframes from the viral video and conducted a reverse image search using Google Lens. The search led us to the same video uploaded on a YouTube channel named douglesso, which was published on June 12, 2013. (Footage link and screenshot available below.)

Further research led us to a report published by the American media website CNET, which featured the same visual. According to the report, the video shows a Boeing Business Jet attempting to land during heavy rainfall. The aircraft was conducting a CAT I Instrument Landing System (ILS) approach when a sudden downpour drastically reduced visibility at decision height. As the runway briefly disappeared from view, the pilots aborted the landing and carried out a go-around. The aircraft later landed safely once weather conditions improved. (The link to the CNET report and its screenshot are provided below.)
- https://www.cnet.com/culture/this-is-what-happens-when-a-plane-is-landing-and-the-runway-disappears/

Conclusion
Our research confirms that the video circulating on social media is unrelated to any recent aircraft accident involving Maharashtra Deputy Chief Minister Ajit Pawar. The clip is an old video from 2013, which is now being shared with a false and misleading claim.

Executive Summary
A postcard claiming that Uttar Pradesh Deputy Chief Minister Keshav Prasad Maurya commented on the Supreme Court’s stay on the new UGC regulations is being widely shared on social media. The viral postcard suggests that Maurya stated the Modi government would “fight till its last breath” to implement the UGC law and appealed to Dalit, backward and tribal communities to trust the government as their true well-wisher. However, research by CyberPeace has found that the viral postcard is fake. Keshav Prasad Maurya has not made any such statement.
Claim
A Facebook user shared the postcard with the caption: “Now read it yourself. Statement of Deputy CM Keshav Prasad Maurya — the Modi government will fight till its last breath to implement the UGC law. An appeal to Dalit, backward and tribal communities to trust the government, calling it their true well-wisher.”
(Archived version of the post available here.)

Fact Check:
During the research, we did not find any credible news reports mentioning such a statement by Deputy Chief Minister Keshav Prasad Maurya regarding the UGC regulations or the Supreme Court’s order. A closer examination of the viral postcard revealed several inconsistencies. Notably, the text on the postcard lacks proper punctuation, such as commas and full stops, which is unusual for professionally designed news graphics. The postcard carries the logo of Navbharat Times (NBT). However, when compared with genuine NBT postcards, the font style used in the viral image does not match NBT’s official design. We also traced the original NBT postcard that appears to have been edited to create the fake one. In the authentic postcard, shared by NBT on January 20, Keshav Prasad Maurya is quoted as saying: “Where the lotus has bloomed, it will continue to bloom, and where it has not, under the guidance of PM Modi and the leadership of Nitin Nabin, the lotus will bloom.”

The original statement was digitally altered, and a fabricated quote was inserted to create the viral postcard.
Conclusion
CyberPeace research clearly establishes that the viral postcard is fake. The original Navbharat Times postcard has been tampered with, and Keshav Prasad Maurya’s actual statement has been replaced with a fabricated quote, which is now being circulated with a misleading claim.

Executive Summary
An image showing Prime Minister Narendra Modi and Leader of Opposition in the Lok Sabha and Congress MP Rahul Gandhi standing face to face inside Parliament is going viral on social media. Several users are sharing the image claiming that the photograph was taken during the ongoing Budget Session, suggesting a direct face-off between the two leaders inside Parliament. However, research conducted by CyberPeace has found that the viral claim is false. The image in question is not real but has been generated using Artificial Intelligence (AI). The AI-generated image is now being shared on social media with a misleading claim.
Claim
A Facebook user named Madhu Davi shared the viral image on January 30, 2026, with the caption: “If this photo is from today and the Budget Session, it is commendable. RAGA Zindabad.”
(Archived version of the post available here.)
- https://www.facebook.com/photo/?fbid=759145877237871&set=a.110639115421887
- https://perma.cc/N2XD-TZ32?type=image

Fact Check:
To verify the viral claim, we first conducted a keyword search on Google to check whether any credible media outlet had reported such an incident during the Budget Session. However, no news reports supporting the claim were found. We then performed a reverse image search using Google Lens, but this too did not yield any reliable media reports or evidence confirming the authenticity of the image. This raised suspicion that the image might be AI-generated. To further verify, the image was analysed using the AI detection tool Hive Moderation. The tool indicated a probability of over 99 per cent that the image was generated using Artificial Intelligence.

Conclusion
CyberPeace research confirms that the image being circulated with the claim that Prime Minister Narendra Modi and Rahul Gandhi came face to face during the Budget Session is fake. The viral image has been created using AI and is being shared with a false and misleading narrative.

Executive Summary:
A video showing a peacock allegedly trapped in ice has been going viral on social media. In the clip, the peacock appears to be frozen in a snow-covered area. Moments later, a man is seen approaching with a hammer and breaking the ice to rescue the bird. Social media users are sharing the video as a real-life incident, praising the peacock’s resilience and describing the scene as inspiring. However, CyberPeace research found the viral claim to be misleading. Our research revealed that the video was created using Artificial Intelligence (AI) and is being falsely circulated as a real incident.
Claim:
Facebook user ‘Ras Bihari Pathak’ shared the viral video on January 25, 2026, with the caption: “This peacock is not standing on ice, but on courage. It reminds us that no matter how harsh the circumstances are, hope always returns in colours.” The archived version of the post can be accessed here.

Fact Check:
To verify the claim, we first conducted a keyword search on Google to check whether any such real incident involving a peacock trapped in ice had been reported. However, no credible or verified media reports were found. Next, we closely examined the viral video. Upon observation, the peacock’s movements and reactions appeared unnatural and artificial. The motion lacked realistic physical behaviour, raising suspicion that the video might have been digitally generated. To confirm this, we analysed the clip using the AI video detection tool Hive Moderation, which indicated a 99 per cent or higher likelihood that the video was AI-generated.

Conclusion:
CyberPeace research confirms that the viral video showing a peacock allegedly trapped in ice is not real. The clip has been created using Artificial Intelligence and is being shared on social media with a false and misleading claim.

Executive Summary
A video featuring Uttar Pradesh Chief Minister Yogi Adityanath is being widely shared on social media. In the video, Adityanath can be heard saying, “Let me become the Prime Minister, and Pakistan-occupied Kashmir will also become a part of India.” The video also carries an on-screen text that reads “Next PM 2029.” By sharing this clip, social media users are claiming that Yogi Adityanath is set to become India’s Prime Minister in 2029.
However, CyberPeace research found the viral claim to be misleading. Our research revealed that the video circulating online has been edited and is being shared out of context. The original video dates back to May 2024. In the original footage, Yogi Adityanath is not speaking about himself. Instead, he is referring to Prime Minister Narendra Modi.
In the original statement, Adityanath says:
“Let Modi ji become Prime Minister for the third time, and within the next six months, Pakistan-occupied Kashmir will also become a part of India.”
It is evident that the video has been trimmed and misleading text has been added to falsely portray the statement as a declaration about Yogi Adityanath becoming Prime Minister in 2029.
Claim
A YouTube user shared the viral video on January 29, 2026, claiming that Yogi Adityanath said, “Let me become Prime Minister, and Pakistan-occupied Kashmir will be part of India.” The video carries the caption “Next PM 2029,” suggesting that Adityanath is set to become the Prime Minister in 2029.
Links to the post and its archived version are provided below.

Fact Check:
To verify the viral claim, we first conducted a keyword search on Google. During this process, we found a report published by Aaj Tak on May 18, 2024. According to the report, Yogi Adityanath stated that if Narendra Modi becomes Prime Minister for the third time, Pakistan-occupied Kashmir would become part of India within six months.
Report link:

Next, we extracted keyframes from the viral video and ran them through Google Lens. This led us to the official YouTube channel of Yogi Adityanath, where the same video was uploaded on May 18, 2024.
Original video link:

In the original video, Yogi Adityanath clearly makes the statement in reference to Prime Minister Narendra Modi, not himself. Finally, we compared the viral clip with the original footage. The visuals in both videos are identical; however, the viral version has been edited and overlaid with misleading text to change the meaning of the statement.
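A comparison like the one above, checking whether two clips contain the same visuals, can be approximated with perceptual hashing. The sketch below is a simplified, hypothetical illustration on tiny synthetic grayscale frames; production checks would use a perceptual-hashing library on frames decoded from the real videos, but the principle is identical: near-identical frames produce matching hashes even after re-encoding.

```python
# Sketch (assumption: frames are small grayscale matrices). An average hash
# sets a bit per pixel depending on whether it is brighter than the frame's
# mean; the Hamming distance between two hashes measures visual difference.

def average_hash(frame):
    """Bit string: '1' where a pixel is above the frame's mean brightness."""
    pixels = [p for row in frame for p in row]
    mean = sum(pixels) / len(pixels)
    return "".join("1" if p > mean else "0" for p in pixels)

def hamming(h1, h2):
    """Number of differing bits between two equal-length hashes."""
    return sum(a != b for a, b in zip(h1, h2))

original = [[200, 30], [40, 210]]
viral    = [[198, 32], [41, 207]]  # same scene, slight re-encoding noise
d = hamming(average_hash(original), average_hash(viral))
print(d)  # 0: the frames hash identically, i.e. the same shot
```

A distance of zero (or near zero) indicates the viral clip reuses the original footage, which is consistent with the finding that only overlaid text was changed.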
Conclusion
Our research confirms that the viral video is edited and misleading. The original video is from May 2024, in which Yogi Adityanath was speaking about Prime Minister Narendra Modi, not about himself becoming Prime Minister in 2029. The video has been falsely altered and shared with a deceptive claim on social media.