Navigating the Path to CyberPeace: Insights and Strategies
Featured #factCheck Blogs

Executive Summary
A video is going viral on social media showing a massive building engulfed in flames and collapsing into debris. It is being widely claimed that Iran launched a powerful attack that destroyed Israel’s army headquarters. However, research by CyberPeace reveals that this claim is misleading. The viral video is AI-generated and has no connection to any real-world event.
Claim
An X (formerly Twitter) user shared the viral video with the caption: “Iran has targeted Israel’s army headquarters. It seems Israel’s dream of becoming ‘Greater Israel’ will remain unfulfilled.”
Post link:
- https://x.com/KAMESHKUMAR96/status/2039009484069368083
Archived version:
- https://archive.ph/HKXkK

Similar videos have also been shared by other users on social media.
Fact Check
To verify the claim, we extracted keyframes from the viral video and conducted a reverse image search. During this process, we found several credible media reports confirming that Iran has carried out drone and missile attacks on Israel and the Gulf regions in recent times. However, none of these reports featured the viral video, indicating that it is not authentic footage.
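The keyframe-extraction step described above can be sketched programmatically. The snippet below is a minimal illustration using OpenCV, not CyberPeace's actual tooling; the file name `viral_clip.mp4` and the sampling interval are placeholder assumptions. The saved frames can then be uploaded manually to a reverse image search such as Google Lens.

```python
# Minimal sketch: sample frames from a video at a fixed interval so they
# can be submitted to a reverse image search. Requires opencv-python.

def frame_step(fps, every_n_seconds=2):
    # Number of frames to skip between samples; never less than 1.
    return max(1, int(fps * every_n_seconds))

def extract_keyframes(path, every_n_seconds=2, out_prefix="frame"):
    import cv2  # imported inside so the module loads even without OpenCV
    cap = cv2.VideoCapture(path)
    fps = cap.get(cv2.CAP_PROP_FPS) or 25.0  # fall back if metadata is missing
    step = frame_step(fps, every_n_seconds)
    saved, idx = [], 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if idx % step == 0:
            name = f"{out_prefix}_{idx:05d}.jpg"
            cv2.imwrite(name, frame)
            saved.append(name)
        idx += 1
    cap.release()
    return saved

# Usage (placeholder path): extract_keyframes("viral_clip.mp4")
```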

- https://www.youtube.com/watch?v=fxDBX90bYng

A closer examination of the video revealed multiple visual inconsistencies commonly associated with AI-generated content. For instance, a building on the left side appears to bend and collapse in a rubber-like manner—something that is physically unrealistic for structures made of concrete and steel. Additionally, the smoke and flames appear unnatural and lack realistic dynamics.
To further verify, we analyzed the video using the AI detection tool Hive Moderation, which classified it as 99.9% AI-generated.

We also tested the video using the Deepfake-o-Meter platform. The AVSRDD (2025) model detected it as 99.5% AI-generated.

Conclusion
Our research clearly establishes that the viral video claiming Iran destroyed Israel’s army headquarters is false and misleading. The footage does not appear in any credible news coverage of recent attacks, which strongly indicates that it is not real. Moreover, multiple AI detection tools consistently classify the video as artificially generated, with extremely high probability scores. Visual anomalies in the clip further support this finding.

Executive Summary:
A video is rapidly circulating on social media following claims that Iran’s national security chief Ali Larijani was killed in an Israeli airstrike. The viral clip is being shared with the assertion that it shows the moment Israel launched a powerful attack on Iran to eliminate Larijani, allegedly shaking the ground due to the intensity of the strike. However, research by CyberPeace has found the claim to be misleading. The viral video is AI-generated and has no connection to real-world events.
Claim:
Social media users have shared the video with alarming captions. One such post by Deepak Sharma reads:
“WAR UPDATE… Iran is in its final phase… Israel is striking selectively… This attack will leave you shocked… Iran’s national security chief Ali Larijani has been killed in this attack… The intensity of the strike shook the Iranian ground.”
Post links:

Similar videos were also shared by other users:
- https://x.com/ibmindia20/status/2038938020154597447
- https://x.com/Saurabh_raj3026/status/2038834832869032026
Fact Check
To verify the claim, we extracted keyframes from the viral video and conducted a reverse image search. During this process, we found the same video on Instagram, uploaded on March 9, 2026, by the account “_iranwar_2026” without any descriptive caption.

According to a BBC report, Ali Larijani died on March 17 in an Israeli strike. This establishes that the viral video predates the reported incident, making the claim factually inconsistent. Further examination of the Instagram account revealed that it frequently shares pro-Iran content, including gaming visuals and AI-generated clips, raising doubts about the authenticity of the video.

To strengthen the verification, we analyzed the viral clip using the AI detection tool “Zhuque AI Detection Assistant.” The results indicated a 91.71% probability that the video is AI-generated, confirming that it is not real footage.

Conclusion
The viral claim linking the video to an Israeli airstrike that allegedly killed Ali Larijani is misleading and factually incorrect. Multiple layers of verification show that the video existed online before the reported incident, ruling out any direct connection. Additionally, AI detection analysis strongly suggests that the video is artificially generated. The source account’s pattern of sharing AI and gaming-related content further weakens the credibility of the claim. There is no verified evidence to support that the viral clip depicts a real attack or any event related to Larijani’s death. Instead, the video appears to be a digitally created visual circulated without context to amplify misinformation.

A video circulating widely on social media claims that Defence Minister Rajnath Singh compared the Rashtriya Swayamsevak Sangh (RSS) with the Afghan Taliban. The clip allegedly shows Singh stating that both organisations share a common ideology and belief system and therefore “must walk together.” However, research by CyberPeace found that the video is digitally manipulated, and the audio attributed to Rajnath Singh has been fabricated using artificial intelligence.
Claim
An X user, Aamir Ali Khan (@Aamir_Aali), on January 20 shared a video of Defence Minister Rajnath Singh, claiming that he drew parallels between the Rashtriya Swayamsevak Sangh (RSS) and the Afghan Taliban. The user alleged that Singh stated both organisations follow a similar ideology and belief system and therefore must “walk together.” The post further quoted Singh as allegedly saying: “Indian RSS & Afghan Taliban have one ideology, we have one faith, we have one alliance, our mutual enemy is Pakistan. Israel is a strategic partner of India & Afghan Taliban are Israeli friends. We must join hands to destroy the enemy Pakistan.” Here is the link and archive link to the post, along with a screenshot.

Fact Check:
To verify the claim, CyberPeace conducted a Google Lens search using keyframes extracted from the viral video. This search led to an extended version of the same footage uploaded on the official YouTube channel of Rajnath Singh. The original video was traced back to the inaugural ceremony of the Medium Calibre Ammunition Facility, constructed by Solar Industries in Nagpur. Upon reviewing the complete, unedited speech, the desk found no instance where Rajnath Singh made any remarks comparing the RSS with the Afghan Taliban or spoke about shared ideology, alliances, or Pakistan in the manner claimed.
In the authentic footage, the Defence Minister spoke about:
- India’s push for Aatmanirbharta (self-reliance) in defence manufacturing
- Strengthening domestic ammunition production
- Positioning India as a global hub for defence exports
The statements attributed to him in the viral clip were entirely absent from the original speech.
Here is the link to the original video, along with a screenshot.

In the next stage of the research, the audio track from the viral video was extracted and analysed using the AI voice detection tool Aurigin. The tool identified the audio as AI-generated, confirming that the original visuals were misused and overlaid with a synthetic voice track to create a misleading narrative.
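As a rough illustration of the audio-separation step, a clip's soundtrack can be pulled out with ffmpeg before being submitted to a voice-detection tool. This is a sketch under stated assumptions, not the desk's actual workflow: the file names are placeholders, and it assumes the `ffmpeg` binary is installed on the system.

```python
import subprocess

def build_audio_extract_cmd(video_path, wav_path):
    # ffmpeg arguments: -vn drops the video stream; 16 kHz mono WAV is a
    # common input format for voice-analysis services.
    return ["ffmpeg", "-y", "-i", video_path, "-vn",
            "-ac", "1", "-ar", "16000", wav_path]

def extract_audio(video_path, wav_path="viral_audio.wav"):
    # Runs ffmpeg; raises CalledProcessError if extraction fails.
    subprocess.run(build_audio_extract_cmd(video_path, wav_path), check=True)
    return wav_path
```

The resulting WAV file can then be uploaded to whichever voice-detection service is being used.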

Conclusion
CyberPeace concluded that the viral video claiming Defence Minister Rajnath Singh compared the RSS with the Afghan Taliban is false and misleading. The video has been digitally manipulated, with an AI-generated audio track falsely attributed to Singh. The Defence Minister made no such remarks during the Nagpur event, and the claim circulating online is fabricated.

A video of Bollywood actor Salman Khan is being widely circulated on social media, in which he can allegedly be heard saying that he will soon join Asaduddin Owaisi’s party, the All India Majlis-e-Ittehadul Muslimeen (AIMIM). Along with the video, a purported image of Salman Khan with Asaduddin Owaisi is also being shared. Social media users are claiming that Salman Khan is set to join the AIMIM party.
CyberPeace research found the viral claim to be false. Our research revealed that Salman Khan has not made any such statement, and that both the viral video and the accompanying image are AI-generated.
Claim
Social media users claim that Salman Khan has announced his decision to join AIMIM. On 19 January 2026, a Facebook user shared the viral video with the caption, “What did Salman say about Owaisi?” In the video, Salman Khan can allegedly be heard saying that he is going to join Owaisi’s party. (The link to the post, its archived version, and screenshots are available.)

Fact Check:
To verify the claim, we first searched Google using relevant keywords. However, no credible or reliable media reports were found supporting the claim that Salman Khan is joining AIMIM.

In the next step of verification, we extracted key frames from the viral video and conducted a reverse image search using Google Lens. This led us to a video posted on Salman Khan’s official Instagram account on 21 April 2023. In the original video, Salman Khan is seen talking about an event scheduled to take place in Dubai. A careful review of the full video confirmed that no statement related to AIMIM or Asaduddin Owaisi is made.
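Once a candidate original is located this way, individual frames can also be compared programmatically. One simple approach (an illustration, not necessarily the desk's method) is a perceptual "average hash": shrink each frame to an 8×8 grayscale grid and record which pixels sit above the mean; a small Hamming distance between two hashes suggests the same underlying footage. The sketch below works on plain 8×8 pixel grids; real workflows typically use a library such as imagehash for the resizing and hashing.

```python
def average_hash(pixels):
    # pixels: an 8x8 grid of grayscale values (0-255), e.g. a video frame
    # resized to 8x8. Each bit records whether a pixel exceeds the mean.
    avg = sum(sum(row) for row in pixels) / 64
    return "".join("1" if p > avg else "0" for row in pixels for p in row)

def hamming(h1, h2):
    # Number of differing bits; small distances indicate matching frames.
    return sum(a != b for a, b in zip(h1, h2))
```

Identical frames hash to a distance of 0, while unrelated frames typically differ in dozens of bit positions.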

Further analysis of the viral clip revealed that Salman Khan’s voice sounds unnatural and robotic. To verify this, we scanned the video using Aurigin AI, an AI-generated content detection tool. According to the tool’s analysis, the viral video was generated using artificial intelligence.

Conclusion
Salman Khan has not announced that he is joining the AIMIM party. The viral video and the image circulating on social media are AI-generated and manipulated.

A news graphic bearing the Navbharat Times logo is being widely circulated on social media. The graphic claims that religious preacher Devkinandan Thakur made an extremely offensive and casteist remark targeting the ‘Shudra’ community. Social media users are sharing the graphic and claiming that the statement was actually made by Devkinandan Thakur. Cyber Peace Foundation’s research and verification found the claim to be misleading: the viral news graphic is completely fake, and Devkinandan Thakur did not make any such casteist statement.
Claim
A viral news graphic claims that Devkinandan Thakur made a derogatory and caste-based statement about Shudras. On 17 January 2026, an Instagram user shared the viral graphic with the caption, “This is probably the formula of Ram Rajya.” The text on the graphic reads: “People of Shudra castes reproduce through sexual intercourse, whereas Brahmins give birth to children after marriage through the power of their mantras, without intercourse.” The graphic also carries Devkinandan Thakur’s photograph and identifies him as a ‘Kathavachak’ (religious storyteller).

Fact Check:
To verify the claim, we first searched for relevant keywords on Google. However, no credible or verified media reports were found supporting the claim. In the next stage of verification, we found a post published by NBT Hindi News (Navbharat Times) on X (formerly Twitter) on 17 January 2026, in which the organisation explicitly debunked the viral graphic. Navbharat Times clarified that the graphic circulating online was fake and also shared the original and authentic post related to the news.

Further research led us to Devkinandan Thakur’s official Facebook account, where he posted a clarification on 17 January 2026. In his post, he stated that anti-social elements are creating fake ‘Sanatani’ profiles and spreading false news, misusing the names of reputed media houses and platforms to mislead and divide people. He described the viral content as part of a deliberate conspiracy and fake agenda aimed at weakening unity. He also warned that AI-generated fake videos and fabricated statements are increasingly being used to create confusion, mistrust and division.
Devkinandan Thakur urged people not to believe or share any post, news or video without verification, and advised checking information through official websites, verified social media accounts or trusted sources.

Conclusion
The viral news graphic attributing a casteist statement to Devkinandan Thakur is completely fake. Devkinandan Thakur did not make the alleged remark, and the graphic circulating with the Navbharat Times logo is fabricated.

An image showing a damaged statue of Mahatma Gandhi, broken into two pieces, is being widely shared on social media. The image shows Gandhi’s statue with its head separated from the body, prompting strong reactions online.
Social media users are claiming that the incident occurred in Bangladesh, alleging that Mahatma Gandhi’s statue was deliberately vandalised there. The image is being described as a recent incident and is being circulated across platforms with provocative and inflammatory captions.
Cyber Peace Foundation’s research and verification found that the claim being shared online is misleading. Our research revealed that the viral image is not from Bangladesh. The image is actually from Chakulia in Uttar Dinajpur district of West Bengal, India.
Claim:
Social media users claim that Mahatma Gandhi’s statue was vandalised in Bangladesh, and that the viral image shows a recent incident from the country. One Facebook user shared the video on 19 January 2026, making derogatory remarks and falsely linking the incident to Bangladesh. The post has since been widely shared on social media platforms. (Archived links and screenshots are available.)

Fact Check:
To verify the claim, we conducted a reverse image search using Google Lens on key frames from the viral video. This led us to a report published by ABP Live Bangla on 16 January 2026, which featured the same visuals. (Link and screenshot available.)

According to ABP Live Bangla, the statue of Mahatma Gandhi was damaged during a protest in Chakulia. The statue’s head was found separated from the body. While a portion of the broken statue remained at the site on Thursday night, it was reported missing by Friday morning. The report further stated that extensive damage was observed at BDO Office No. 2 in Golpokhar. Gandhi’s statue, located at the entrance of the administrative building, was found broken, and ashes were discovered near the premises. Government staff were seen clearing scattered debris from the site.
The incident reportedly occurred during a SIR (Special Intensive Revision) hearing at the BDO office, which was disrupted due to vandalism. In connection with the violence and damage to government property, 21 people have been arrested so far.

In the next stage of verification, we found the same footage in a 16 January 2026 report by the local Bengali news channel K TV, which also showed clear visuals of the damaged Mahatma Gandhi statue. (Link and screenshot available.)

Conclusion:
The viral image of Mahatma Gandhi’s broken statue does not depict an incident from Bangladesh. The image is from Chakulia in West Bengal’s Uttar Dinajpur district, where the statue was damaged during a protest.

A photo featuring Bollywood actor Abhishek Bachchan and actress Aishwarya Rai is being widely shared on social media. In the image, the Kedarnath Temple is clearly visible in the background. Users are claiming that the couple recently visited the Kedarnath shrine for darshan.
Cyber Peace Foundation’s research found the viral claim to be false. Our research revealed that the image of Abhishek Bachchan and Aishwarya Rai is not real, but AI-generated, and is being misleadingly shared as a genuine photograph.
Claim
On January 14, 2026, a user on X (formerly Twitter) shared the viral image with a caption suggesting that all rumours had ended and that the couple had restarted their life together. The post further claimed that both actors were seen smiling after a long time, implying that the image was taken during their visit to Kedarnath Temple.
The post has since been widely circulated on social media platforms.

Fact Check:
To verify the claim, we first conducted a keyword search on Google related to Abhishek Bachchan, Aishwarya Rai, and a Kedarnath visit. However, we did not find any credible media reports confirming such a visit.
On closely examining the viral image, several visual inconsistencies raised suspicion about it being artificially generated. To confirm this, we scanned the image using the AI detection tool Sightengine. According to the tool’s analysis, the image was found to be 84 percent AI-generated.

Additionally, we scanned the same image using another AI detection tool, Hive Moderation. The results showed an even stronger indication, classifying the image as 99 percent AI-generated.

Conclusion
Our research confirms that the viral image showing Abhishek Bachchan and Aishwarya Rai at Kedarnath Temple is not authentic. The picture is AI-generated and is being falsely shared on social media to mislead users.

A video circulating widely on social media shows a child throwing stones at a moving train, while a few other children can also be seen climbing onto the engine. The video is being shared with a communal narrative, with claims that the incident took place in India.
Cyber Peace Foundation’s research found the viral claim to be misleading. Our research revealed that the video is not from India, but from Bangladesh, and is being falsely linked to India on social media.
Claim:
On January 15, 2026, a Facebook user shared the viral video claiming it depicted an incident from India. The post carried a provocative caption stating, “We are not afraid of Pakistan outside our borders. We are afraid of the thousands of mini-Pakistans within India.” The post has been widely circulated, amplifying communal sentiments.

Fact Check:
To verify the authenticity of the video, we conducted a reverse image search using Google Lens by extracting keyframes from the viral clip. During this process, we found the same video uploaded on a Bangladeshi Facebook account named AL Amin Babukhali on December 28, 2025. The caption of the original post mentions Kamalapur, which is a well-known railway station in Bangladesh. This strongly indicates that the incident did not occur in India.

Further analysis of the video shows that the train engine carries the marking “BR”, along with text written in the Bengali language. “BR” stands for Bangladesh Railways, confirming the origin of the train. To corroborate this further, we searched for images related to Bangladesh Railways using Google’s open tools. We found multiple images on Getty Images showing train engines with the same design and markings as seen in the viral video. The visual match clearly establishes that the train belongs to Bangladesh Railways.

Conclusion
Our research confirms that the viral video is from Bangladesh, not India. It is being shared on social media with a false and misleading claim to give it a communal angle and link it to India.

A video is being widely shared on social media showing devotees seated in a boat appearing stunned as a massive, multi-hooded snake—resembling the mythical Sheshnag—suddenly emerges from the middle of a water body.
The video captures visible panic and astonishment among the devotees. Social media users are sharing the clip claiming that it is from Vrindavan, with some portraying the sight as a divine or supernatural event. However, research conducted by the Cyber Peace Foundation found the viral claim to be false. Our research revealed that the video is not authentic and has been generated using artificial intelligence (AI).
Claim
On January 17, 2026, a user shared the viral video on Instagram with the caption suggesting that God had appeared again in the age of Kalyug. The post claims that a terrifying video from Vrindavan has surfaced in which devotees sitting in a boat were shocked to see a massive multi-hooded snake emerge from the water. The caption further states that devotees are hailing the creature as an incarnation of Sheshnag or Vasuki Nag, raising religious slogans and questioning whether the sight represents a divine sign. (The link to the post, its archive link, and screenshots are available.)
- https://www.instagram.com/reel/DTngN9FkoX0/?igsh=MTZvdTN1enI2NnFydA%3D%3D
- https://archive.ph/UuAqB
Fact Check:
Upon closely examining the viral video, we suspected that it might be AI-generated. To verify this, the video was scanned using the AI detection tool Sightengine, which indicated that the visual is 99 per cent AI-generated.

In the next step of the research, the video was analysed using another AI detection tool, Hive Moderation. According to the results obtained, the video was found to be 62 per cent AI-generated.

Conclusion
Our research clearly establishes that the viral video claiming to show a multi-hooded snake in Vrindavan is not real. The clip has been created using artificial intelligence and is being falsely shared on social media with religious and sensational claims.

Assembly elections are due to be held in Assam later this year, with polling likely in April or May. Ahead of the elections, a video claiming to be an Aaj Tak news bulletin is being widely circulated on social media.
In the viral video, Aaj Tak anchor Rajiv Dhoundiyal is allegedly seen stating that a leaked intelligence report has issued a warning for the ruling Bharatiya Janata Party (BJP) in Assam. The clip claims that according to this purported report, the BJP may suffer significant losses in the upcoming Assembly elections. Several social media users sharing the video have also claimed that the alleged intelligence report signals the possible removal of Assam Chief Minister Himanta Biswa Sarma from office.
However, an investigation by the Cyber Peace Foundation found the viral claim to be false. Our probe clearly established that no leaked intelligence report related to the Assam Assembly elections exists.
Further, Aaj Tak has neither published nor broadcast any such report on its official television channel, website, or social media platforms. The investigation also revealed that the viral video itself is not authentic and has been created using deepfake technology.
Claim
On social media platform Facebook, a user shared the viral video claiming that the BJP has been pushed on the back foot following organisational changes in the Congress—appointing Priyanka Gandhi Vadra as chairperson of the election screening committee and Gaurav Gogoi as the Assam Pradesh Congress Committee president. The post further claims that an Intelligence Bureau report predicts that the current Assam government will not return to power.
(Link to the post, archive link, and screenshots are available.)
Fact Check:
To verify the claim, we first searched for reports related to any alleged leaked intelligence assessment concerning the Assam Assembly elections using relevant keywords. However, no credible or reliable reports supporting the claim were found. We then reviewed Aaj Tak’s official website, social media pages, and YouTube channel. Our examination confirmed that no such news bulletin has been published or broadcast by the network on any of its official platforms.
- https://www.facebook.com/aajtak/?locale=hi_IN
- https://www.instagram.com/aajtak/
- https://x.com/aajtak
- https://www.youtube.com/channel/UCt4t-jeY85JegMlZ-E5UWtA
To further verify the authenticity of the video, its audio was scanned using the deepfake voice detection tool Hive Moderation.
The analysis revealed that the voice heard in the video is 99 per cent AI-generated, clearly indicating that the audio is not genuine and has been artificially created using artificial intelligence.

Additionally, the video was analysed using another AI detection tool, Aurigin AI, which also identified the viral clip as AI-generated.

Conclusion:
The investigation clearly establishes that there is no leaked intelligence report predicting BJP’s defeat in the Assam Assembly elections. Aaj Tak has not published or broadcast any such content on its official platforms. The video circulating on social media is not authentic and has been created using deepfake technology to mislead viewers.

A photograph showing a massive crowd on a road is being widely shared on social media. The image is being circulated with the claim that people in the United States are staging large-scale protests against President Donald Trump.
However, CyberPeace Foundation’s research has found this claim to be misleading. Our fact-check reveals that the viral photograph is nearly eight years old and has been falsely linked to recent political developments.
Claim:
Social media users are sharing a photograph and claiming that it shows people protesting against US President Donald Trump. An X (formerly Twitter) user, Salman Khan Gauri (@khansalman88177), shared the image with the caption: “Today, a massive protest is taking place in America against Donald Trump.”
The post can be viewed here, and its archived version is available here.

Fact Check:
To verify the claim, we conducted a reverse image search of the viral photograph using Google. This led us to a report published by The Mercury News on April 6, 2018.
The report features the same image and states that the photograph was taken on March 24, 2018, during the ‘March for Our Lives’ rally in Washington, DC. The rally was organized to demand stricter gun control laws in the United States. The image shows a large crowd gathered on Pennsylvania Avenue in support of gun reform.
The report further notes that the Associated Press, on March 30, 2018, debunked false claims circulating online which alleged that liberal billionaire George Soros and his organizations had paid protesters $300 each to participate in the rally.

Further research led us to a report published by The Hindu on March 25, 2018, which also carries the same photograph. According to the report, thousands of Americans across the country participated in ‘March for Our Lives’ rallies following a mass shooting at a school in Florida. The protests were led by survivors and victims, demanding stronger gun laws.
The objective of these demonstrations was to break the legislative deadlock that has long hindered efforts to tighten firearm regulations in a country frequently rocked by mass shootings in schools and colleges.

Conclusion
The viral photograph is nearly eight years old and is unrelated to any recent protests against President Donald Trump. The image actually depicts a gun control protest held in 2018 and is being falsely shared with a misleading political claim. By circulating this outdated image with an incorrect context, social media users are spreading misinformation.

Social media users are widely sharing a video claiming to show an aircraft carrier being destroyed after getting trapped in a massive sea storm. In the viral clip, the aircraft carrier can be seen breaking apart amid violent waves, with users describing the visuals as a “wrath of nature.”
However, CyberPeace Foundation’s research has found this claim to be false. Our fact-check confirms that the viral video does not depict a real incident and has instead been created using Artificial Intelligence (AI).
Claim:
An X (formerly Twitter) user shared the viral video with the caption, “Nature’s wrath captured on camera.” The video shows an aircraft carrier appearing to be devastated by a powerful ocean storm. The post can be viewed here, and its archived version is available here.
https://x.com/Maailah1712/status/2011672435255624090

Fact Check:
At first glance, the visuals shown in the viral video appear highly unrealistic and cinematic, raising suspicion about their authenticity. The exaggerated motion of waves, structural damage to the vessel, and overall animation-like quality suggest that the video may have been digitally generated. To verify this, we analyzed the video using AI detection tools.
The analysis conducted by Hive Moderation, a widely used AI content detection platform, indicates that the video is highly likely to be AI-generated. According to Hive’s assessment, there is nearly a 90 percent probability that the visual content in the video was created using AI.

Conclusion
The viral video claiming to show an aircraft carrier being destroyed in a sea storm is not related to any real incident. It is a computer-generated, AI-created video that is being falsely shared online as a real natural disaster. By circulating such fabricated visuals without verification, social media users are contributing to the spread of misinformation.

A video is being shared on social media, falsely attributing it to Australian Prime Minister Anthony Albanese. The video claims that following the Bondi Beach attack, he decided to cancel the visas of Pakistani citizens.
An investigation by the Cyber Peace Foundation revealed that the viral video was created using AI. In the original video, Anthony Albanese was answering questions related to the Climate Change Bill during a press conference. It is important to note that in the attack that took place last Sunday (14 December) at Bondi Beach in Sydney, New South Wales, Australia, 15 people were killed. According to Australian police, the attack targeted the Jewish community. New South Wales Police Commissioner Mal Lanyon stated that the two accused involved in the attack were father and son—one aged 50 and the other 24. Media reports identified them as Sajid and Naved Akram.
Claim:
On 14 December 2025, a user on the social media platform X shared a video claiming, “After the attack by a Pakistani Islamic terrorist, the Australian Prime Minister has decided to cancel the visas of all Pakistanis. The whole world is troubled by this community, and in India it is said that Abdul cannot buy a house in a Hindu neighbourhood.”
The link to the related post, its archived version, and screenshots can be seen below:

Investigation:
Upon closely examining the viral video, we suspected it to be AI-generated. Subsequently, we scanned the video using the AI detection tool aurigin.ai. According to the results provided by the tool, the video was found to be AI-generated.

A video clip of journalist Palki Sharma is being widely shared on social media. Along with the video, it is being claimed that during Prime Minister Narendra Modi’s recent Middle East visit, she questioned Jordan’s diplomatic protocol.
In the viral clip, Palki Sharma is allegedly seen asking why Jordan’s King Abdullah II did not come to the airport to receive Prime Minister Modi, and whether this indicated a downgrade in the level of welcome.
However, an investigation by the Cyber Peace Foundation found this claim to be misleading. The probe revealed that while the visuals in the viral video are genuine, the audio has been altered using Artificial Intelligence (AI).
On the social media platform ‘X’, a user named “Ammar Solangi” shared this video on 18 December. The post claimed that the video was related to questions raised about Jordan’s diplomatic protocol during Prime Minister Modi’s visit. According to the post, Palki Sharma questioned why King Abdullah II did not receive Prime Minister Modi at the airport. The archive link of the viral post can be seen here: https://ghostarchive.org/archive/26aK0
Verification
During the investigation, the fact-check desk noticed the ‘Firstpost’ logo in the top-left corner of the viral video. Based on this clue, a customized Google search was conducted, which led to the original news report.
The investigation revealed that the viral video was taken from an episode of journalist Palki Sharma’s show “Vantage with Palki Sharma”, which aired on 17 December.
Analysis of the video showed that the visuals appearing at the 33 minutes 30 seconds timestamp in the original report exactly match those used in the viral clip. However, in the original broadcast, Palki Sharma neither questioned Jordan’s protocol nor made any comment about King Abdullah II not being present at the airport.
In the original video, Palki Sharma says:
“Prime Minister Modi was on a diplomatic tour of Jordan, Ethiopia, and Oman, and in Jordan he was received at the airport by the country’s Prime Minister…” The link to the original report can be seen here: https://www.youtube.com/watch?v=-VYZYe9l6Bs

AI Audio Examination
Further investigation involved separating the audio from the viral video and analyzing it using the AI voice detection tool ‘Resemble AI’. The tool’s results confirmed that fake, AI-generated audio had been added over the real footage in the viral clip to spread a misleading claim. A screenshot of the results from this examination can be seen below.

Conclusion
The video being circulated in the name of journalist Palki Sharma has been tampered with. Her voice has been altered using AI technology, and the claim made regarding the Jordan visit is completely misleading.