#FactCheck - Debunking Viral Photo: Tears of Photographer Not Linked to Ram Mandir Opening
Executive Summary:
A viral photo of a photographer breaking down in tears is not connected to the Ram Mandir opening. Social media users are sharing a collage pairing images of the recently consecrated Lord Ram idol at the Ayodhya Ram Mandir with a shot of a photographer supposedly crying at the sight of the deity. A Facebook post sharing this collage says, "Even the cameraman couldn't stop his emotions." The CyberPeace Research team found that the moment was actually captured during the 2019 AFC Asian Cup: during a match between Iraq and Qatar, an Iraqi photographer broke down in tears after Iraq lost and was knocked out of the competition.
Claims:
The photographer in the widely shared images broke down in tears on seeing the idol of Lord Ram during the Ayodhya Ram Mandir's consecration. The collage was also shared by many users on other social media platforms such as X, Reddit, and Facebook. One Facebook user shared the post with the caption, "Even the cameraman couldn't stop his emotions."
Fact Check:
The CyberPeace Research team performed a reverse image search on the photographer's picture. The search led to several memes using the same image, and from there to a Pinterest post captioned, “An Iraqi photographer as his team is knocked out of the Asian Cup of Nations”.
Taking this as a lead, we ran keyword searches to find the original news story behind the image. We found the official Asian Cup X (formerly Twitter) handle, where the image was shared five years earlier, on 24 January 2019. The post reads, “Passionate. Emotional moment for an Iraqi photographer during the Round of 16 clash against ! #AsianCup2019”
This confirmed the origin of the image. Notably, while investigating this claim we also found several other posts using the same photographer's image with different captions, all of them misleading in the same way.
Conclusion:
The viral image of the photographer claimed to be associated with the Ram Mandir opening is misleading. The photograph is five years old and shows an Iraqi photographer crying during the 2019 AFC Asian Cup, not the recent Ram Mandir opening. Netizens are advised not to believe or share such misinformation posts on social media.
- Claim: A person in the widely shared images broke down in tears at seeing the icon of Lord Ram during the Ayodhya Ram Mandir's consecration.
- Claimed on: Facebook, X, Reddit
- Fact Check: Fake
Introduction
Romance scams are on the rise in India. A staggering 66 percent of individuals in India have been ensnared by the siren songs of deceitful online dating schemes. These are not the crude attempts of yesteryear but a new breed of scams, seamlessly weaving the threads of traditional deceit with the sinew of cutting-edge technologies such as generative AI and deepfakes. A report by Tenable highlights this rise: over 69 percent of Indians struggle to distinguish between artificial and authentic human voices, and scammers are using celebrity impersonations and platforms like Facebook to lure victims into a false sense of security.
The Romance Scam
A report by Tenable, the exposure management company, illuminates the disturbing evolution of these romance scams. It reveals a stark reality: AI-generated deepfakes have attained a level of sophistication where an astonishing 69 percent of Indians confess to struggling to discern between artificial and authentic human voices. This technological prowess has armed scammers with the tools to craft increasingly convincing personas, enabling them to perpetrate their nefarious acts with alarming success.
In 2023 alone, 43 percent of Indians reported falling victim to AI voice scams, with a staggering 83 percent of those targeted suffering financial loss. The scammers, like puppeteers, manipulate their digital marionettes with a deftness that is both awe-inspiring and horrifying. They have mastered the art of impersonating celebrities and fabricating personas that resonate with their targets, particularly preying on older demographics who may be more susceptible to their charms.
Social media platforms, which were once heralded as the town squares of the 21st century, have unwittingly become fertile grounds for these fraudulent activities. They lure victims into a false sense of security before the scammers orchestrate their deceitful symphonies. Chris Boyd, a staff research engineer at Tenable, issues a stern warning against the lure of private conversations, where the protective layers of security are peeled away, leaving individuals exposed to the machinations of these digital charlatans.
The Vulnerability of Individuals
The report highlights the vulnerability of certain individuals, especially those who are older, widowed, or experiencing memory loss. These individuals are systematically targeted by heartless criminals who exploit their longing for connection and companionship. The importance of scrutinising requests for money from newfound connections is underscored, as is the need for meticulous examination of photographs and videos for any signs of manipulation or deceit.
Increasing awareness and maintaining vigilance are our strongest weapons against these heartless manipulations, safeguarding love seekers from the treacherous web of AI-enhanced deception.
The landscape of love has been irrevocably altered by the prevalence of smartphones and the deep proliferation of mobile internet. Finding love has morphed into a digital odyssey, with more and more Indians turning to dating apps like Tinder, Bumble, and Hinge. Yet, as with all technological advancements, there lurks a shadowy underbelly. The rapid adoption of dating sites has provided potential scammers with a veritable goldmine of opportunity.
It is not uncommon these days to hear tales of individuals who have lost their life savings to a person they met on a dating site or who have been honey-trapped and extorted by scammers on such platforms. A new study, titled 'Modern Love' and published by McAfee ahead of Valentine's Day 2024, reveals that such scams are rampant in India, with 39 percent of users reporting that their conversations with a potential love interest online turned out to be with a scammer.
The study also found that 77 percent of Indians have encountered fake profiles and photos that appear AI-generated on dating websites or apps or on social media, while 26 percent later discovered that they were engaging with AI-generated bots rather than real people. 'The possibilities of AI are endless, and unfortunately, so are the perils,' says Steve Grobman, McAfee’s Chief Technology Officer.
Steps to Safeguard
Scammers have not limited their hunting grounds to dating sites alone. A staggering 91 percent of Indians surveyed for the study reported that they, or someone they know, have been contacted by a stranger through social media or text message and began to 'chat' with them regularly. Cybercriminals exploit the vulnerability of those seeking love, engaging in long and sophisticated attempts to defraud their victims.
McAfee offers some steps to protect oneself from online romance and AI scams:
- Scrutinise any direct messages you receive from a love interest via a dating app or social media.
- Be on the lookout for consistent, AI-generated messages which often lack substance or feel generic.
- Avoid clicking on any links in messages from someone you have not met in person.
- Perform a reverse image search of any profile pictures used by the person.
- Refrain from sending money or gifts to someone you haven’t met in person, even if they send you money first.
- Discuss your new love interest with a trusted friend. It can be easy to overlook red flags when you are hopeful and excited.
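The reverse image search step above is usually done through a search engine, but the underlying idea can be illustrated locally with perceptual hashing, a technique similar tools rely on. The following is a minimal sketch (pure Python, no external libraries, hypothetical pixel data): a coarse "average hash" of a downscaled grayscale image, where small Hamming distances between hashes suggest near-duplicate pictures.

```python
def average_hash(pixels):
    """Coarse perceptual hash of a downscaled grayscale image.

    pixels: 2D list of grayscale values (e.g. an 8x8 thumbnail).
    Returns a bit string with '1' wherever a pixel is brighter
    than the image's mean brightness.
    """
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return "".join("1" if p > mean else "0" for p in flat)

def hamming(h1, h2):
    """Count differing bits; a small distance suggests near-duplicates."""
    return sum(a != b for a, b in zip(h1, h2))

# Toy 2x2 "images" standing in for downscaled profile photos
original = [[0, 0], [255, 255]]
recompressed = [[10, 0], [250, 255]]   # same picture, slightly altered
flipped = [[255, 255], [0, 0]]         # a different picture

print(hamming(average_hash(original), average_hash(recompressed)))  # 0
print(hamming(average_hash(original), average_hash(flipped)))       # 4
```

Real tools (and libraries such as imagehash) use larger thumbnails and more robust hashes, but the principle is the same: a stolen or reused profile photo hashes close to its source even after resizing or recompression.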
Conclusion
The path is fraught with illusions, and only by arming oneself with knowledge and scepticism can one hope to find true connection without falling prey to the mirage of deceit. As we navigate this treacherous terrain, let us remember that the most profound connections are often those that withstand the test of time and the scrutiny of truth.
References
- https://www.businesstoday.in/technology/news/story/valentine-day-alert-deepfakes-genai-amplifying-romance-scams-in-india-warn-researchers-417245-2024-02-13
- https://www.indiatimes.com/amp/news/india/valentines-day-around-40-per-cent-indians-have-been-scammed-while-looking-for-love-online-627324.html
- https://zeenews.india.com/technology/valentine-day-deepfakes-in-romance-scams-generative-ai-in-scams-romance-scams-in-india-online-dating-scams-in-india-ai-voice-scams-in-india-cyber-security-in-india-2720589.html
- https://www.mcafee.com/en-us/consumer-corporate/newsroom/press-releases/2023/20230209.html
Executive Summary:
A recent claim circulating on social media, that a child created sand sculptures of cricket legend Mahendra Singh Dhoni, has been proven false by the CyberPeace Research Team. The team discovered that the images were actually produced using an AI tool. Unusual details, such as extra fingers and unnatural characteristics in the sculptures, suggested artificial creation, and this suspicion was further substantiated by AI detection tools. This incident underscores the need to fact-check information before posting, as misinformation can quickly go viral on social media. Everyone is advised to carefully assess content before sharing it, to stop the spread of false information.
Claims:
The claim is that the photographs published on social media show sand sculptures of cricketer Mahendra Singh Dhoni made by a child.
Fact Check:
Upon receiving the posts, we carefully examined the images. The collage of four pictures has many anomalies that are clear signs of AI-generated imagery.
In the first image, the left hand of the sand sculpture has six fingers, and in the word INDIA the letter 'A' is misaligned, i.e. not on the same line as the other letters. In the second image, one of the boy's fingers is missing, and the sand sculpture has four fingers on its front foot and three legs. In the third image, one of the boy's slippers is only partially visible, and in the fourth image the boy's hand does not look like a hand. These are some of the major discrepancies clearly visible in the images.
We then checked the images with the AI image detection tool Hive, which detected the image as 100.0% AI-generated.
We then checked with another AI image detection tool, Content at Scale's AI image detector, which found the image to be 98% AI-generated.
From this we concluded that the image is AI-generated and has no connection with the claim made in the viral social media posts. We have also previously debunked AI-generated artwork of a sand sculpture of Indian cricketer Virat Kohli, which showed the same types of anomalies as those seen in this case.
Conclusion:
Taking into consideration the distortions spotted in the images and the result of AI detection tools, it can be concluded that the claim of the pictures representing the child's sand sculptures of cricketer Mahendra Singh Dhoni is false. The pictures are created with Artificial Intelligence. It is important to check and authenticate the content before posting it to social media websites.
- Claim: The collage of pictures shared on social media shows a child's sand sculptures of cricket player Mahendra Singh Dhoni.
- Claimed on: X (formerly known as Twitter), Instagram, Facebook, YouTube
- Fact Check: Fake & Misleading
Introduction
You must have heard of several techniques of cybercrime by now, many of which we could never have anticipated. Reports are coming in from different parts of the country where video calls are being used to cheat people. Through video calls, cybercriminals are making individuals victims of fraud: the fraudsters record pornographic footage of the victims using a screen recorder, then blackmail them by threatening to share these videos and demanding money. Cybercriminals are constantly improving their strategies to defraud more people. In this blog post, we will explore the tactics involved in this case, the psychological impact, and ways to combat it. Before we go deeper into the case, let's look at deepfakes, AI, and sextortion, and how fraudsters use technology to commit crimes.
Understanding Deepfake
Deepfake technology is the manipulation or fabrication of multimedia information such as videos, photos, or audio recordings using artificial intelligence (AI) algorithms and deep learning models. These algorithms process massive quantities of data to learn and imitate human-like behaviour, allowing for very realistic synthetic media development.
Individuals with malicious intent may change facial expressions, bodily movements, and even voices in recordings using deepfake technology, basically replacing a person’s appearance with someone else’s. The produced film can be practically indistinguishable from authentic footage, making it difficult for viewers to distinguish between the two.
Sextortion and technology
Sextortion is a sort of internet blackmail in which offenders use graphic or compromising content to compel others into offering money, sexual favours, or other concessions. This information is usually gained by hacking, social engineering, or tricking people into providing sensitive information.
Deepfake technology combined with sextortion techniques has increased the impact on victims. Deepfakes may now be used by perpetrators to make and distribute pornographic or compromising movies or photographs that seem genuine but are completely fake. As the prospect of discovery grows increasingly credible and tougher to rebut, the stakes for victims rise.
How Cyber Crooks Deceive
In the present case, cyber thugs first make video calls to people and capture the footage. They then manipulate the footage and merge it with explicit video, leaving the victim desperate to conceal the matter. Following that, they demand money as a ransom to stop them from releasing the doctored video to the victim's contacts and on social media platforms. In this case, a video has emerged in which a woman supposedly featured in the original footage is depicted taking her own life because of the shame caused by the video's release. These additional threats are intended purely to inflict psychological pressure and coercion on the victims.
Sextortionists have reached a new low by profiting from the misfortunes of others, notably targeting deceased victims. The offenders want to maximise emotional pain and persuade the victim into acquiescence by generating deep fake films depicting these persons. They use the inherent compassion and emotion connected with tragedy to exact bigger ransoms from their victims.
This distressing exploitation not only adds urgency to the extortion demands but also preys on the victim's sensitivity and emotional instability. The criminals even pressure the victim through impersonation, warning that the victim could end up in jail.
Tactics used
The morphed death videos are precisely constructed to heighten emotional discomfort and instil terror in the targeted individual. By editing photographs or videos of the deceased, the offenders create unsettling circumstances that heighten the victim’s emotional response.
The psychological manipulation seeks to instil guilt, regret, and a sense of responsibility in the victim. The notion that they are somehow linked to the catastrophe increases their emotional weakness, making them more vulnerable to the demands of sextortionists. The offenders take use of these emotions, coercing victims into cooperation out of fear of being involved in the apparent tragedy.
The impact on the victim’s mental well-being cannot be overstated. Victims may experience intense psychological trauma, including anxiety, depression, and post-traumatic stress disorder (PTSD). The guilt and shame associated with the false belief of being linked to someone’s death can have long-lasting effects on their emotional health and overall quality of life; others may develop lasting trust issues.
Law enforcement agencies advised
Law enforcement organisations are concerned about the growing incidence of these illegal acts. The use of deepfake methods and other AI technologies to make convincing morphed videos demonstrates scammers' improved capabilities. These tools can modify digital content so convincingly that the result is hard to tell apart from genuine footage, making it difficult for victims to detect that the video is fake.
Defence strategies to fight back: To combat sextortion, a proactive approach that empowers individuals and utilizes resources is required. This section delves into crucial anti-sextortion techniques such as reporting events, preserving evidence, raising awareness, and implementing digital security measures.
- Report the Incident: Sextortion victims should immediately notify law enforcement. Contact your local police or cybercrime department and supply them with any important information, including specifics of the extortion attempt, communication logs, and any other evidence that can assist in the investigation. Reporting the occurrence is critical for keeping criminals responsible and averting additional harm to others.
- Preserve Evidence: Preserving evidence is critical in creating a solid case against sextortionists. Save and document any types of contact connected to the extortion, including text messages, emails, and social media conversations. Take screenshots, record phone calls (if legal), and save any other digital material or papers that might be used as evidence. This evidence can be useful in investigations and judicial processes.
Digital security: Implementing comprehensive digital security measures can considerably lower the vulnerability to sextortion assaults. Some important measures that one can use:
- Use unique, complicated passwords for all online accounts, and avoid reusing passwords across platforms. Consider utilising password managers to securely store and create strong passwords.
- Enable two-factor authentication (2FA) whenever possible, which adds an extra layer of protection by requiring a second verification step, such as a code delivered to your phone or email, in addition to the password.
- Regular software updates: Keep your operating system, antivirus software, and programmes up to date. Security patches are frequently included in software upgrades to defend against known vulnerabilities.
- Adjust your privacy settings on social networking platforms and other online accounts to limit the availability of personal information and restrict access to your content.
- Be cautious when clicking on links or downloading files from unfamiliar or suspect sources. When exchanging personal information online, only use trusted websites.
Conclusion:
Combating sextortion demands a collaborative effort that combines proactive tactics and resources to confront this damaging practice. Individuals may actively fight back against sextortion by reporting incidences, preserving evidence, raising awareness, and implementing digital security measures. It is critical to empower victims, encourage their rehabilitation, and collaborate to build a safer online environment where sextortionists are held accountable and everyone can navigate the digital environment with confidence.