#FactCheck - "Deepfake Video Falsely Claims of Elon Musk conducting give away for Cryptocurrency”
Executive Summary:
A viral online video claims that Elon Musk, the billionaire founder of Tesla and SpaceX, is promoting a cryptocurrency giveaway. The CyberPeace Research Team has confirmed that the video is a deepfake, created using AI technology to manipulate Mr. Musk's facial expressions and voice; this conclusion was reached with the help of relevant, reputed and well-verified AI detection tools and applications. The original footage has no connection to any cryptocurrency, and no portion of Bitcoin (BTC) or Ethereum (ETH) is being given away to followers of crypto trading. The claim that Mr. Musk endorses such a giveaway is therefore false and misleading.

Claims:
A viral video falsely claims that Elon Musk, billionaire and founder of Tesla, is endorsing a crypto giveaway project for crypto enthusiasts among his followers by giving away a portion of his Bitcoin and Ethereum holdings.


Fact Check:
Upon receiving the viral posts, we conducted a Google Lens search on keyframes of the video. The search led us to various legitimate sources featuring Mr. Elon Musk, but none of them included any promotion of a cryptocurrency giveaway. The viral video exhibited signs of digital manipulation, prompting a deeper investigation.
We used AI detection tools, such as TrueMedia.org, to analyze the video. The analysis confirmed with 99.0% confidence that the video was a deepfake. The tools identified "substantial evidence of manipulation," particularly in the facial movements and voice, which were found to be artificially generated.
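To make the keyframe step of this workflow concrete, here is a minimal sketch of how frames might be sampled from a clip for reverse image searching; it assumes Python with OpenCV installed, and the filename and one-frame-per-second sampling rate are illustrative placeholders rather than details of the actual investigation.

```python
# Minimal keyframe-sampling sketch (assumptions: OpenCV installed;
# "viral_video.mp4" is a placeholder; one frame per second is arbitrary).
import cv2

cap = cv2.VideoCapture("viral_video.mp4")
fps = cap.get(cv2.CAP_PROP_FPS) or 25.0   # fall back if FPS metadata is missing

frame_idx, saved = 0, 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    if frame_idx % int(fps) == 0:         # roughly one keyframe per second
        cv2.imwrite(f"keyframe_{saved:03d}.jpg", frame)
        saved += 1
    frame_idx += 1

cap.release()
print(f"Saved {saved} keyframes for reverse image search.")
```

Each saved frame can then be run through a reverse image search such as Google Lens to locate the original footage.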



Additionally, an extensive review of official statements and interviews with Mr. Musk revealed no mention of any such giveaway. No credible reports were found linking Elon Musk to this promotion, further confirming the video’s inauthenticity.
Conclusion:
The viral video claiming that Elon Musk is promoting a crypto giveaway is a deepfake. Research using tools such as Google Lens and AI detection services confirms that the video was manipulated using AI technology, and no official source mentions any such giveaway. The CyberPeace Research Team therefore concludes that the claim is false and misleading.
- Claim: A video of Elon Musk conducting a cryptocurrency giveaway is viral on social media.
- Claimed on: X (formerly Twitter)
- Fact Check: False & Misleading

Introduction
In the age of advanced technology, cyber threats continue to grow, and so do the hubs from which they operate. A new name has been added to the list: Purnia, a city in India, is now the source of a new and alarming menace, biometric cloning for financial crime. This emerging cyber threat involves replicating an individual's biometric data, such as fingerprints or facial scans, to gain unauthorised access to bank accounts and carry out fraudulent activities. In this blog, we will look at the methods employed, the impact on individuals and institutions, and the steps necessary to mitigate the risk.
The Backdrop
Purnia, a bustling city in the state of Bihar, India, is known for its rich cultural heritage. Underneath this bright appearance, however, lies a hidden danger: a rising cyber threat with the potential to devastate its citizens' financial security. In recent years, Purnia has seen the growth of a dangerous trend, biometric cloning for financial crimes, which came to light after several FIRs were registered with the Kasba and Amaur police stations. The police swung into action and started an investigation.
Modus Operandi Unveiled
The modus operandi of these cybercriminals includes hacking into databases, intercepting data during transactions, or even physically lifting fingerprints or facial images from objects and surfaces. Let us look at how they gathered this data and why Bihar itself was not targeted.
The criminals operated across three states. They exploited open access to registry and agreement paperwork published on official websites; since such records are not available online in Bihar, the scam was carried out in other states instead. The fraudsters downloaded the fingerprints, biometric details, and Aadhaar numbers of buyers and sellers from the property registration documents of Andhra Pradesh, Haryana, and Telangana.
After cloning the fingerprints, the fraudsters withdrew money from various bank accounts through the Aadhaar Enabled Payment System (AEPS). To create the clones, they stamped each fingerprint onto rubber trace paper and used a polymer stamp machine, heated to a specific temperature with a chemical, to produce duplicate fingerprints that were then used for unlawful financial transactions from several customers' bank accounts.
Investigation Insight
Following the breakthrough, police teams recovered a large number of smartphones, ATM cards, rubber fingerprint stamps, Aadhaar numbers, scanners, stamp machines, laptops, and chemicals, and arrested 17 people.
During the investigation, it was found that the cybercriminals employ sophisticated money-laundering techniques to obscure the illicit origins of the stolen funds, transferring the money across multiple accounts or converting it into cryptocurrency. These tactics make it far more challenging for the authorities to trace and recover the money.
Impact of the Biometric Cloning Scam
The biometric cloning scam has far-reaching implications for individuals, institutions, and society at large. Such scams cause financial losses and take an emotional toll, provoking anger, anxiety, and a sense of violation. They also erode trust in digital systems.
Institutions are seriously affected as well. Biometric cloning fraud can cause severe reputational harm to financial institutions and organisations: when clients fall prey to such fraud, faith in the institution's security procedures erodes, potentially leading to customer loss. Institutions may also face legal and regulatory consequences, and they must invest money in investigating the incident, compensating victims, and improving their security systems to prevent similar incidents.
Raising Awareness
Empowering Purnia's residents to protect themselves from biometric fraud starts with awareness. As the city grapples with this growing problem, its inhabitants need the knowledge and techniques to protect their personal information; greater awareness of biometric fraud and the adoption of recommended practices can keep individuals from falling prey to it. Here are some tips that everyone can follow:
- Securing Personal Biometric Data: Safeguarding personal biometric data is crucial. Individuals should secure their fingerprints, face scans, and other biometric information in the same way that they protect their passwords or PINs, and ensure that it is safely maintained and shared only with trustworthy organisations that have strong security procedures in place.
- Verifying Service Providers: Residents should be vigilant while submitting biometric data to service providers, particularly those offering financial services. Before disclosing any sensitive information, it is important to undertake due diligence and establish the validity and reliability of the organisation. Checking for relevant certifications, reading reviews, and seeking recommendations can help people make informed judgments and avoid unscrupulous companies.
- Personal Cybersecurity: Individuals should adopt robust cybersecurity practices to reduce the danger of biometric fraud. This includes using strong and unique passwords, activating two-factor authentication (see the sketch after this list), regularly updating software and applications, and being wary of phishing attempts. Individuals should also refrain from sharing personal information or biometric data over unprotected networks or with untrustworthy sources.
- Educating the Elderly and Vulnerable Groups: Special attention should be given to educating the elderly and other vulnerable groups who may be more prone to scams. Awareness campaigns may be modified to their individual requirements, emphasising the significance of digital identities, recognising possible risks, and seeking help from reliable sources when in doubt. Empowering these populations with knowledge can help keep them safe from biometric fraud.
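As a concrete illustration of the two-factor authentication recommended above, the following is a minimal sketch of time-based one-time passwords (TOTP); it assumes the third-party Python package pyotp is installed, and the account name and issuer shown are hypothetical placeholders.

```python
# Minimal TOTP two-factor authentication sketch (assumptions: the
# third-party `pyotp` package is installed; in practice the secret is
# provisioned once per user and stored securely, never hard-coded).
import pyotp

# Provisioning: generate a secret and share it with the user's
# authenticator app, typically via a QR code of this URI.
secret = pyotp.random_base32()
totp = pyotp.TOTP(secret)
print("Provisioning URI:", totp.provisioning_uri(name="user@example.com",
                                                 issuer_name="DemoBank"))

# Verification: the user submits the 6-digit code shown in their app.
submitted_code = totp.now()        # stand-in for real user input
print("Code accepted?", totp.verify(submitted_code))
```

Because the code changes every 30 seconds, a stolen password alone is not enough to log in.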
Measures to Stay Ahead
As biometric fraud is a growing concern, staying a step ahead is essential. By following these simple steps, individuals can safeguard themselves.
- Multi-Factor Authentication (MFA): MFA is one of the most effective security measures. It adds an extra layer of protection against unauthorised access by combining factors, for example a biometric scan and a password.
- Biometric Encryption: Biometric encryption securely stores and transmits biometric data. Rather than keeping raw biometric data, encryption methods transform it into mathematical templates that cannot be reverse-engineered. These templates are utilised for authentication, guaranteeing that the original biometric information is not compromised even if the encrypted data is.
- AI and Machine Learning (ML): AI and ML technologies are critical in detecting and combating biometric fraud. These systems can analyse massive volumes of data in real time, discover trends, and detect anomalies. By employing AI and ML algorithms, biometric systems can continually adapt and improve their accuracy, boosting their capacity to distinguish between legitimate users and fraudulent attempts; a rough sketch of this idea follows the list.
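The sketch below illustrates the anomaly-detection idea in the last point: a simple model is trained on simulated "normal" authentication events and then flags an outlier. It assumes Python with scikit-learn and NumPy installed; the features (biometric match score, withdrawal amount, hour of day) are invented for illustration and are not drawn from any real AEPS or banking system.

```python
# Rough anomaly-detection sketch (assumptions: scikit-learn and NumPy
# installed; the features -- biometric match score, withdrawal amount,
# hour of day -- are invented for illustration only).
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)

# Simulated "normal" authentication events: [match_score, amount_inr, hour]
normal_events = np.column_stack([
    rng.normal(0.95, 0.02, 500),   # high biometric match scores
    rng.normal(2000, 500, 500),    # typical withdrawal amounts
    rng.normal(14, 3, 500),        # mostly daytime activity
])

model = IsolationForest(contamination=0.01, random_state=0).fit(normal_events)

# A suspicious event: borderline match score, large withdrawal, 3 a.m.
suspicious = np.array([[0.80, 45000, 3]])
print(model.predict(suspicious))   # -1 flags an anomaly, 1 means normal
```

In a real deployment such a model would be only one signal among many, feeding into human review rather than blocking transactions outright.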
Conclusion
Biometric fraud demands immediate attention to protect bank customers from its potential consequences. By creating awareness we can protect ourselves, and by working together we can create a safer digital environment. Biometric verification was introduced to strengthen authentication for bank customers; yet bad actors have already begun to bypass the technology and wreak havoc on netizens by draining their accounts of hard-earned money. Banks and cyber cells nationwide need to work in synergy to raise awareness, strengthen safety mechanisms that prevent such cyber crimes, and create effective and efficient redressal mechanisms for citizens.

"Cybercriminals are unleashing a surprisingly high volume of new threats in this short period of time to take advantage of inadvertent security gaps as organizations are in a rush to ensure business continuity.”
Cyber security firm Fortinet on Monday announced that over the past several weeks, it has been monitoring a significant spike in COVID-19 related threats.
An unprecedented number of unprotected users and devices are now online, with one or two people in every home connecting remotely to work over the internet. Simultaneously, children are at home engaged in remote learning, and entire families are playing multiplayer games, chatting with friends, and streaming music and video. The firm's FortiGuard Labs is observing this perfect storm of opportunity being exploited by cybercriminals, as its Threat Report on the Pandemic highlights:
A Surge in Phishing Attacks: The research shows an average of about 600 new phishing campaigns every day. The content is designed either to prey on the fears and concerns of individuals or to pretend to provide essential information on the current pandemic. The phishing attacks range from scams offering help with depositing stimulus payments or obtaining Covid-19 tests, to promising access to Chloroquine and other medicines or medical devices, to providing helpdesk support for new teleworkers.
Phishing Scams Are Just the Start: While these attacks begin with phishing, their end goal is to steal personal information or even target businesses through their teleworkers. The majority of the phishing attacks carry malicious payloads, including ransomware, viruses, remote access trojans (RATs) designed to give criminals remote access to endpoint systems, and even RDP (remote desktop protocol) exploits.
A Sudden Spike in Viruses: The first quarter of 2020 documented a 17% increase in viruses for January, a 52% increase for February, and an alarming 131% increase for March compared with the same period in 2019. The significant rise in viruses is mainly attributed to malicious phishing attachments. Multiple sites illegally streaming movies still in theatres secretly deliver malware to anyone who logs on. A free game or a free movie, and the attacker is on your network.
Risks for IoT Devices Magnify: With users all connected to the home network, attackers have multiple avenues of attack, targeting devices including computers, tablets, gaming and entertainment systems, and even online IoT devices such as digital cameras and smart appliances, with the ultimate goal of finding a way back into a corporate network and its valuable digital resources.
Ransomware-like Attacks to Disrupt Business: If the device of a remote worker can be compromised, it can become a conduit back into the organization's core network, enabling the spread of malware to other remote workers. The resulting business disruption can be just as effective as ransomware targeting internal network systems in taking a business offline. Since helpdesks are now remote, devices infected with ransomware or a virus can incapacitate workers for days while the devices are mailed in for reimaging.
“Though organizations have completed the initial phase of transitioning their entire workforce to remote telework and employees are becoming increasingly comfortable with their new reality, CISOs continue to face new challenges presented by maintaining a secure teleworker business model. From redefining their security baseline, or supporting technology enablement for remote workers, to developing detailed policies for employees to have access to data, organizations must be nimble and adapt quickly to overcome these new problems that are arising”, said Derek Manky, Chief, Security Insights & Global Threat Alliances at Fortinet – Office of CISO.

Introduction
Assisted Reproductive Technology (“ART”) refers to a diverse set of medical procedures designed to aid individuals or couples in achieving pregnancy when conventional methods are unsuccessful. This umbrella term encompasses various fertility treatments, including in vitro fertilization (IVF), intrauterine insemination (IUI), and gamete and embryo manipulation. ART procedures involve the manipulation of both male and female reproductive components to facilitate conception.
The dynamic landscape of data flows within the healthcare sector, notably in the realm of ART, demands a nuanced understanding of the complex interplay between privacy regulations and medical practices. In this context, the Information Technology (Reasonable Security Practices And Procedures And Sensitive Personal Data Or Information) Rules, 2011, play a pivotal role, designating health information as "sensitive personal data or information" and underscoring the importance of safeguarding individuals' privacy. This sensitivity is particularly pronounced in the ART sector, where an array of personal data, ranging from medical records to genetic information, is collected and processed. The recent Assisted Reproductive Technology (Regulation) Act, 2021, in conjunction with the Digital Personal Data Protection Act, 2023, establishes a framework for the regulation of ART clinics and banks, presenting a layered approach to data protection.
A note on data generated by ART
Data flows in any sector are scarcely uniform and often not easily classified under straitjacket categories. Consequently, mapping and identifying data and its types becomes pivotal. Most data flows in the healthcare sector are highly sensitive and personal in nature and may severely compromise the privacy and safety of an individual if breached. The Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011 ("SPDI Rules") categorize any information pertaining to physical, physiological or mental conditions, or to medical records and history, as "sensitive personal data or information"; this definition is broad enough to encompass any data collected by an ART facility or its equipment. This includes information collected during the screening of patients, data on ovulation and menstrual cycles, follicle and sperm counts, ultrasound results, blood work, and the like. It also includes pre-implantation genetic testing of embryos to detect genetic abnormalities.
But data flows extend beyond mere medical procedures and technology. Health data also covers the medical procedures undertaken, the amounts of medicines and drugs administered during a procedure, their side effects, recovery, and so on. Processing of this information may, in turn, generate further personal data points relating to an individual's political affiliations, race, ethnicity, and genetic data such as biometrics and DNA; it is seen that different ethnicities and races react differently to the same or similar medication and have different propensities to genetic diseases. Further, data is collected not only by professionals but also by intelligent equipment, such as AI systems, that a facility may employ to render its services. Additionally, the dissemination of information under exceptional circumstances (e.g. a medical emergency) also affects how data may be classified. Considerations become even more nuanced when the fundamental right to identity of a child conceived and born via ART conflicts with a donor's fundamental right to privacy and anonymity.
Intersection of Privacy laws and ART laws:
In India, ART technology is regulated by the Assisted Reproductive Technology (Regulation) Act, 2021 (“ART Act”). With this, the Union aims to regulate and supervise assisted reproductive technology clinics and ART banks, prevent misuse and ensure safe and ethical practice of assisted reproductive technology services. When read with the Digital Personal Data Protection Act, 2023 (“DPDP Act”) and other ancillary guidelines, the two legislations provide some framework regulations for the digital privacy of health-based apps.
The ART Act establishes a National Assisted Reproductive Technology and Surrogacy Registry ("National Registry"), which acts as a central database for all clinics and banks and the nature of their services. The Act also establishes a National Assisted Reproductive Technology and Surrogacy Board ("National Board") under the Surrogacy Act to monitor the implementation of the Act and advise the central government on policy matters. The National Board also supervises the functioning of the National Registry, liaises with State Boards, and curates a code of conduct for professionals working in ART clinics and banks. Under the DPDP Act, these bodies (i.e. the National Board, State Boards, ART clinics and banks) are most likely classified as data fiduciaries (primarily the clinics and banks), data processors (which may include the National Board and State Boards), or an amalgamation of both (including any appropriate authority established under the ART Act to investigate complaints or to suspend or cancel the registration of clinics), depending on the nature of work undertaken by them. If so classified, the duties and liabilities of data fiduciaries and processors would necessarily apply to these bodies. As a result, all of them would have to adopt Privacy Enhancing Technologies (PETs) and other organisational measures to ensure compliance with the privacy laws in place. This is one of the most critical considerations for any ART facility, since any data it collects would be sensitive personal data pertaining to health, regulated by the SPDI Rules, which prescribe how sensitive personal data or information is to be collected, handled, and processed.
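To give a sense of what a basic technical safeguard of this kind might look like, here is a minimal sketch of field-level encryption of a health-record entry as one illustrative Privacy Enhancing Technology; it assumes the Python cryptography package is installed, and key management, access control, and audit logging, all of which a real deployment would require, are deliberately out of scope.

```python
# Minimal field-level encryption sketch as one illustrative Privacy
# Enhancing Technology (assumptions: the `cryptography` package is
# installed; in practice the key lives in a secure key-management system).
from cryptography.fernet import Fernet

key = Fernet.generate_key()            # normally fetched from a key store
fernet = Fernet(key)

# A sensitive health-record field, e.g. a screening result
plaintext = b"follicle count: 12; blood work: normal"
token = fernet.encrypt(plaintext)      # ciphertext is safe to persist

print(fernet.decrypt(token).decode())  # readable only with access to the key
```

Even this simple measure ensures that a leaked database copy does not directly expose the underlying health information, provided the key is stored separately.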
The ART Act independently also provides for the duties of ART clinics and banks in the country. ART clinics and banks are required to inform the commissioning couple or woman of all procedures undertaken and of all costs, risks, advantages, and side effects of their selected procedure. The Act mandates that information collected by such clinics and banks must not be disclosed to anyone except the database established by the National Registry, save in cases of medical emergency or on the order of a court. Data collected by clinics and banks (including details of donor oocytes, sperm, and embryos, used or unused) must be detailed and submitted to the National Registry online. ART banks are also required to collect the personal information of donors, including name, Aadhaar number, address, and other details. By mandating online submission, the ART Act is harmonised with the DPDP Act, which regulates all digital personal data and emphasises free, informed consent.
Conclusion
With the increasing uptake of ART, data privacy becomes a vital consideration for all healthcare facilities and professionals. Safeguard measures are required not only at the corporate level but also at the governmental level. It is to be noted that in the 262nd Session of the Rajya Sabha, the Ministry of Electronics and Information Technology reported 165 data breach incidents involving citizen data from the Central Identities Data Repository between January 2018 and October 2023, despite earlier public denials. This puts into question the safety and integrity of the data that may be submitted to the National Registry database, especially given the type of data (both personal and sensitive information) it aims to collate. At present, the ART Act is well supported by the DPDP Act; however, further judicial and legislative deliberation is required to effectively regulate and balance the interests of all stakeholders.
References
- The Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011
- The Digital Personal Data Protection Act, 2023
- Caring for Intimate Data in Fertility Technologies, https://dl.acm.org/doi/pdf/10.1145/3411764.3445132
- https://www.wolterskluwer.com/en/expert-insights/pharmacogenomics-and-race-can-heritage-affect-drug-disposition