#FactCheck - Viral Videos of Mutated Animals Debunked as AI-Generated
Executive Summary:
Several videos claiming to show bizarre, mutated animals with features such as a seal's body and a cow's head have gone viral on social media. Upon thorough investigation, these claims were found to be false. No credible source for such creatures was found, and closer examination revealed anomalies typical of AI-generated content, such as unnatural leg and head movements and spectators' shoes appearing fused together. AI-content detectors confirmed the artificial nature of these videos. Furthermore, digital creators were found posting similar fabricated videos. These viral videos are therefore conclusively identified as AI-generated and not real depictions of mutated animals.

Claims:
Viral videos show sea creatures with the head of a cow and the head of a tiger.



Fact Check:
On receiving several videos of bizarre mutated animals, we searched for credible news coverage of such creatures but found none. We then watched the videos closely and noticed anomalies that are typically seen in AI-manipulated footage.



Taking a cue from this, we ran all the videos through an AI video detection tool named TrueMedia. The tool found the audio of the first video to be AI-generated. We then divided the video into keyframes, and the tool found the extracted frames to be AI-generated as well.
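For readers curious about the keyframe step, the sketch below shows one way a video can be broken into individual frames before they are checked with a detection tool. It is a minimal illustration using the open-source OpenCV library; the file name is made up, and the detector function is a hypothetical placeholder that does not reflect TrueMedia's actual API.

```python
# Minimal sketch: sample frames from a video at a fixed interval so they can be
# submitted to an AI-content detection tool. Requires OpenCV (pip install opencv-python).
# The file name and the detector function below are illustrative placeholders only.

import cv2

def extract_keyframes(video_path: str, every_n_seconds: float = 1.0) -> list:
    """Return one frame (as a NumPy array) for every `every_n_seconds` of video."""
    capture = cv2.VideoCapture(video_path)
    fps = capture.get(cv2.CAP_PROP_FPS) or 30.0   # fall back if FPS metadata is missing
    step = max(1, int(fps * every_n_seconds))     # frames to skip between samples
    frames, index = [], 0
    while True:
        ok, frame = capture.read()
        if not ok:                                # end of video reached
            break
        if index % step == 0:
            frames.append(frame)
        index += 1
    capture.release()
    return frames

def check_frame_with_detector(frame) -> bool:
    """Hypothetical placeholder: send a frame to whichever AI-image detector you
    use and return True if it is flagged as AI-generated."""
    raise NotImplementedError("Connect this to your detection service.")

if __name__ == "__main__":
    # Save sampled frames to disk so they can be uploaded to a detection tool manually.
    for i, frame in enumerate(extract_keyframes("viral_video.mp4", every_n_seconds=2.0)):
        cv2.imwrite(f"keyframe_{i:03d}.jpg", frame)
```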


We investigated the second video in the same way: we analysed the footage, divided it into keyframes, and checked the frames with TrueMedia.

The video was flagged as suspicious, so we analysed its individual frames.

The detection tool found the frames to be AI-generated, so we are certain that the video is AI-manipulated. We then analysed the third and final video, which the detection tool also flagged as suspicious.


The detection tool found the frames of this video to be AI-manipulated as well, from which it is clear that the video is AI-manipulated. Hence, the claim made in all three videos is misleading and fake.
Conclusion:
The viral videos claiming to show mutated animals with features such as a seal's body and a cow's head are AI-generated and not real. A thorough investigation by the CyberPeace Research Team found multiple anomalies typical of AI-generated content, and AI-content detectors confirmed the fabrication. Therefore, the claims made in these videos are false.
- Claim: Viral videos show sea creatures with the head of a cow, the head of a tiger, and the head of a bull.
- Claimed on: YouTube
- Fact Check: Fake & Misleading

Introduction
Misinformation is rampant all over the world and is impacting people at large. In 2023, UNESCO commissioned a survey on the impact of fake news, conducted by IPSOS across 16 countries that are to hold national elections in 2024, together home to 2.5 billion voters. The survey showed how pressing the need for effective regulation has become, finding that 85% of people are apprehensive about the repercussions of online disinformation or misinformation. In light of these worries, UNESCO has introduced a plan to regulate social media platforms, which have become major sources of misinformation and hate speech online. The action plan is supported by the worldwide opinion survey, which highlights the urgent need for strong action. It outlines the fundamental principles that must be respected and the concrete measures to be implemented by all associated stakeholders, i.e., governments, regulators, civil society and the platforms themselves.
The Key Areas in Focus of the Action Plan
The action plan focuses on protecting freedom of expression while also embedding access to information and other human rights in digital platform governance. It works on the basic premise that the impact on human rights becomes the compass for all decision-making, at every stage and by every stakeholder. It calls for groups of independent regulators to work in close coordination as part of a wider network, so that digital companies cannot take advantage of disparities between national regulations, and for content moderation to be feasible and effective at the required scale, in all regions and all languages.
The algorithms of these online platforms, particularly social media platforms, are too often geared towards maximizing engagement rather than the reliability of information. Platforms are required to take more initiative in educating and training users to be critical thinkers rather than passive consumers of content. Regulators and platforms must also be in a position to take strong measures during particularly sensitive periods, ranging from elections to crises, when information overload is at its height.
Key Principles of the Action Plan
- Human Rights Due Diligence: Platforms are required to assess their impact on human rights, including gender and cultural dimensions, and to implement risk mitigation measures. This would ensure that the platforms are responsible for educating users about their rights.
- Adherence to International Human Rights Standards: Platforms must align their design, content moderation, and curation with international human rights standards. This includes ensuring non-discrimination, supporting cultural diversity, and protecting human moderators.
- Transparency and Openness: Platforms are expected to operate transparently, with clear, understandable, and auditable policies. This includes being open about the tools and algorithms used for content moderation and the results they produce.
- User Access to Information: Platforms should provide accessible information that enables users to make informed decisions.
- Accountability: Platforms must be accountable to their stakeholders, including users and the public, ensuring that redressal for content-related decisions is not compromised. This accountability extends to the implementation of their terms of service and content policies.
Enabling Environment for the application of the UNESCO Plan
The UNESCO Action Plan to counter misinformation aims to create an environment where freedom of expression and access to information flourish, while ensuring safety and security for digital platform users and non-users alike. This endeavour calls for collective action: societies as a whole must work together, and all relevant stakeholders, from vulnerable groups to journalists and artists, must be enabled to exercise the right to expression.
Conclusion
The UNESCO Action Plan is a response to the dilemma created by information overload, particularly because the distinction between information and misinformation has become so clouded. The IPSOS survey has revealed the urgency of addressing these challenges for users who fear the repercussions of misinformation.
The UNESCO action plan provides a comprehensive framework that emphasises the protection of human rights, particularly freedom of expression, while also prioritising transparency, accountability and education in the governance of digital platforms. By advocating for independent regulators and encouraging platforms to align with international human rights standards, UNESCO is setting the stage for a more responsible and ethical digital ecosystem.
The recommendations include integrating regulators through collaboration and promoting global cooperation to harmonise regulations, expanding digital literacy campaigns to educate users about misinformation risks and online rights, ensuring inclusive access to diverse content in multiple languages and contexts, and monitoring and refining technological advancements and regulatory strategies as challenges evolve, all with the ultimate aim of promoting a trustworthy online information landscape.
References
- https://www.unesco.org/en/articles/online-disinformation-unesco-unveils-action-plan-regulate-social-media-platforms
- https://www.unesco.org/sites/default/files/medias/fichiers/2023/11/unesco_ipsos_survey.pdf
- https://dig.watch/updates/unesco-sets-out-strategy-to-tackle-misinformation-after-ipsos-survey

Introduction
Summer vacations have always been one of the most anticipated times in a child's life. In earlier times, they looked entirely different. The season was filled with outdoor games, muddy hands, mango-stained mouths, and stories shared with cousins under the stars. Children lived in the moment, playing in parks, riding bicycles, and inventing new adventures without a screen in sight. Today, those same summer days are shaped by glowing devices, virtual games, and hours spent online. While technology brings learning and entertainment, it also invites risks that parents cannot ignore. The Cyber Mom Toolkit is here to help you navigate this shift, offering simple and thoughtful ways to keep your children safe, balanced, and joyful during these screen-filled holidays.
The Hidden Cyber Risks of Summer Break
With increased leisure time and less supervision, children are likely to venture into unfamiliar corners of the internet. I4C reports indicate that child-related cases, such as cyberbullying, sextortion, and exposure to offensive content, surge during school vacations. Gaming applications, social networking applications, and YouTube can serve as entry points for cyber predators and spammers. That's why it is important that parents, particularly mothers, know what digital spaces their children live in and how to intervene appropriately.
Your Action Plan for Being a Cyber Smart Mom
Moms Need to Get Digitally Engaged
You do not need to be a tech expert to become a cyber smart mom. With just a few simple digital skills, you can start protecting your child online with confidence and ease.
1. Know the Platforms Your Children Use
Spend some time investigating apps such as Instagram, Snapchat, Discord, YouTube, or computer games like Roblox and Minecraft. Familiarise yourself with the type of content, chat options, and privacy loopholes they may have.
2. Install Parental Controls
Make use of native features on devices (Android, iOS, Windows) to limit screen time, block mature content, and track downloads. Applications such as Google Family Link and Apple Screen Time enable parents to control apps and web browsing.
3. Develop a Family Cyber Agreement
Establish common rules such as:
- No devices in bedrooms past 9 p.m.
- Add only safe connections on social media.
- Don't open suspicious messages or click on mysterious links.
- Always tell your mom if something makes you feel uncomfortable online.
Talk Openly and Often
Kids tend to hide things online because they don't want to get punished or embarrassed. Building trust works better than monitoring. Here's how:
- Have non-judgmental chats about what they do online.
- Use news reports or real-life cases as conversation starters: "Did you hear about that YouTuber's hacked account?"
- Encourage them to question things if they're confused or frightened.
- Honour their online life as a legitimate part of who they are.
Look for the Signs of Online Trouble
Stay alert to subtle changes in your child’s behavior, as they can be early signs of trouble in their online world.
- Sudden secrecy or aggression when questioned about online activity.
- Overuse of screens, particularly in the evening.
- Deterioration in school work or interest in leisure activities.
- Mood swings, anxiety, or withdrawn behaviour.
If you notice these, speak to your child calmly. You can also report serious matters such as cyberbullying or blackmail on the Cybercrime Helpline 1930 or visit https://cybercrime.gov.in
Support Healthy Digital Behaviours
Teach your kids to be good netizens by leading them to:
- Reflect Before Posting: No address, school name, or family information should ever appear in public posts.
- Set Strong Passwords: Passwords must be long, complicated, and not disclosed to friends, even best friends.
- Enable Privacy Settings: Keep social media accounts private. Disable location sharing. Restrict comments and messages from others.
- Vigilance: Encourage them to spot fake news, scams, and manipulative ads. Critical thinking is the ultimate defence.
Where to Learn More and Get Support as a Cyber Mom
Cyber moms looking to deepen their understanding of online safety can explore a range of helpful resources offered by CyberPeace. Our blog features easy-to-understand articles on current cyber threats, safety tips, and parenting guidance for the digital age. You can also follow our social media pages for regular updates, quick tips, and awareness campaigns designed especially for families. If you ever feel concerned or need help, the CyberPeace Helpline is available to offer support and guidance. (+91 9570000066 or write to us at helpline@cyberpeace.net). For those who want to get more involved, joining the CyberPeace Corps allows you to become part of a larger community working to promote digital safety and cyber awareness across the country.
Empowering Mothers Empowers Society
We at CyberPeace feel that every mother, irrespective of her background and technological expertise, has the potential to be a Cyber Mom. The intention is not to control the child but to mentor towards safer decisions, identify issues early, and prepare them for a lifetime of online responsibility. Mothers are empowered when they know. And children are safe when they are protected.
Conclusion
The web isn't disappearing, and neither are its dangers. But when mothers are digital role models, they can make summer screen time a season of wise decisions. This summer, become a Cyber Mom: someone who learns, leads, and listens. Whether it's installing a parental control app, talking openly about cyberbullying, or just asking your child, "What did you discover online today?", that engagement can make a difference. This summer break, help your child become digitally equipped with the skills and knowledge they need to navigate the online world safely and confidently.
Cyber safety starts at home, and there's no better point of departure than being alongside your child, rather than behind them.
References
- https://cybercrime.gov.in
- https://support.apple.com/en-in/HT208982
- https://beinternetawesome.withgoogle.com
- https://www.cyberpeace.org
- https://ncpcr.gov.in

Introduction
The Indian Cybercrime Coordination Centre (I4C) was established by the Ministry of Home Affairs (MHA) to provide a framework and ecosystem for law enforcement agencies (LEAs) to deal with cybercrime in a coordinated and comprehensive manner. The Ministry of Home Affairs approved the scheme for the establishment of I4C in October 2018, and it was inaugurated by Home Minister Amit Shah in January 2020. I4C is envisaged to act as the nodal point for curbing cybercrime in the country. Recently, on 13th March 2024, the Centre designated I4C as an agency of the Ministry of Home Affairs (MHA) to perform functions under the Information Technology Act, 2000, and to notify instances of unlawful cyber activities.
The gazetted notification dated 13th March 2024 read as follows:
“In exercise of the powers conferred by clause (b) of sub-section (3) of section 79 of the Information Technology Act 2000, Central Government being the appropriate government hereby designate the Indian Cybercrime Coordination Centre (I4C), to be the agency of the Ministry of Home Affairs to perform the functions under clause (b) of sub-section (3) of section 79 of Information Technology Act, 2000 and to notify the instances of information, data or communication link residing in or connected to a computer resource controlled by the intermediary being used to commit the unlawful act.”
Impact
Now, the Indian Cyber Crime Coordination Centre (I4C) is empowered to issue takedown notices under Section 79(3)(b) of the IT Act, 2000. I4C can notify an intermediary of any information, data or communication link residing in or connected to a computer resource under its control that is being used to commit unlawful acts. If the intermediary fails to expeditiously remove or disable access to the material after being notified, it will no longer be eligible for protection under Section 79 of the IT Act, 2000.
Safe Harbour Provision
Section 79 of the IT Act serves as a safe harbour provision for intermediaries. It states that "an intermediary shall not be liable for any third-party information, data, or communication link made available or hosted by him". However, this legal immunity cannot be claimed if the intermediary "fails to expeditiously" take down a post or remove particular content after the government or its agencies flag that the information is being used to commit something unlawful. Furthermore, intermediaries are obliged to perform due diligence on their platforms, comply with the applicable rules and regulations, and maintain and promote a safe digital environment on their respective platforms.
Under the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, the government has also mandated that a ‘significant social media intermediary’ must appoint a Chief Compliance Officer (CCO), a Resident Grievance Officer (RGO) and a Nodal Contact Person, and publish a periodic compliance report every month detailing the complaints received and the action taken on them.
I4C's Role in Safeguarding Cyberspace
The Indian Cyber Crime Coordination Centre (I4C) actively pursues initiatives to combat emerging threats in cyberspace. I4C is a crucial extension of the Ministry of Home Affairs, Government of India, working extensively to combat cybercrime and ensure the overall safety of netizens. The ‘National Cyber Crime Reporting Portal’, equipped with the 24x7 helpline number 1930, is one of the key components of I4C.
Components Of The I4C
- National Cyber Crime Threat Analytics Unit
- National Cyber Crime Reporting Portal
- National Cyber Crime Training Centre
- Cyber Crime Ecosystem Management Unit
- National Cyber Crime Research and Innovation Centre
- National Cyber Crime Forensic Laboratory Ecosystem
- Platform for Joint Cyber Crime Investigation Team
Conclusion
I4C, through its initiatives and collaborative efforts, plays a pivotal role in safeguarding cyberspace and ensuring the safety of netizens. It reinforces India's commitment to combating cybercrime and promoting a secure digital environment. The recent designation of I4C as the agency to notify instances of unlawful activities in cyberspace is a significant step towards countering cybercrime and promoting an ethical and safe digital environment for netizens.
References
- https://www.deccanherald.com/india/centre-designates-i4c-as-agency-of-mha-to-notify-unlawful-activities-in-cyber-world-2936976
- https://www.business-standard.com/india-news/home-ministry-authorises-i4c-to-issue-takedown-notices-under-it-act-124031500844_1.html
- https://www.hindustantimes.com/india-news/it-ministry-empowers-i4c-to-notify-instances-of-cybercrime-101710443217873.html
- https://i4c.mha.gov.in/about.aspx#:~:text=Objectives%20of%20I4C,identifying%20Cybercrime%20trends%20and%20patterns