#FactCheck - AI Artwork Misattributed: Mahendra Singh Dhoni Sand Sculptures Exposed as AI-Generated
Executive Summary:
A recent claim circulating on social media, that a child created sand sculptures of cricket legend Mahendra Singh Dhoni, has been proven false by the CyberPeace Research Team. The team discovered that the images were actually produced using an AI tool. Unusual details, such as extra fingers and unnatural characteristics in the sculptures, led the Research Team to suspect artificial creation, and this suspicion was further substantiated by AI detection tools. This incident underscores the need to fact-check information before posting, as misinformation can quickly go viral on social media. Everyone is advised to assess content carefully to stop the spread of false information.

Claims:
The claim is that the photographs published on social media show sand sculptures of cricketer Mahendra Singh Dhoni made by a child.




Fact Check:
Upon receiving the posts, we carefully examined the images. The collage of four pictures contains many anomalies that are clear signs of AI-generated imagery.

In the first image, the left hand of the sand sculpture has six fingers, and in the word INDIA the letter ‘A’ is misaligned, i.e. not on the same line as the other letters. In the second image, one of the boy’s fingers is missing, and the sand sculpture has four fingers on its front foot and three legs. In the third image, one of the boy’s slippers is only partially visible, and in the fourth image the boy’s hand does not look like a hand. These are some of the major discrepancies clearly visible in the images.
We then ran the images through an AI image detection tool named Hive, which classified the image as 100.0% AI-generated.

We then verified with another tool, ContentAtScale AI image detection, which found the image to be 98% AI-generated.

From this, we concluded that the image is AI-generated and has no connection with the claim made in the viral social media posts. We have also previously debunked AI-generated artwork of a sand sculpture of Indian cricketer Virat Kohli, which showed the same types of anomalies seen in this case.
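The workflow described above, collecting confidence scores from multiple detectors and concluding only when they agree, can be sketched as follows. The function, threshold, and verdict labels are illustrative assumptions, not any vendor's real API; only the tool names and scores come from this fact check:

```python
# Illustrative sketch: combining confidence scores from several AI-image
# detectors. The aggregation rule and 90% threshold are assumptions for
# demonstration; they are not Hive's or ContentAtScale's actual API.

def verdict(scores: dict[str, float], threshold: float = 90.0) -> str:
    """Flag an image as AI-generated only when every detector agrees
    with confidence at or above the threshold (in percent)."""
    if all(s >= threshold for s in scores.values()):
        return "AI-generated"
    if any(s >= threshold for s in scores.values()):
        return "inconclusive"  # detectors disagree; needs manual review
    return "likely authentic"

# Scores reported in this fact check:
detector_scores = {"Hive": 100.0, "ContentAtScale": 98.0}
print(verdict(detector_scores))  # -> AI-generated
```

Requiring agreement between independent detectors, rather than trusting a single tool, mirrors the cross-verification step the Research Team performed here.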
Conclusion:
Taking into consideration the distortions spotted in the images and the results of AI detection tools, it can be concluded that the claim that the pictures show a child's sand sculptures of cricketer Mahendra Singh Dhoni is false. The pictures were created with artificial intelligence. It is important to check and authenticate content before posting it to social media websites.
- Claim: The set of pictures shared on social media shows a child's sand sculptures of cricket player Mahendra Singh Dhoni.
- Claimed on: X (formerly known as Twitter), Instagram, Facebook, YouTube
- Fact Check: Fake & Misleading

Introduction
The development of high-speed broadband internet in the 90s triggered a growth in online gaming, particularly in East Asian countries like South Korea and China. This culminated in the proliferation of competitive video game genres, which had otherwise existed mostly in the form of high-score and face-to-face competitions at arcades. The online competitive gaming market has only become bigger over the years, with a separate domain for professional competition, called esports. This industry is projected to reach US$4.3 billion by 2029, driven by advancements in gaming technology, increased viewership, multi-million dollar tournaments, professional leagues, sponsorships, and advertising revenues. However, the industry is still in its infancy and struggles with fairness and integrity issues. It can draw lessons in regulation from the traditional sports market to address these challenges for uniform global growth.
The Growth of Esports
The appeal of online gaming lies in its design innovations, social connectivity, and accessibility. Its rising popularity has turned online gaming competitions into an industry, formally organised into leagues and tournaments with prize pools reaching millions of dollars. Professional teams now have coaches, analysts and psychologists supporting their players. For scale, the 2024 Esports World Cup (EWC) held in Saudi Arabia had the largest combined prize pool to date, at over US$60 million. Such tournaments can be viewed in arenas and streamed online, and by 2025, around 322.7 million people are forecast to be occasional viewers of esports events.
According to Statista, esports revenue is expected to grow at an annual rate (CAGR 2024-2029) of 6.59%, resulting in a projected market volume of US$5.9 billion by 2029. Esports has even been recognised in traditional sporting events, debuting as a medal sport at the Asian Games 2022. In 2024, the International Olympic Committee (IOC) announced the Olympic Esports Games, with the inaugural event set to take place in 2025 in Saudi Arabia. Hosting esports events such as the EWC is expected to boost tourism and the host country’s local economy.
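The compound-growth arithmetic behind such projections is easy to verify. Assuming the 6.59% CAGR and the US$5.9 billion 2029 figure quoted above, the implied 2024 base works out to roughly US$4.3 billion (the base is derived here, not a figure quoted by Statista):

```python
# Sketch of the compound annual growth rate (CAGR) arithmetic behind the
# projection quoted above. The CAGR and 2029 target come from the article;
# the implied 2024 base is derived for illustration.

def project(base: float, cagr: float, years: int) -> float:
    """Future value after compounding `cagr` (as a fraction) for `years`."""
    return base * (1 + cagr) ** years

cagr = 0.0659        # 6.59% annual growth, 2024-2029
target_2029 = 5.9    # US$ billions
base_2024 = target_2029 / (1 + cagr) ** 5  # invert the compounding

print(f"Implied 2024 market: ~US${base_2024:.1f}B")       # ~US$4.3B
print(f"Check: US${project(base_2024, cagr, 5):.1f}B in 2029")  # US$5.9B
```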
The Challenges of Esports Regulation
While the esports ecosystem provides numerous opportunities for growth and partnerships, its under-regulation presents challenges. In the absence of a single governing body, like the IOC for the Olympics or FIFA for football, to lay down centralised rules, the industry faces challenges such as:
- Integrity issues: Esports are not immune to cheating attempts. Match-fixing, using advanced software hacks, doping (e.g., Adderall use), and the use of other illegal aids are common. DOTA, Counter-Strike, and Overwatch tournaments are particularly susceptible to cheating scandals.
- Players’ Rights: The teams that contractually own professional players provide remuneration and exercise significant control over athletes, who face issues like overwork, a short-lived career, stress, the absence of collective bargaining forums, instability, etc.
- Fragmented National Regulations: While multiple countries have recognised esports as a sport, policies on esports governance and allied regulation vary within and across borders. For example, age restrictions and laws on gambling, taxation, labour, and advertising differ by country. This can create confusion, risks and extra costs, impacting the growth of the ecosystem.
- Cybersecurity Concerns: The esports industry carries substantial prize pools and has growing viewer engagement, which makes it increasingly vulnerable to Distributed Denial of Service (DDoS) attacks, malware, ransomware, data breaches, phishing, and account hijacking. Tournament organisers must prioritise investments in secure network infrastructure, perform regular security audits, encrypt sensitive data, implement network monitoring, utilise API penetration testing tools, deploy intrusion detection systems, and establish comprehensive incident response and mitigation plans.
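Request rate limiting is one common building block behind several of the mitigations listed above (network monitoring, absorbing DDoS bursts). A minimal token-bucket sketch, not tied to any specific tournament platform, might look like this:

```python
import time

class TokenBucket:
    """Minimal token-bucket rate limiter: a standard building block for
    throttling abusive clients before they exhaust server capacity."""

    def __init__(self, rate: float, capacity: int):
        self.rate = rate               # tokens refilled per second
        self.capacity = capacity       # maximum burst size
        self.tokens = float(capacity)  # start with a full bucket
        self.last = time.monotonic()

    def allow(self) -> bool:
        # Refill tokens in proportion to elapsed time, capped at capacity.
        now = time.monotonic()
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False  # over the limit: drop or delay the request

bucket = TokenBucket(rate=5, capacity=10)  # 5 req/s, bursts of up to 10
# In a tight loop, roughly the burst capacity (10) of 20 requests pass:
print(sum(bucket.allow() for _ in range(20)))
```

Real deployments would apply such limits per client IP or account and combine them with upstream DDoS scrubbing; this sketch only illustrates the core mechanism.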
Proposals for Esports Regulation: Lessons from Traditional Sports
To address the most urgent challenges to the esports industry as outlined above, the following interventions, drawing on the governance and regulatory frameworks of traditional sports, can be made:
- Need for a Centralised Esports Governing Body: Unlike traditional sports, the esports landscape lacks a Global Sports Organisation (GSO) to oversee its governance. Instead, it is handled de facto by game publishers with industry interests different from those of traditional GSOs. Publishers’ primary source of revenue is not esports, which means they can adopt policies unsuitable for its growth but good for their core business. Appointing a centralised governing body with the power to balance the interests of multiple stakeholders and manage issues like unregulated gambling, athlete health, and integrity challenges is a logical next step for this industry.
- Gambling/Betting Regulations: While national laws on gambling/betting vary, GSOs establish uniform codes of conduct that bind participants contractually, ensuring consistent ethical standards across jurisdictions. Similar rules in esports are managed by individual publishers/ tournament organisers, leading to inconsistencies and legal grey areas. The esports ecosystem needs standardised regulation to preserve fair play codes and competitive integrity.
- Anti-Doping Policies: With the rising monetary stakes in esports, Adderall abuse among young players seeking to enhance performance is increasing. The industry must establish a global framework similar to the World Anti-Doping Code, which, in conjunction with eight international standards, harmonises anti-doping policies across all traditional sports and countries. The esports industry should either adopt this code or develop its own policy to curb stimulant abuse.
- Norms for Participant Health: Professional players start around age 16 or 17 and tend to retire around 24. They may be subjected to rigorous practice hours and stringent contracts by the teams that own them. There is a need for international norm-setting by a federation overseeing the protection of underage players. Enforcement of these norms can be one of the responsibilities of a decentralised system comprising country and state-level bodies. This also ensures fair play governance.
- Respect and Diversity: While esports is technologically accessible, it still has room for better representation of diverse gender identities, age groups, abilities, races, ethnicities, religions, and sexual orientations. Embracing greater diversity and inclusivity would benefit the industry's growth and enhance its potential to foster social connectivity through healthy competition.
Conclusion
The development of the world’s first esports island in Abu Dhabi gives impetus to the rapidly growing esports industry with millions of fans across the globe. To sustain this momentum, stakeholders must collaborate to build a strong governance framework that protects players, supports fans, and strengthens the ecosystem. By learning from traditional sports, esports can establish centralised governance, enforce standardised anti-doping measures, safeguard athlete rights, and promote inclusivity, especially for young and diverse communities. Embracing regulation and inclusivity will not only enhance esports' credibility but also position it as a powerful platform for unity, creativity, and social connection in the digital age.
Resources
- https://www.statista.com/outlook/amo/esports/worldwide
- https://www.statista.com/statistics/490480/global-esports-audience-size-viewer-type/
- https://asoworld.com/blog/global-esports-market-report-2024/#:~:text=A%20key%20driver%20of%20this%20growth%20is%20the%20Sponsorship%20%26%20Advertising,US%24288.9%20million%20in%202024.
- https://lawschoolpolicyreview.com/2023/12/28/a-case-for-recognising-professional-esports-players-as-employees-of-their-game-publisher/
- https://levelblue.com/blogs/security-essentials/the-hidden-risks-of-esports-cybersecurity-on-the-virtual-battlefield
- https://medium.com/@heyimJoost/esports-governance-and-its-failures-9ac7b3ec37ea
- https://americanaddictioncenters.org/blog/esports-adderall-abuse#:~:text=A%202020%20piece%20by%20the,it%20because%20everyone%20was%20using

AI systems have grown in both popularity and the complexity of the tasks they perform. They are enhancing accessibility for all, including people with disabilities, by revolutionising sectors including healthcare, education, and public services. We are at the stage where AI-powered solutions are being created that can help people with mental, physical, visual or hearing impairments perform everyday and complex tasks.
Generative AI is now being used to amplify human capability. The development of speech-to-text and image-recognition tools is facilitating communication and interaction for visually or hearing-impaired individuals, and smart prosthetics are providing tailored support. Unfortunately, even with these developments, persons with disabilities (PWDs) continue to face challenges. It is therefore important to balance innovation with ethical considerations and ensure that these technologies are designed with privacy, equity, and inclusivity in mind.
Access to Tech: the Barriers Faced by PWDs
PWDs face several barriers while accessing technology, which has become integral to daily life, and identifying these challenges is important. Website functions that only work when users click with a mouse, self-service kiosks without accessibility features, touch screens without screen-reader software or tactile keyboards, and out-of-order equipment, such as lifts, captioning mirrors and description headsets, are just some of the difficulties they face day to day.
While they are helpful, much of the current technology doesn’t fully address all disabilities. For example, many assistive devices focus on visual or mobility impairments, but they fall short of addressing cognitive or sensory conditions. In addition to this, these solutions often lack personalisation, making them less effective for individuals with diverse needs. AI has significant potential to bridge this gap. With adaptive systems like voice assistants, real-time translation, and personalised features, AI can create more inclusive solutions, improving access to both digital and physical spaces for everyone.
The Importance of Inclusive AI Design
Creating an inclusive AI design is important: it ensures that PWDs are not excluded from technological advancements because of their impairments. The concept of an ‘inclusive’ or ‘universal’ design promotes creating products and services that are usable by the widest possible range of people. Tech developers have an ethical responsibility to create advancements in AI that serve everyone. Accessibility features should be built into the core design and treated as standard practice rather than an afterthought. However, bias in AI development, often stemming from non-representative data or flawed assumptions, can lead to systems that overlook or poorly serve PWDs. If AI algorithms are trained on limited or biased data, they risk excluding marginalised groups, making ethical, inclusive design a necessity for equity and accessibility.
Regulatory Efforts to Ensure Accessible AI
In India, the Rights of Persons with Disabilities Act of 2016 emphasises the need to provide PWDs with equal access to technology. Subsequently, Section 9 of the Digital Personal Data Protection (DPDP) Act of 2023 addresses data privacy concerns around processing the data of persons with disabilities.
At the international level, the EU’s newly adopted AI Act mandates transparent, safe, and fair access to AI systems, including measures related to accessibility.
In the US, the Americans with Disabilities Act of 1990 and Section 508 of the 1998 amendment to the Rehabilitation Act of 1973 are the primary legislations that work on promoting digital accessibility in public services.
Challenges in implementing Regulations for AI Accessibility for PWDs
Defining the term ‘inclusive AI’ is itself a challenge. When implementing regulations and compliance requirements for AI accessibility, leaving the core term undefined makes it difficult to build tools that address the issue. The rapid pace of tech and AI development has often outpaced the legal frameworks meant to govern it, creating enforcement gaps. Countries like Canada and tech industry giants like Microsoft and Google are leading forces behind accessible AI innovations. Their frameworks focus on developing AI ethics with inclusivity and on collaboration with disability rights groups.
India’s efforts in creating an inclusive AI include the redesign of the Sugamya Bharat app. The app had been created to assist PWDs and the elderly. It will now be incorporating AI features specifically to assist the intended users.
Though AI development has opportunities for inclusivity, unregulated development can be risky. Regulation plays a critical role in ensuring that AI-driven solutions prioritise inclusivity, fairness, and accessibility, harnessing AI’s potential to empower PWDs and contribute to a more inclusive society.
Conclusion
AI development can offer PWDs unprecedented independence and accessibility in leading their lives. Prioritising inclusivity and fairness in AI development is essential: AI that is free from bias, combined with robust regulatory frameworks, is needed to ensure that AI serves everyone equitably. Collaboration between tech developers, policymakers, and disability advocates should be supported and promoted to build AI systems that bridge accessibility gaps for PWDs. As AI continues to evolve, a steadfast commitment to inclusivity will be crucial in preventing marginalisation and advancing true technological progress for all.
References
- https://www.business-standard.com/india-news/over-1-4k-accessibility-related-complaints-filed-on-govt-app-75-solved-124090800118_1.html
- https://www.forbes.com/councils/forbesbusinesscouncil/2023/06/16/empowering-individuals-with-disabilities-through-ai-technology/
- https://hbr.org/2023/08/designing-generative-ai-to-work-for-people-with-disabilities
- https://blogs.microsoft.com/on-the-issues/2018/05/07/using-ai-to-empower-people-with-disabilities/

Introduction
Beginning with the premise that the advent of the internet has woven a rich but daunting digital web, intertwining the very fabric of technology with the variegated hues of human interaction, the EU has stepped in as the custodian of this ever-evolving tableau. It is within this sprawling network—a veritable digital Minotaur's labyrinth—that the European Union has launched a vigilant quest, seeking not merely to chart its enigmatic corridors but to instil a sense of order in its inherent chaos.
The Digital Services Act (DSA) is the EU's latest testament to this determined pilgrimage, a voyage to assert dominion over the nebulous realms of cyberspace. In its latest sagacious move, the EU has levelled its regulatory lance at the behemoths of digital indulgence—Pornhub, XVideos, and Stripchat—monarchs in the realm of adult entertainment, each commanding millions of devoted followers.
Applicability of DSA
Graced with the moniker of Very Large Online Platforms (VLOPs), these titans of titillation are now facing the complex weave of duties delineated by the DSA, a legislative leviathan whose coils envelop the shadowy expanses of the internet with an aim to safeguard its citizens from the snares and pitfalls ensconced within. Like a vigilant Minotaur, the European Commission, the EU's executive arm, stands steadfast, enforcing compliance with an unwavering gaze.
The DSA is more than a mere compilation of edicts; it encapsulates a deeper, more profound ethos—a clarion call announcing that the wild frontiers of the digital domain shall be tamed, transforming into enclaves where the sanctity of individual dignity and rights is zealously championed. The three corporations, singled out as the pioneers to be ensnared by the DSA's intricate net, are now beckoned to embark on an odyssey of transformation, realigning their operations with the EU's noble envisioning of a safeguarded internet ecosystem.
The Paradigm Shift
In a resolute succession, following its first decree addressing 19 Very Large Online Platforms and Search Engines, the Commission has now ensconced the trinity of adult content purveyors within the DSA's embrace. The act demands that these platforms establish intuitive user mechanisms for reporting illicit content, prioritize communications from entities bestowed with the 'trusted flaggers' title, and elucidate to users the rationale behind actions taken to restrict or remove content. Paramount to the DSA's ethos, they are also tasked with constructing internal mechanisms to address complaints, forthwith apprising law enforcement of content hinting at criminal infractions, and revising their operational underpinnings to ensure the confidentiality, integrity, and security of minors.
But the aspirations of the DSA stretch farther, encompassing a realm where platforms are agents against deception and manipulation of users, categorically eschewing targeted advertisement that exploits sensitive profiling data or is aimed at impressionable minors. The platforms must operate with an air of diligence and equitable objectivity, deftly applying their terms of use, and are compelled to reveal their content moderation practices through annual declarations of transparency.
The DSA bestows upon the designated VLOPs an even more intensive catalogue of obligations. Within a scant four months of their designation, Pornhub, XVideos, and Stripchat are mandated to implement measures that both empower and shield their users—especially the most vulnerable, minors—from harms that traverse their digital portals. Augmented content moderation measures are requisite, with critical risk analyses and mitigation strategies directed at halting the spread of unlawful content, such as child exploitation material or the non-consensual circulation of intimate imagery, as well as curbing the proliferation and repercussions of deepfake-generated pornography.
The New Rules
The DSA enshrines the preeminence of protecting minors, with a staunch requirement for VLOPs to contrive their services so as to anticipate and enfeeble any potential threats to the welfare of young internet navigators. They must enact operational measures to deter access to pornographic content by minors, including the utilization of robust age verification systems. The themes of transparency and accountability are amplified under the DSA's auspices, with VLOPs subject to external audits of their risk assessments and adherence to stipulations, the obligation to maintain accessible advertising repositories, and the provision of data access to rigorously vetted researchers.
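Age verification systems range from simple self-declared date-of-birth gates to document- or biometric-based checks. A minimal date-of-birth sketch, purely illustrative and far weaker than the "robust" verification the DSA envisages, shows the underlying age arithmetic:

```python
# Illustrative sketch only: computing whether a user meets an age threshold
# from a date of birth. A self-declared DOB is trivially circumvented, which
# is precisely why the DSA pushes VLOPs toward more robust verification.
from datetime import date

def is_adult(dob: date, today: date, threshold: int = 18) -> bool:
    """True if the person born on `dob` is at least `threshold` years old,
    subtracting one year if this year's birthday has not yet passed."""
    years = today.year - dob.year - ((today.month, today.day) < (dob.month, dob.day))
    return years >= threshold

# A user born mid-2008, checked on the DSA's full-application date:
print(is_adult(date(2008, 6, 1), today=date(2024, 2, 17)))  # -> False
```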
Coordinated by the Commission in concert with the Member States' Digital Services Coordinators, vigilant supervision will be maintained to ensure the scrupulous compliance of Pornhub, Stripchat, and XVideos with the DSA's stringent directives. The Commission's services are poised to engage with the newly designated platforms diligently, affirming that initiatives aimed at shielding minors from pernicious content, as well as curbing the distribution of illegal content, are effectively addressed.
The EU's monumental crusade, distilled into the DSA, symbolises a pledge—a testament to its steadfast resolve to shepherd cyberspace, ensuring the Minotaur of regulation keeps the bedlam at a manageable compass and the sacrosanctity of the digital realm inviolate for all who meander through its infinite expanses. As we cast our gazes toward February 17, 2024—the cusp of the DSA's comprehensive application—it is palpable that this legislative milestone is not simply a set of guidelines; it stands as a bold, unflinching manifesto. It beckons the advent of a novel digital age, where every online platform, barring small and micro-enterprises, will be enshrined in the lofty ideals imparted by the DSA.
Conclusion
As we teeter on the edge of this nascent digital horizon, it becomes unequivocally clear: the European Union's Digital Services Act is more than a mundane policy—it is a pledge, a resolute statement of purpose, asserting that amid the vast, interwoven tapestry of the internet, each user's safety, dignity, and freedoms are enshrined and hold the intrinsic significance meriting the force of the EU's legislative guard. Although the labyrinth of the digital domain may be convoluted with complexity, guided by the DSA's insightful thread, the march toward a more secure, conscientious online sphere forges on—resolute, unerring, one deliberate stride at a time.
References
- https://ec.europa.eu/commission/presscorner/detail/en/ip_23_6763
- https://www.breakingnews.ie/world/three-of-the-biggest-porn-sites-must-verify-ages-under-eus-new-digital-law-1566874.html