#FactCheck - Debunking Viral Photo: Tears of Photographer Not Linked to Ram Mandir Opening
Executive Summary:
A photographer breaking down in tears in a viral photo is not connected to the Ram Mandir opening. Social media users are sharing a collage of images of the recently consecrated Lord Ram idol at the Ayodhya Ram Mandir along with a purported shot of a photographer crying at the sight of the deity. A Facebook post sharing this collage says, "Even the cameraman couldn't stop his emotions." The CyberPeace Research team found that the moment was actually captured at the AFC Asian Cup in 2019. During the Round of 16 match between Iraq and Qatar, an Iraqi photographer broke down in tears after Iraq lost and was knocked out of the competition.
Claims:
The photographer in the widely shared images broke down in tears on seeing the idol of Lord Ram during the Ayodhya Ram Mandir's consecration. The collage was also shared by many users on other social media platforms such as X, Reddit, and Facebook. A Facebook user shared it, and the caption of the post reads,




Fact Check:
The CyberPeace Research team ran a reverse image search on the photographer, which led to several memes using the same picture. From there, we found a Pinterest post that reads, “An Iraqi photographer as his team is knocked out of the Asian Cup of Nations”.

Taking a cue from this, we ran a keyword search to trace the actual news behind the image. We found the official AFC Asian Cup X (formerly Twitter) handle, where the image had been shared five years earlier, on 24 January 2019. The post reads, “Passionate. Emotional moment for an Iraqi photographer during the Round of 16 clash against [Qatar]! #AsianCup2019”

This confirmed the news and the origin of the image. Notably, while investigating this fact check, we also found several other posts using the same photographer's image with different captions, all of which were misinformation like this one.
Conclusion:
The recent viral image of the photographer claimed to be associated with the Ram Mandir opening is misleading. The image is five years old and shows an Iraqi photographer crying during the AFC Asian Cup football competition, not the recent Ram Mandir opening. Netizens are advised not to believe or share such misinformation posts on social media.
- Claim: A person in the widely shared images broke down in tears at seeing the icon of Lord Ram during the Ayodhya Ram Mandir's consecration.
- Claimed on: Facebook, X, Reddit
- Fact Check: Fake

Introduction
In an era where digitalization is transforming every facet of life, ensuring that personal data is protected becomes crucial. The enactment of the Digital Personal Data Protection Act, 2023 (DPDP Act) is a significant step taken by the Indian Parliament, setting forth a comprehensive framework for the protection of digital personal data. The Draft Digital Personal Data Protection Rules, 2025 have recently been released for public consultation to supplement the Act and ensure its smooth implementation once finalised. While the draft rules have several positive aspects, certain gaps and provisions still require attention. The DPDP Act, 2023 recognises individuals' right to protect their personal data, giving them control over the processing of that data for lawful purposes. The Act applies to data available in digital form as well as data that is not in digital form but is subsequently digitised. While the Act is intended to offer individuals (Data Principals) wide control over their personal information, its impact on vulnerable groups such as persons with disabilities requires closer scrutiny.
Persons with Disabilities as data principals
The term ‘data principal’ is defined under Section 2(j) of the DPDP Act as the person to whom the personal data relates, which also includes a person with a disability. A lawful guardian acting on behalf of such a person with a disability is also included within the ambit of this definition. As a result, a lawful guardian acting on behalf of a person with a disability has the same rights and responsibilities as a data principal under the Act.
- Section 9 of the DPDP Act, 2023 states that before processing the personal data of a person with a disability who has a lawful guardian, the data fiduciary must obtain verifiable consent from that guardian, ensuring proper protection of the person with disability's data privacy.
- The data principal has the right to access information about personal data under Section 11 which is being processed by the data fiduciary.
- Section 12 provides the right to correction and erasure of personal data by making a request in a manner prescribed by the data fiduciary.
- Under Section 13, the data principal has a right to grievance redressal in respect of any act or omission by the data fiduciary or the consent manager in the performance of their obligations.
- Under Section 14, the data principal has the right to nominate any other person to exercise the rights provided under the Act in case of death or incapacity.
Provision of consent and its implication
The three key components of consent that can be identified under the DPDP Act are:
- Explicit and Informed Consent: Consent given for the processing of data by the data principal, or by a lawful guardian in the case of persons with disabilities, must be clear, free and informed as per Section 6 of the Act. The data fiduciary must specify an itemised description of the personal data required, along with the specified purpose and a description of the goods or services that would be provided by such processing (Rule 3 of the Draft Digital Personal Data Protection Rules).
- Verifiable Consent: Section 9 of the DPDP Act provides that the data fiduciary must obtain the verifiable consent of the lawful guardian before processing any personal data of such a person with a disability. Rule 10 of the Draft Rules obligates the data fiduciary to adopt measures to ensure that the consent given by the lawful guardian is verifiable before the personal data is processed.
- Withdrawal of Consent: The data principal or such lawful guardian has the option to withdraw consent for the processing of data at any point by making a request to the data fiduciary.
Although the Act includes certain provisions that focus on the inclusivity of persons with disabilities, the way these provisions are framed raises concerns, as discussed below.
Concerns related to provisions for Persons with Disabilities under the DPDP Act:
- Lack of definition of ‘persons with disabilities’: Neither the DPDP Act nor the Draft Rules defines the term ‘persons with disabilities’. This creates confusion as to which categories and degrees of disability are covered. The Rights of Persons with Disabilities Act, 2016 clearly defines ‘person with benchmark disability’, ‘person with disability’ and ‘person with disability having high support needs’. This categorisation is essential to determine the extent to which a person with a disability needs a lawful guardian, and it is missing under the DPDP Act.
- Lack of autonomy: Though the definition of data principal includes persons with disabilities, decision-making authority has been given to their lawful guardians. Because of the lack of clarity in the definition of ‘persons with disabilities’, the provision creates ambiguity for people who have a lower degree of disability and are capable of making their own decisions, yet are left with no autonomy over decisions related to the processing of their personal data.
- Lack of safeguards against abuse of power by the lawful guardian: Once verified by the data fiduciary, the lawful guardian can make decisions for the person with a disability. This raises concerns regarding the potential abuse of power by lawful guardians in the handling of personal data. The DPDP Act does not provide any specific protection against such abuse.
- Difficulty in verification of consent: The consent obtained by the data fiduciary must be verified, but the process adopted for verification is left to the discretion of the data fiduciary under Rule 10 of the Draft Rules. The authenticity of consent is difficult to determine, as verification is a complex process that lacks a standard format. Moreover, with advancing technology, it would be challenging to identify whether the information given to verify consent is actually true.
CyberPeace Recommendations
The DPDP Act, 2023 is a major step towards making the data protection framework more comprehensive; however, the provisions related to persons with disabilities and the powers given to lawful guardians acting on their behalf still need clarity and refinement within the DPDP Act framework.
- Consonance of the DPDP Act with the Rights of Persons with Disabilities (RPWD) Act, 2016: The RPWD and DPDP Acts should supplement each other and can be used to clear existing ambiguities. For instance, the definition of ‘persons with disabilities’ under the RPWD Act can be used in the context of the DPDP Act, 2023.
- There must also be mechanisms and safeguards within the Act to prevent abuse of power by the lawful guardian. In cases of suspected abuse, the affected individual should have the option to file a complaint with the Data Protection Board, which can then take the necessary steps to determine whether abuse of power has occurred.
- Regulatory oversight and additional safeguards are required to ensure that consent is obtained in a manner that respects the rights of all individuals, including those with disabilities.
References:
- https://www.meity.gov.in/writereaddata/files/Digital%20Personal%20Data%20Protection%20Act%202023.pdf
- https://www.meity.gov.in/writereaddata/files/259889.pdf
- https://www.indiacode.nic.in/bitstream/123456789/15939/1/the_rights_of_persons_with_disabilities_act%2C_2016.pdf
- https://www.deccanherald.com/opinion/consent-disability-rights-and-data-protection-3143441
- https://www.pacta.in/digital-data-protection-consent-protocols-for-disability.pdf
- https://www.snrlaw.in/indias-new-data-protection-regime-tracking-updates-and-preparing-for-compliance/

AI systems have grown in both popularity and the complexity at which they operate. They are enhancing accessibility for all, including people with disabilities, by revolutionising sectors such as healthcare, education, and public services. We are at a stage where AI-powered solutions are being created that can help people with mental, physical, visual or hearing impairments perform everyday and complex tasks.
Generative AI is now being used to amplify human capability. The development of tools for speech-to-text and image recognition is helping facilitate communication and interaction for visually or hearing-impaired individuals, and smart prosthetics are providing tailored support. Unfortunately, even with these developments, PWDs continue to face challenges. It is therefore important to balance innovation with ethical considerations and ensure that these technologies are designed with qualities like privacy, equity, and inclusivity in mind.
Access to Tech: the Barriers Faced by PWDs
PWDs face several barriers while accessing technology, and identifying these challenges is important because computer use, across both hardware and software, has become a norm of daily life. Website functions that only work when users click with a mouse, self-service kiosks without accessibility features, touch screens without screen reader software or tactile keyboards, and out-of-order equipment such as lifts, captioning mirrors and description headsets are just some of the difficulties they face in their day-to-day lives.
While they are helpful, much of the current technology doesn’t fully address all disabilities. For example, many assistive devices focus on visual or mobility impairments, but they fall short of addressing cognitive or sensory conditions. In addition to this, these solutions often lack personalisation, making them less effective for individuals with diverse needs. AI has significant potential to bridge this gap. With adaptive systems like voice assistants, real-time translation, and personalised features, AI can create more inclusive solutions, improving access to both digital and physical spaces for everyone.
The Importance of Inclusive AI Design
Creating an inclusive AI design is important. It ensures that PWDs are not excluded from technological advancements because of their impairments. The concept of an ‘inclusive’ or ‘universal’ design promotes creating products and services that are usable by the widest possible range of people. Tech developers have an ethical responsibility to create advancements in AI that serve everyone. Accessibility features should be built into the core design and treated as standard practice rather than an afterthought. However, bias in AI development, often stemming from non-representative data or flawed assumptions, can lead to systems that overlook or poorly serve PWDs. If AI algorithms are trained on limited or biased data, they risk excluding marginalised groups, making ethical, inclusive design a necessity for equity and accessibility.
Regulatory Efforts to Ensure Accessible AI
In India, the Rights of Persons with Disabilities Act, 2016 impresses upon the need to provide PWDs with equal access to technology. Subsequently, the DPDP Act, 2023 addresses data privacy for persons with disabilities under Section 9, which governs the processing of their personal data.
At the international level, the recently enacted EU AI Act mandates measures for transparent, safe, and fair access to AI systems, including measures related to accessibility.
In the US, the Americans with Disabilities Act of 1990 and Section 508 of the Rehabilitation Act of 1973 (as amended in 1998) are the primary legislations promoting digital accessibility in public services.
Challenges in implementing Regulations for AI Accessibility for PWDs
Defining the term ‘inclusive AI’ is itself a challenge. When working on regulations and compliance for the accessibility of AI, leaving this core term undefined makes creating tools to address the issue difficult in its own right. The rapid pace of tech and AI development has often outpaced the development of legal frameworks, creating enforcement gaps. Countries like Canada and tech industry giants like Microsoft and Google are leading forces behind accessible AI innovations, with frameworks that focus on developing AI ethics through inclusivity and collaboration with disability rights groups.
India’s efforts in creating inclusive AI include the redesign of the Sugamya Bharat app. The app was created to assist PWDs and the elderly, and it will now incorporate AI features specifically to assist its intended users.
Though AI development has opportunities for inclusivity, unregulated development can be risky. Regulation plays a critical role in ensuring that AI-driven solutions prioritise inclusivity, fairness, and accessibility, harnessing AI’s potential to empower PWDs and contribute to a more inclusive society.
Conclusion
AI development can offer PWDs unprecedented independence and accessibility in leading their lives. Developing AI with inclusivity and fairness in mind needs to be prioritised. AI that is free from bias, combined with robust regulatory frameworks, is essential to ensuring that AI serves everyone equitably. Collaborations between tech developers, policymakers, and disability advocates need to be supported and promoted to build AI systems that bridge the accessibility gaps for PWDs. As AI continues to evolve, a steadfast commitment to inclusivity will be crucial in preventing marginalisation and advancing true technological progress for all.
References
- https://www.business-standard.com/india-news/over-1-4k-accessibility-related-complaints-filed-on-govt-app-75-solved-124090800118_1.html
- https://www.forbes.com/councils/forbesbusinesscouncil/2023/06/16/empowering-individuals-with-disabilities-through-ai-technology/
- https://hbr.org/2023/08/designing-generative-ai-to-work-for-people-with-disabilities
- https://blogs.microsoft.com/on-the-issues/2018/05/07/using-ai-to-empower-people-with-disabilities/
Introduction
In the fast-paced digital age, misinformation spreads faster than actual news. This was seen recently when inaccurate information spread on social media, claiming that the Election Commission of India (ECI) had taken down e-voter rolls for some states from its website overnight. The rumour caused public confusion and political debate in states like Maharashtra, Madhya Pradesh, Bihar, Uttar Pradesh and Haryana. But the ECI quickly called the viral information "fake news" and confirmed that voters could still access the electoral rolls of all States and Union Territories at voters.eci.gov.in. The incident shows how misinformation can harm trust in electoral information and how important it is to verify authenticity before sharing.
The Incident and Allegations
On August 7, 2025, social media posts on platforms like X and WhatsApp claimed that the Election Commission of India had removed e-voter lists from its website. The posts appeared after public allegations about irregularities in certain constituencies. However, the claims about the removal of voter lists were unverified.
The Election Commission’s Response
In a formal post on X, the Election Commission stated categorically:
“This is a fake news. Anyone can download the Electoral Roll for any of 36 States/UTs through this link: https://voters.eci.gov.in/download-eroll.”
The Commission clarified that no deletion had taken place and that all voter rolls remained intact and accessible to the public. In keeping with the spirit of transparency, the ECI reaffirmed its practice of providing public access to electoral information, clarifying that the system is intact and open for inspection.
Importance of Timely Clarifications
By countering factually incorrect information the moment it began spreading at scale, the ECI prevented potential harm to public trust. Election officials rely on being trusted, and any speculation concerning their integrity can prove harmful to democracy. Such prompt action stops false information from becoming entrenched in public discourse.
Misinformation in the Electoral Space
- How False Narratives Gain Traction
Election misinformation thrives in charged political environments. Social media, confirmation bias, and heightened emotions during elections enable rumours to spread rapidly. On this occasion, the unfounded report struck a chord with widespread political distrust, so people readily believed and shared it without checking whether it was true.
- Risks to Democratic Integrity
When misinformation impacts election procedures, the consequences can be profound:
- Erosion of Trust: People can lose faith in the neutrality of election administrators quite easily.
- Polarization: Untrue assertions tend to reinforce political divides, rendering constructive communication more difficult.
- The Role of Media Literacy
Combating such mis- and disinformation requires more than official statements. Media literacy training can equip individuals with the ability to recognise warning signs in suspect messages. Even basic actions like checking official sources before sharing can go a long way in keeping untruths from spreading.
Strategies to Counter Electoral Misinformation
Multi-Stakeholder Action
Effectively countering electoral disinformation requires coordination among election officials, fact-checkers, media, and platforms. Suggested actions include:
- Rapid Response Protocols: Institutions should maintain dedicated monitoring teams for quick rebuttals.
- Confirmed Channels of Communication: Providing official sites and pages for actual electoral news.
- Proactive Transparency: Regular publication of electoral process updates can pre-empt rumours.
- Platform Accountability: Social media sites must label or limit the visibility of information found to be false by credentialed fact-checkers.
Conclusion
The recent allegations of e-voter roll deletion underscore the susceptibility of contemporary democracies to mis- and disinformation. Even though the ECI's swift and unambiguous rebuttal set the record straight, the incident emphasises the necessity of preventive steps to maintain faith in elections. Fact-checking alone may not suffice in an information space that is growing more polarised and algorithm-driven; the long-term solution is to nurture a robust democratic culture in which individuals, organisations, and platforms value truth over clickbait. The lesson is clear: in the age of instant news, accurate communication is not a luxury but vital for maintaining democratic integrity.
References
- https://www.newsonair.gov.in/election-commission-dismisses-fake-news-on-removal-of-e-voter-rolls/
- https://economictimes.indiatimes.com/news/india/eci-dismisses-claims-of-removing-e-voter-rolls-from-its-website-calls-it-fake-news/articleshow/123190662.cms
- https://www.thehindu.com/news/national/vote-theft-claim-of-congress-factually-incorrect-election-commission/article69921742.ece
- https://www.thehindu.com/opinion/editorial/a-crisis-of-trust-on-the-election-commission-of-india/article69893682.ece