# Fact Check – Analysis of Viral Claims Regarding India's UNSC Permanent Membership
Executive Summary:
Recently, there has been a massive amount of fake news about India's standing in the United Nations Security Council (UNSC), including claims that it has been granted permanent membership and veto power. This report, compiled by the CyberPeace Research Wing, examines the provenance and credibility of that information and debunks it. The UN and other relevant bodies have released no announcement regarding India's permanent UNSC membership, although India has made remarkable progress towards this strategic goal.

Claims:
Viral posts claim that India has become the first-ever unanimously voted permanent and veto-holding member of the United Nations Security Council (UNSC). Those posts also claim that this was achieved through overwhelming international support, granting India the same standing as the current permanent members.



Fact Check:
The CyberPeace Research Team conducted a thorough keyword search on the official UNSC website and its associated social media profiles; there is presently no official announcement declaring India's elevation to permanent membership of the UNSC. India remains a non-permanent member, while the five permanent members (China, France, Russia, the United Kingdom, and the United States) continue to hold veto power. Furthermore, India, along with Brazil, Germany, and Japan (the G4 nations), has long proposed reform of the UNSC, yet no formal resolution altering the status quo of permanent membership has surfaced. We then used tools such as Google Fact Check Explorer to examine the viral claims and found several articles by other fact-checking organizations that had already debunked them.
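Google Fact Check Explorer is backed by the publicly documented Google Fact Check Tools API, which can also be queried programmatically. The snippet below is a minimal illustrative sketch of how such a claim search might be automated; it was not part of the original investigation, and the API key placeholder and query string are assumptions for demonstration only.

```python
# Illustrative sketch: querying the Google Fact Check Tools API
# (the backend behind Google Fact Check Explorer) for published reviews
# of a viral claim. "YOUR_API_KEY" and the query text are placeholders,
# not values used in the original fact check.
import requests

API_URL = "https://factchecktools.googleapis.com/v1alpha1/claims:search"


def search_claims(query: str, api_key: str, language: str = "en") -> list:
    """Return fact-check reviews matching the given claim text."""
    params = {"query": query, "languageCode": language, "key": api_key}
    response = requests.get(API_URL, params=params, timeout=10)
    response.raise_for_status()
    return response.json().get("claims", [])


if __name__ == "__main__":
    results = search_claims("India permanent member UNSC veto", "YOUR_API_KEY")
    for claim in results:
        for review in claim.get("claimReview", []):
            # Each review lists the publisher, its verdict, and a link.
            print(review.get("publisher", {}).get("name"),
                  review.get("textualRating"),
                  review.get("url"))
```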

The viral claims also lack credible sources or authenticated references from international institutions, further discrediting them. Hence, the claims circulating on social media that India has become the first-ever unanimously voted permanent and veto-holding member of the UNSC are misleading and fake.
Conclusion:
The viral claim that India has become a permanent member of the UNSC with veto power is entirely false. India, along with other member states, continues to press for a restructuring of the UN Security Council; however, to date there has been no official declaration or commitment to alter the composition of the permanent membership or its powers. Social media users are advised to rely on verified sources for information and refrain from spreading unsubstantiated claims that contribute to misinformation.
- Claim: India’s Permanent Membership in UNSC.
- Claimed On: YouTube, LinkedIn, Facebook, X (Formerly Known As Twitter)
- Fact Check: Fake & Misleading.
Related Blogs

Introduction
Ministry of Electronics and Information Technology (MeitY) Announces that the Central Government Will Certify Permissible Online Games
In a recent update to its notification of April 6, 2023, the Ministry of Electronics and Information Technology (MeitY) has asked gaming entities to establish self-regulatory organisations (SROs) within 30 days, or at most 90 days, from the date of the notification. MeitY has further announced that the central government will certify which online games are permissible until the SROs are officially established. The intention behind establishing SROs is to assist intermediaries, such as Apple or Google, in determining what constitutes a permitted online game; since the SROs will take two to three months to set up, the central government will step in during the interim and determine what is a permissible online game.
Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 & Intermediary Guidelines and Digital Media Ethics Code Amendment Rules, 2023
By enacting these rules, the Indian government has taken decisive action to protect Indian gamers and their financial resources against scams and fraud. The rules also serve to promote responsible gaming while preventing young and vulnerable users from being exposed to indecent or abusive content.
The Amendment Rules introduce the concept of a “permissible online real money game”, a designation reserved for games that have passed a review process conducted by a self-regulatory body. The rules require online gaming intermediaries to ensure that they do not permit any third party to host non-permissible online real money games on their platforms. This development is important because it makes it possible to distinguish between legitimate and illicit real money games.
The Amendment Rules also treat an online gaming provider as an “intermediary” under the Information Technology Act, 2000, creating a separate classification called ‘Online Gaming Intermediary’.

Central government to certify what is an ‘Online Permissible Game’
The industry has been wondering which games will be treated as wagering and therefore banned. Until the SROs are officially established, the government will, in the interim, certify which games are permissible and what does or does not amount to wagering. Games that involve any element of wagering will be barred: the new regulations prohibit wagering on any outcome, whether in skill-based or chance-based games, so wagering and betting apps will be blocked.
Self-Regulatory Organizations (SROs)
According to the new regulations by the Ministry of Electronics and Information Technology (MeitY), online gaming intermediaries must establish a self-regulatory organisation (SRO) to approve games offered to users over the internet. The SRO must be registered with the Ministry and develop a framework to ensure compliance with the objectives of the IT Rules, 2021. An online game can be registered by the SRO if it meets specific criteria: the game is offered by an online gaming intermediary that is a member of the self-regulatory body, the game does not contain any content harmful to India’s interests, and the game complies with all relevant Indian regulations. If these requirements are met, the intermediary can display a visible registration mark indicating its registration with the self-regulatory authority.
Conclusion
MeitY found that, with the rapid growth of the gaming industry, the real money gaming (RMG) sector had to be regulated properly, and that the rules framed must be properly implemented to stop gambling, betting, and wagering apps.
The IT Rules, 2021, along with the Amendment Rules, 2023, are designed to take concrete action to curb the proliferation of gambling, betting, and wagering apps in India. These rules empower the government to issue directives banning specific apps that facilitate or promote such activities. An app ban directive allows the government to take decisive action by blocking access to these apps, making them unavailable for download or use within the country. This measure is aimed at curbing the negative impact of gambling, betting, and wagering on individuals and society, including addiction, financial loss, and illegal activity. The rules aim to actively combat the spread and influence of such apps and provide a safer online environment for gaming users.
The self-regulatory body in the context of online gaming will have the authority to grant membership to gaming intermediaries, register online games, develop a framework for regulation, interact with the Central Government, address user complaints, report instances of non-compliance, and take necessary actions to safeguard online gaming users.

Introduction
In the digital realm of social media, Meta Platforms, the driving force behind Facebook and Instagram, faces intense scrutiny following The Wall Street Journal's investigative report. This exploration delves deeper into critical issues surrounding child safety on these widespread platforms, unravelling algorithmic intricacies, enforcement dilemmas, and the ethical maze surrounding monetisation features. Instances of "parent-managed minor accounts" leveraging Meta's subscription tools to monetise content featuring young individuals have raised eyebrows. While skirting the line of legality, this practice prompts concerns due to its potential appeal to adults and the associated inappropriate interactions. It's a nuanced issue demanding nuanced solutions.
Failed Algorithms
The very heartbeat of Meta's digital ecosystem, its algorithms, has come under intense scrutiny. These algorithms, designed to curate and deliver content, were found to be actively promoting accounts featuring explicit content to users with known pedophilic interests. The revelation sparks a crucial conversation about the ethical responsibilities tied to the algorithms shaping our digital experiences. Striking the right balance between personalised content delivery and safeguarding users is a delicate task.
While algorithms play a pivotal role in tailoring content to users' preferences, Meta needs to reevaluate the algorithms to ensure they don't inadvertently promote inappropriate content. Stricter checks and balances within the algorithmic framework can help prevent the inadvertent amplification of content that may exploit or endanger minors.
Major Enforcement Challenges
Meta's enforcement challenges have come to light as previously banned parent-run accounts have resurfaced, gaining official verification and accumulating large followings. The struggle to remove associated backup profiles adds to concerns about the effectiveness of Meta's enforcement mechanisms and underscores the need for a robust system capable of swift and thorough action against policy violators.
To enhance enforcement mechanisms, Meta should invest in advanced content detection tools and employ a dedicated team for consistent monitoring. This proactive approach can mitigate the risks associated with inappropriate content and reinforce a safer online environment for all users.
The financial dynamics of Meta's ecosystem expose concerns about the exploitation of videos that are eligible for cash gifts from followers. The decision to expand the subscription feature before implementing adequate safety measures poses ethical questions. Prioritising financial gains over user safety risks tarnishing the platform's reputation and trustworthiness. A re-evaluation of this strategy is crucial for maintaining a healthy and secure online environment.
To address safety concerns tied to monetisation features, Meta should consider implementing stricter eligibility criteria for content creators. Verifying the legitimacy and appropriateness of content before allowing it to be monetised can act as a preventive measure against the exploitation of the system.
Meta's Response
In the aftermath of the revelations, Meta's spokesperson, Andy Stone, took centre stage to defend the company's actions. Stone emphasised ongoing efforts to enhance safety measures, asserting Meta's commitment to rectifying the situation. However, critics argue that Meta's response lacks the decisive actions required to align with industry standards observed on other platforms. The debate continues over the delicate balance between user safety and the pursuit of financial gain. A more transparent and accountable approach to addressing these concerns is imperative.
To rebuild trust and credibility, Meta needs to implement concrete and visible changes. This includes transparent communication about the steps taken to address the identified issues, continuous updates on progress, and a commitment to a user-centric approach that prioritises safety over financial interests.
The formation of a task force in June 2023 was a commendable step to tackle child sexualisation on the platform. However, the effectiveness of these efforts remains limited. Persistent challenges in detecting and preventing potential child safety hazards underscore the need for continuous improvement. Legislative scrutiny adds an extra layer of pressure, emphasising the urgency for Meta to enhance its strategies for user protection.
To overcome ongoing challenges, Meta should collaborate with external child safety organisations, experts, and regulators. Open dialogues and partnerships can provide valuable insights and recommendations, fostering a collaborative approach to creating a safer online environment.
Drawing a parallel with competitors such as Patreon and OnlyFans reveals stark differences in child safety practices. While Meta grapples with its challenges, these platforms maintain stringent policies against certain content involving minors. This comparison underscores the need for universal industry standards to safeguard minors effectively. Collaborative efforts within the industry to establish and adhere to such standards can contribute to a safer digital environment for all.
To align with industry standards, Meta should actively participate in cross-industry collaborations and adopt best practices from platforms with successful child safety measures. This collaborative approach ensures a unified effort to protect users across various digital platforms.
Conclusion
Navigating the intricate landscape of child safety concerns on Meta Platforms demands a nuanced and comprehensive approach. The identified algorithmic failures, enforcement challenges, and controversies surrounding monetisation features underscore the urgency for Meta to reassess and fortify its commitment to being a responsible digital space. As the platform faces this critical examination, it has an opportunity to not only rectify the existing issues but to set a precedent for ethical and secure social media engagement.
This comprehensive exploration aims not only to shed light on the existing issues but also to provide a roadmap for Meta Platforms to evolve into a safer and more responsible digital space. The responsibility lies not just in acknowledging shortcomings but in actively working towards solutions that prioritise the well-being of its users.
References
- https://timesofindia.indiatimes.com/gadgets-news/instagram-facebook-prioritised-money-over-child-safety-claims-report/articleshow/107952778.cms
- https://www.adweek.com/blognetwork/meta-staff-found-instagram-tool-enabled-child-exploitation-the-company-pressed-ahead-anyway/107604/
- https://www.tbsnews.net/tech/meta-staff-found-instagram-subscription-tool-facilitated-child-exploitation-yet-company

On the occasion of the 20th edition of Safer Internet Day 2023, CyberPeace, in collaboration with UNICEF, DELNET, NCERT, and the National Book Trust (NBT), India, took steps towards a safer cyberspace by launching iSafe Multimedia Resources, CyberPeace TV, and the CyberPeace Café at an event held in Delhi.
CyberPeace also showcased its efforts, in partnership with UNICEF, to create a secure and peaceful online world through its Project iSafe, which aims to bridge the knowledge gap between emerging advancements in cybersecurity and first responders. Through Project iSafe, CyberPeace has successfully raised awareness among law enforcement agencies, education departments, and frontline workers across various fields. The event marked a significant milestone in the efforts of the foundation to create a secure and peaceful online environment for everyone.
Launching CyberPeace TV, the CyberPeace Café, and the iSafe material, Lt Gen Rajesh Pant, National Cybersecurity Coordinator, Government of India, interacted with the students and introduced them to the theme of this year's Safer Internet Day. He also announced a coordinated cyber challenge initiative taken up jointly by several countries and stressed that content is what matters most in cyberspace. He assured everyone that the Government of India is taking numerous steps at the national level to make cyberspace safer, and complimented CyberPeace for its initiatives.
Ms. Zafrin Chaudhry, Chief of Communication, UNICEF, addressed the students, pointing out that children account for one in three internet users and therefore deserve a safe cyberspace. They should be informed and equipped with the knowledge to deal with any issues they face online, and they should share their experiences to make others aware. UNICEF, in partnership with CyberPeace, is extending help to children to equip them with this information and support.
Major Vineet Kumar, Founder and Global President of CyberPeace, welcomed everyone and introduced the launch of the iSafe Multimedia Resources, CyberPeace TV, and the CyberPeace Café. He also shed light on upcoming plans, such as a learning module on the metaverse using AR and VR. Wanting to make cyberspace safe even in tier 3 cities, he established the first CyberPeace Café in Ranchi.
As the internet plays a crucial role in our lives, CyberPeace has taken action to combat potential cyber threats. It introduced CyberPeace TV, the world's first multilingual TV channel on Jio TV focused on education and entertainment, which offers the latest cybersecurity news, expert analysis, and a community for all stakeholders in the field. CyberPeace also launched its first CyberPeace Café for creators and innovators and released the iSafe Multimedia Resources, containing flyers, posters, an e-handbook, and a handbook on digital safety for children, developed jointly by CyberPeace, UNICEF, and NCERT for the public.
O.P. Singh, Former DGP, UP Police, and CEO of the Kailash Satyarthi Foundation, began with data on internet users in India, noting that the internet is now part of day-to-day activities, primarily through social media. He urged students to take a channelised approach to cyberspace, with fixed screen time, access to the right content, and disciplined internet usage, and appreciated the initiatives CyberPeace is taking in this direction.
The celebration continued with the iSafe Panel Discussion on “Creating Safer Cyberspace for Children.” The discussion was moderated by Dr. Sangeeta Kaul, Director of DELNET, and was attended by panellists Mr. Rakesh Maheshwari from MeitY (Ministry of Electronics and Information Technology, Govt. of India), Dr. Indu Kumar from CIET-NCERT, Ms. Bindu Sharma from ICMEC, and Major Vineet Kumar from CyberPeace.
The event was also graced by professional artists from the National School of Drama, who performed Nukkad Natak and Qawwali based on cyber security themes. Students from SRDAV school also entertained the audience with their performances. The attendees were also given a platform to share their experiences with online security issues, and ICT Awardees, Parents and iSafe Champions shared their insights with the guests. The event also had stalls by CyberPeace Corps, a Global volunteer initiative, and CIET-NCERT for students to explore and join the cause. The event’s highlight was the 360 Selfie Booth, where attendees lined up to have their turn.