Digital India Act – Looking through the Crystal Ball of a new Digital India

November 2024

1.        Introduction

India’s plans to embrace and encourage innovation and digital transformation began with the 2015 Digital India initiative,[1] focused on improved infrastructure such as high-speed internet, cloud services, artificial intelligence and electronic cashless financial transactions. The total number of internet users in India nearly quadrupled, from around 252 million in 2014 to around 955 million in 2024.[2] However, this growth led to a drastic surge in the volume of personal data generated, exchanged, collected and processed. Consequently, privacy risks, including data breaches and leaks, identity theft, and unauthorized access to sensitive information, have worsened, underscoring vulnerabilities in the existing legal framework. Recognizing this, in 2022, the government proposed to replace the Information Technology Act, 2000 (“IT Act” or “Act”) with the Digital India Act (“DIA”). However, as of November 2024, the DIA remains far from implementation. The government presented the intended structure and changes at a Digital India Dialogue conference in March 2023. Since then, the proposal has been under consultation between the government and stakeholders, including subject matter experts and technical teams engaged in designing, developing, and deploying emerging technology-based solutions. While there is no clarity on when a draft Bill will be published, analyzing the proposal is useful to get a sense of what the government intends to change and the consequential impact on digital and tech companies.

This newsletter details the changes intended to be introduced to the IT regime and the potential impact on companies, and analyzes whether the proposed law can succeed in revolutionizing cyber governance.

2.        The Inadequacies of the Existing Regime 

When the IT Act was introduced in 2000, the internet was in its infancy, and many of today’s technologies, such as social media, e-commerce, and AI, were non-existent or not widely used. The law as it stands today is, therefore, based on the digital landscape of that time. For instance, it penalizes traditional offences such as hacking, identity theft and the sharing of obscene material, including child pornography. Over time, different rules were enacted to address other aspects, including: (a) the 2013 Indian Computer Emergency Response Team (“CERT-In”) rules and the corresponding 2022 directions,[3] under which CERT-In handles cybersecurity incidents and imposes mandatory reporting obligations; (b) the 2021 intermediary rules,[4] which recognize and define social media and online gaming intermediaries and detail how intermediaries must act diligently, including by communicating grievance redressal mechanisms and privacy policies; and (c) the 2011 Sensitive Personal Data or Information (“SPDI”) rules, which list the types of SPDI and provide for their secure collection and transfer.

While these were initially sufficient, the IT Act has become inadequate as the landscape has evolved, which is apparent from the fact that prevalent technologies like cookies, AI, and blockchain, and cybercrimes like cyberbullying and trolling, are not covered. Various key provisions are vaguely worded, leading to inconsistent interpretation and enforcement, and to misuse of the law. Illustrative examples are below.

  • The 2021 intermediary rules require intermediaries to prohibit the display and sharing of misleading, false and untrue information, but do not define what constitutes misinformation or fake news. As a result, AI-generated content may fall outside their scope.
  • Section 67 of the IT Act penalizes the transmission of obscene material but fails to clarify what constitutes obscenity or to differentiate between consensual and non-consensual acts of sending such material. Consequently, consensual acts could be prosecuted irrespective of context or consent.
  • The IT Act also does not account for (a) technological advancements and new forms of cybercrime. For example, it penalizes capturing and sharing nude images of a person without consent, but fails to recognize that with deepfake technology, images do not need to be captured in the traditional sense and can be morphed to appear sexual; and (b) online harassment, so cyberbullying, trolling,[5] cyberstalking, etc. fall outside its scope.
  • Furthermore, the Act deals with data protection ineffectively and fails to address all data types subject to leaks. Rule 3 of the SPDI rules covers an exhaustive list of information, including passwords, financial and biometric information, medical records and history, sexual orientation, and physical, physiological and mental health conditions, but other personal information, such as IP addresses and other online identifiers, is ignored.
  • Lastly, the Act groups telecom and network service providers, search engines, online payment services, and online marketplaces under one common definition of intermediaries, failing to consider that each is different and cannot be regulated under a “one rule for all” approach.

3.        Reforms under the DIA

The need for a modern, comprehensive data protection law was triggered by the 2017 Justice K.S. Puttaswamy judgement,[6] in which the right to privacy was declared a fundamental right. While the Digital Personal Data Protection Act is the primary law on this subject, it was necessary to simultaneously reform India’s digital ecosystem and the law governing it. Acknowledging that the IT Act’s static, exhaustive nature has rendered it redundant, the government decided to adopt a mixed “principle and rule-based approach” to make the DIA evolvable and adaptable to fluid technology trends while aligning with stringent global standards. As previously mentioned, the government is still in consultation with key stakeholders and has released a proposal for the DIA explaining the key principles and changes intended to be introduced in the final version. Some of these are as follows.

3.1        Separate rules for each class of intermediaries: To better regulate various kinds of present and emerging service providers, the DIA intends to discontinue the one-size-fits-all approach. Instead, it will classify intermediaries into e-commerce, digital media, search engines, gaming, AI, ad-tech, OTT platforms, TSPs,[7] etc., and implement separate rules and regulations that place appropriate liability on each category based on its unique technical and functional structure.

3.2       Removal of the safe harbor principle: The principle, currently enshrined in section 79 of the IT Act, exempts intermediaries from liability for third-party content on their platforms, subject to pre-requisites which require the intermediary to act diligently and not to initiate or be directly involved in the transmission of offensive, misleading, obscene, or untrue information. Under the DIA, the government will either revoke safe harbor entirely or impose tough conditions for claiming the exemption, so that intermediaries are more accountable for monitoring and controlling harmful or objectionable content and are compelled to implement effective content moderation and fact-checking mechanisms. Complete removal may be excessively harsh, since intermediaries cannot be expected to be constantly aware of the kind of content shared on their platforms. As a result, they may be implicated unnecessarily, irrespective of their involvement in the publishing of unlawful or offensive content. Instead, ideally, the exemption should be applied on a case-by-case basis, with more stringent pre-requisites for claiming safe harbor.

3.3       Mandate age-gating and a “Do Not Track” option: The DIA will prioritize children’s safety and data privacy, particularly on browsing, social media and gaming platforms. Therefore, it will mandate age-gating by platforms, sites, and technology considered addictive, i.e., accurate verification of a user’s age so that minors are prevented from gaining access to, or viewing, content considered inappropriate or harmful. Additionally, such platforms will have to provide a “Do Not Track” option to minors so they may opt out of cookies being stored by the site or platform on their device. This will ensure the minor’s data is not collected, say for targeted advertising, and will add a layer of data privacy for them.
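To make the “Do Not Track” mechanism concrete, below is a minimal, illustrative sketch, not drawn from the DIA proposal, of how a platform’s server-side logic might honor the existing “DNT: 1” browser header together with an age-gating result before setting a tracking cookie. The is_minor flag and the function name are hypothetical placeholders.

    # Minimal illustrative sketch (Python); not prescribed by the DIA proposal.
    # "DNT" is an existing browser request header ("DNT: 1" signals an opt-out);
    # is_minor is a hypothetical flag set by a platform's age-gating step.

    def should_set_tracking_cookie(request_headers: dict, is_minor: bool) -> bool:
        """Return True only when tracking cookies may be set for this request."""
        if is_minor:
            # Age-gating outcome: never track users verified as minors.
            return False
        if request_headers.get("DNT", "").strip() == "1":
            # The browser sent a Do Not Track signal; honor the opt-out.
            return False
        return True

    if __name__ == "__main__":
        print(should_set_tracking_cookie({"DNT": "1"}, is_minor=False))  # False
        print(should_set_tracking_cookie({}, is_minor=True))             # False
        print(should_set_tracking_cookie({}, is_minor=False))            # True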

3.4       Strict regulation of AI and data privacy: The overarching goal of the DIA is to create a safe, regulated environment online. It will identify and define “high-risk AI systems”[8] and require companies to regularly assess their vulnerabilities and security measures to handle threats such as data leaks, and to implement legal and regulatory safeguards like “algorithmic accountability.”[9] In case of data leaks or exposure to harmful AI-generated content, the company involved will have to provide users with effective, user-friendly reporting mechanisms.

3.5       Ensuring an open internet, accessible to all: In the DIA proposal, the government stressed the need for (i) equal access for all citizens to online content and services, (ii) providing choices and control to internet users over their online experiences, and (iii) promoting fair trade and a competitive environment where startups and small businesses can fairly compete with large corporations, leading to greater innovation. This is rooted in the concept of net neutrality, which stipulates that service providers must treat all data on the internet equally, without favoring or blocking particular products or websites. To address this, the DIA will require that internet service providers do not prioritize their own services over those of competitors and that interoperable[10] platforms be promoted. Service providers and platforms will need to conduct periodic risk assessments and maintain “algorithmic transparency.”[11]

4.        Conclusion

The DIA was proposed as a modern, equitable and effective solution to the redundant IT regulatory regime. Although no draft Bill has been released and there is no clarity on how it will define key terms like “high-risk AI systems” and “algorithmic transparency,” or implement key reforms, the proposal released in 2023 demonstrates the government’s focus on emerging technologies, protection of minors, encouragement of competition, stricter and more effective adjudicatory and penal provisions, and greater accountability. Evidently, user benefit is the prime focus, and the corollary is that service providers, intermediaries, data processors, etc. will have to reassess their internal systems and processes, both regulatory and technical, in light of increased and stricter compliance requirements. All in all, the DIA is a reassuring answer to a change that has been a long time coming. The reformed law presents an opportunity not only to fix gaps in the language and implementation of the existing regulations, but also to introduce concepts that have been recognized and adopted globally. However, its success will depend entirely on the actual text, the speed with which it is passed and, of course, its swift and effective enforcement.

Author

Dylan Sharma

[1] Flagship program launched by the Department of Electronics and Information Technology in July 2015, aiming to digitally empower citizens by making digital infrastructure like high-speed internet, phones, secure cyber space, and cloud services available and accessible to all, especially in rural areas

[2] Ministry of Communications, PIB Delhi, Universal connectivity and Digital India initiatives reaching to all areas including tier-2/3 cities and villages, https://pib.gov.in/PressReleaseIframePage.aspx?PRID=2040566#:~:text=As%20of%20March%202024%2C%20out,having%203G%2F4G%20mobile%20connectivity (last accessed on Nov 14, 2024)

[3] CERT-In, Directions relating to information security practices, procedure, prevention, response and reporting of cyber incidents for safe & trusted internet, No. 20(3)/2022-CERT-In, April 28, 2022

[4] IT (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 through G.S.R 139(E) dated Feb 25, 2021

[5] The Information Security Awareness wing of MeitY has defined this as posting inflammatory, abusive, controversial, offensive, or irrelevant messages to upset or provoke a person to lash out or display emotional responses

[6] Justice K.S. Puttaswamy (Retd.) & Anr. vs. Union of India & Ors., AIR 2017 SC 4161, decided on Aug 24, 2017

[7] Token Service Providers are intermediaries between customers and service providers, facilitating the exchange of sensitive payment information in a secure, unique, tokenized format

[8] Annex III of the EU AI Act defines these as AI systems used in biometrics and identification; the management and operation of critical digital infrastructure, road traffic, and the supply of water, gas, heating, or electricity; education and vocational training; recruitment and management of workers; the evaluation of creditworthiness or credit scores; risk assessment and pricing for life and health insurance; law enforcement; etc.

[9] The process of allocating responsibility to the relevant entities for the consequences of real-world actions influenced by algorithms used in decision-making processes

[10] Ability of an application, device or software to securely and automatically connect, communicate, and exchange data with other such entities

[11] The principle that the factors which influence an algorithm’s decisions must be visible and understandable to users, regulators and anyone affected by them, allowing users to understand how the algorithm works and how it arrives at its outcomes
