
Lex Witness in association with the Trade & Regulatory Compliance Practice Desk at Saikrishna & Associates brings to you a detailed analysis on select updates and notifications.
In a landmark judgement delivered on 26th September 2024, the Bombay High Court struck down the amendment to Rule 3(1)(b)(v) of the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (“IT Rules 2021”) and declared it unconstitutional. The amendment to Rule 3(1)(b)(v), notified on 6th April 2023, required an intermediary to make reasonable efforts, by itself or by causing its users, not to host, display, upload, modify, publish, transmit, store, update or share any information in respect of the business of the Central Government that is identified as fake, false or misleading by a fact check unit (“FCU”) established by the Central Government. The final decision was pronounced pursuant to the opinion of the third judge to whom the matter was referred after the split decision of the Division Bench of the High Court.
By way of brief background, on 10th April 2023, stand-up comedian Kunal Kamra filed a writ petition challenging the constitutionality of the amended Rule 3(1)(b)(v) of the IT Rules 2021. Thereafter, the Editors Guild of India and the Association of Indian Magazines also filed separate writ petitions challenging the amendment. As per the writ petitions, the amended Rule 3(1)(b)(v) violated the right to freedom of speech and expression under Article 19(1)(a), the right to practise any profession under Article 19(1)(g) and the right to equality before the law under Article 14 of the Indian Constitution, and was ultra vires Section 79 of the Information Technology Act, 2000 (“IT Act”).
The matter was heard by a Division Bench of the Bombay High Court, which delivered a split verdict on the constitutionality of the amended rule. Justice G.S. Patel held that the amended Rule 3(1)(b)(v) is unconstitutional under Articles 14, 19(1)(a) and 19(1)(g) of the Constitution of India, as well as ultra vires Section 79 of the IT Act. The second judge, Justice Dr. Neela Gokhale, upheld its validity. In view of this divergence, the matter was referred to a third judge.
Justice A.S. Chandurkar, serving as the reference judge, confined his opinion to the points of divergence between the Division Bench judges’ verdicts. He agreed with Justice Patel’s ruling and affirmed that the right to freedom of speech does not encompass a right to truth, nor does it impose a duty on the State to guarantee that citizens receive only information that is not fake, false or misleading, as determined by the FCU. Since restrictions pertaining to falsity are not recognised under Article 19(2), the amended rule placed unreasonable restrictions on the fundamental right guaranteed under Article 19(1)(a). As regards the violation of Articles 14 and 19(1)(g), Justice Chandurkar sided with the view taken by Justice Patel noted above. Justice Chandurkar further opined that the amended Rule 3(1)(b)(v) was ultra vires the IT Act because it was not presented to Parliament in accordance with the requirements outlined in Section 87 of the IT Act, and because it created substantive law exceeding the authority of the parent statute.
In addition to the above, the decisions also discussed the contours of the expressions “knowingly and intentionally” and “fake or false or misleading” in the context of the amended Rule 3(1)(b)(v). Justice Chandurkar also held that the amended rule could not be saved either by reading it down or on the basis of any concession made in that regard.
Considering the totality of the observations, the amended Rule 3(1)(b)(v) was also found to result in a chilling effect on intermediaries.
After considering the opinion of the third judge, the Division Bench of the Bombay High Court delivered its final judgment on 26th September 2024, declaring the amendment dated 6th April 2023 to Rule 3(1)(b)(v) of the IT Rules 2021 unconstitutional and striking it down.
This judgement striking down the amendment to Rule 3(1)(b)(v) is monumental as it upholds the fundamental rights guaranteed under the Constitution. The IT Act already allows the Government to take action against information found to be inappropriate or unlawful. For instance, an intermediary is required to take down content upon receiving actual knowledge, or on being notified by the Government or its agency, that any information is being used to commit an unlawful act. Further, the Central Government can issue directions for blocking public access to any information if it is satisfied that it is necessary or expedient to do so in the interest of, inter alia, the sovereignty and integrity of India, the security of the State, public order, or for preventing incitement to the commission of any cognizable offence. These measures obviate the need for establishing an FCU to identify false information regarding the business of the Central Government, especially given the lack of any explanation from the Government as to what the expression “business of the Central Government” means.
Having said that, the decision is a setback for the Central Government, which may file an appeal before the Supreme Court challenging it.
On 21st November 2024, the Delhi High Court issued an order directing the Central Government to nominate members to the committee constituted by the Government to address issues pertaining to deepfakes.
To provide context, the Delhi High Court has been hearing two petitions, namely Chaitanya Rohilla v. Union of India [W.P.(C) 15596/2023] and Rajat Sharma v. Union of India [W.P.(C) 6560/2024], filed against the unregulated proliferation of deepfake technology. In these pleas, the petitioners requested the Court to direct the Ministry of Electronics and Information Technology (“MeitY”) to identify and block platforms and mobile apps that facilitate the creation of deepfakes. In October 2024, the Delhi High Court asked the Government to furnish a status report detailing its efforts to tackle deepfakes and to inform the Court about any committees formed to recommend solutions to this issue.
In compliance with the Court’s direction, MeitY submitted a status report dated 21st November 2024 informing the Court that, inter alia, it had in 2023 constituted an ‘Advisory Group on AI for India-Specific Regulatory AI Framework’ along with a sub-committee for the development of AI governance guidelines, and that on 20th November 2024 it had constituted a ‘Committee on matters related to the issue of deepfakes’.
The petitioners submitted that each day’s delay in the detection and removal of deepfakes causes significant hardship to the general public and also requested that their suggestions be considered by the Committee on deepfakes.
The Delhi High Court agreed with the petitioners and directed the Government to nominate members to the Committee on deepfakes. It further directed that the Committee examine and consider the suggestions of the petitioners as well as the regulations and statutory frameworks in other countries, and invite and hear the experiences and suggestions of a few stakeholders such as intermediary platforms, telecommunication service providers, victims of deepfakes and the websites that deploy deepfake technologies. The Court also directed MeitY to submit its report as expeditiously as possible, preferably by February 2025.
There has been a global increase in the use of AI-generated deepfake content, and India is no exception. In the past two years, India has witnessed a significant surge in deepfake cases. As per a McAfee survey, 75% of the Indians online whom it surveyed had viewed deepfake content between 2023 and 2024, and as per McAfee’s Global Festive Shopping Survey, 45% of Indian respondents had been subjected to deepfake shopping scams.
Various cases have been filed before the courts pertaining to the production of deepfake videos of celebrities. Even regulatory bodies have been targeted: the Reserve Bank of India recently issued a press release cautioning the public about fake videos featuring its top officials circulating on social media.
The Government has been cognisant of these instances and has been issuing advisories regularly, particularly to social media intermediaries, citing the provisions of the extant law that would be violated in the creation, dissemination, and hosting of such deepfake videos.
Interestingly, the Government had in 2023 taken steps towards specifically regulating the dissemination of deepfakes. As per a press release dated 23rd November 2023, MeitY had held a meeting on deepfakes at which the Minister of Railways, Communications and Electronics & IT, Mr. Ashwini Vaishnaw, interacted with representatives from academia, industry bodies and social media companies on the need to ensure an effective response to deepfakes. During the meeting, MeitY and the stakeholders identified four pillars on which action needed to be taken, namely (i) ‘Detection’ of deepfake content both before and after it is posted, (ii) ‘Prevention’ of the propagation of deepfake content, (iii) making available an effective and expeditious ‘Reporting’ and grievance redressal mechanism, and (iv) creating mass ‘Awareness’ on the issue of deepfakes. Further, as per the press release, MeitY had also commenced, with immediate effect, the assessment and drafting of the regulations needed to curb deepfakes, and sought public comments on the MyGov portal.
While there were reports in 2024 of the Government’s plans to introduce a “Digital India Bill” that would replace the Information Technology Act, 2000 and regulate AI-generated deepfake videos, neither the Bill nor its draft has seen the light of day. Given the uncertainty around the introduction of the Bill, there is a need to examine the issue of deepfakes and curb the dissemination of misinformation. This need has been acknowledged by the Delhi High Court as well, which has pushed the Government to take tangible steps to understand matters related to deepfakes and to involve all stakeholders, including the intermediaries on whose platforms the content would be published.
Given the involvement and the order of the Delhi High Court, it will be interesting to see how the Committee on deepfakes balances the suggestions of all stakeholders while framing its recommendations, and whether the Government ultimately adopts and acts on the recommendations of the Committee set up by MeitY.
On 6th September 2024, the Department of Pharmaceuticals (“DoP”) issued the Uniform Code for Marketing Practices in Medical Devices, 2024 (“Code”). The Code requires all Medical Device Associations (“MDAs”) to disseminate its provisions to their respective members and ensure strict adherence.
Furthermore, the DoP directed MDAs to establish Ethics Committees for Marketing Practices in Medical Devices. These committees are tasked with uploading the Code onto their associations’ websites, along with detailed procedures for filing complaints. These complaint mechanisms will be linked to the DoP’s Uniform Code for Pharmaceutical Marketing Practices (“UCPMP”) Portal. The Code is self-regulatory in nature and will be implemented through industry associations.
By formulating and notifying the Code, the DoP has recognised the need to regulate medical devices as a sui generis category distinct from pharmaceutical products.
From a consumer protection perspective, the Code is laudable since it, inter alia, mandates that medical devices may be promoted only after the necessary approvals (wherever applicable) have been obtained from the regulatory authority, and imposes a duty on medical device companies to ensure that their marketing claims regarding a medical device are truthful and accurate. To promote consumer awareness, the Code requires MDAs to publish the Code, complaint details and the action taken on such complaints on their respective websites.
The Code also seeks to promote transparency in interactions between the industry and healthcare professionals (“HCPs”) by allowing MDAs to fund hospitality-related expenses for HCPs only where they participate as speakers in continuing medical education, continuing professional training and the like. Further, all MDAs are required to publish on their websites the details of such events, including the expenses incurred, which may be subject to audit.
The institutionalisation of the grievance redressal mechanism through the Ethics Committee and the Apex Committee is also noteworthy. However, the limited punitive powers of the Ethics Committee to enforce the Code are a drawback and are unlikely to discourage MDAs from violating the Code. It remains to be seen whether the Code succeeds in inculcating a culture of compliance in the medical device industry.