
Draft intermediary amendments: Bugle call for AI governance in India?

03 November 2025

by Sameer Avasarala

The Ministry of Electronics and Information Technology’s (‘MEITY’) release of the Draft Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Amendment Rules, 2025 (‘SGI Amendment’) is another step in regulating aspects of the internet through the ‘intermediary’ route under the Information Technology Act, 2000 (‘IT Act’). The SGI Amendment follows an earlier set of advisories, including the MEITY Advisory dated 15 March 2024 (‘MEITY Advisory’), which proposed similar but broader requirements relating to labelling and embedding of identifiers in synthetically created information, alongside other measures such as disclaimers on the unreliability of models, protection against bias and discrimination, and consent pop-ups.

The SGI Amendment proposes to introduce obligations on intermediaries that enable or facilitate the creation, generation, modification or alteration of ‘synthetically generated information’ (‘SGI’), which is defined to include “information which is artificially or algorithmically created, generated, modified or altered using a computer resource, in a manner that such information reasonably appears to be authentic or true”.[1]

A perusal of the definition (with particular reference to information that reasonably appears to be authentic or true), as well as the Explanatory Memorandum,[2] makes it clear that the intention is to regulate challenges arising from the growing misuse of SGI, including deepfakes, misinformation and other unlawful content. While such misuse typically involves audio-visual content, the definition draws no such limitation, making it equally applicable to non-audio and non-visual content (such as text).

Obligations of Intermediaries

The SGI Amendment requires intermediaries which offer computer resources that may enable or facilitate creation of synthetically generated information to undertake the following, as part of their due diligence measures:

  • Labelling: Intermediaries must ensure[3] that SGI is prominently labelled or embedded with a permanent unique metadata or identifier, visibly displayed or made audible (as the case may be) in a prominent manner on or within such content. The label, metadata or identifier must make it immediately identifiable that the content is SGI and must:
      • Cover at least 10% (ten percent) of the surface area of a visual display; or
      • Occur within the initial 10% (ten percent) of the audio duration.


Apart from the practical difficulties in applying such labels across different content types, and considering that the ambit of SGI includes content modified or altered using algorithmic or AI-based means, even innocuous modifications or alterations (such as filters) may become subject to the labelling requirements outlined under the SGI Amendment.
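The 10% thresholds above reduce to simple arithmetic. As a purely illustrative sketch (not drawn from the SGI Amendment itself, assuming the thresholds are computed on raw pixel area and total running time; the function names are hypothetical):

```python
def min_label_area(width_px: int, height_px: int) -> int:
    """Minimum label size in pixels: 10% of the visual surface area."""
    return (width_px * height_px) // 10

def audio_notice_deadline(duration_s: float) -> float:
    """Latest point (in seconds) by which the audible identifier must
    have occurred: within the initial 10% of the audio duration."""
    return duration_s * 0.10

# For a 1920x1080 video frame, the label must cover at least 207,360 px;
# for a 120-second clip, the audible notice must fall in the first 12 s.
```

How the 10% of “surface area” is to be measured for resized or cropped content is one of the open practical questions noted above.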

  • Suppression of labels: Intermediaries which offer computer resources that facilitate the creation of SGI must also not enable the modification, suppression or removal of the labels, metadata or identifiers used to mark content as SGI.

In this regard, the SGI Amendment clarifies that intermediaries that have knowingly permitted, promoted or failed to act upon SGI in contravention of these rules will not be extended the ‘safe harbour’[4] protection under the Information Technology Act, 2000.

Obligations of Significant Social Media Intermediaries

In addition to the general obligations applicable to intermediaries, Significant Social Media Intermediaries (‘SSMIs’) are obligated to ensure:

  • User Declarations: SSMIs which enable the uploading or publishing of information must take appropriate measures to enable users to declare whether content uploaded or published by them contains SGI. Where a declaration confirms such use, the SSMI must ensure compliance with the labelling and notice requirements indicating that the content is synthetically generated.


  • Technical Measures: SSMIs are required to deploy[5] reasonable and appropriate technical measures, including automated tools and other mechanisms, to verify the accuracy of such declarations, having regard to the nature, format and source of the information. Their responsibility extends to verifying the correctness of such declarations and ensuring that no SGI is published without one.


The efficacy of such measures must be examined in light of the volume and scale of content uploads handled by SSMIs. Further, any automated tools deployed for content tagging or identification linked to user declarations must balance the SSMI’s interest in retaining safe harbour protection against the need for reasonable content moderation.

The SGI Amendment marks a significant step towards regulating harms arising from the use of artificially generated content presented as true or accurate information. While the measures proposed under the SGI Amendment address concerns specific to such content, earlier guidance (such as the MEITY Advisory) recommended a broader set of safeguards. Although the SGI Amendment represents a decisive move within a series of regulatory developments aimed at mitigating risks associated with generative AI content, it remains unclear whether it can be regarded as the first step towards a comprehensive framework for regulating the use and deployment of artificial intelligence in India.

[The author is a Principal Associate in Technology Law Team at Lakshmikumaran & Sridharan Attorneys, Hyderabad]

 

[1] Rule 2(1)(wa), SGI Amendment, 2025.

[2] Explanatory Note, Proposed Amendments dated 22 October 2025.

[3] Rule 3(3), SGI Amendment, 2025.

[4] Section 79, Information Technology Act, 2000.

[5] Rule 4(1A), SGI Amendment, 2025.
