LexUpdate
February 13, 2026 | New Delhi, INDIA
India’s 2026 IT Rules on Synthetic Media: A Structural Shift in Intermediary Liability

If you have questions or would like additional information on the material covered herein, please contact:

Alishan Naqvee, Founding Partner
anaqvee@lexcounsel.in

Amir Shejeed, Associate
ashejeed@lexcounsel.in

India’s 2026 IT Rules on Synthetic Media: A Structural Shift in Intermediary Liability
I. Introduction

The Ministry of Electronics and Information Technology (“MeitY”) notified the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Amendment Rules, 2026 (“IT Amendment Rules 2026”) on 10 February 2026, with effect from 20 February 2026. The amendments introduce a structured compliance framework governing AI-generated and synthetically altered content across digital platforms in India.

Although positioned as a response to the growing risks of deepfakes and online harms, the amendments extend beyond content moderation. They significantly recalibrate the regulatory expectations placed on intermediaries, shifting the framework from a reactive notice-and-takedown model to one requiring proactive oversight, verification, labelling and technical safeguards. In this update, we discuss the key features and provisions of the IT Amendment Rules 2026.

II. Definitions

The IT Amendment Rules 2026 define “audio, visual or audio-visual information” and “synthetically generated information”:

“(wa) ‘synthetically generated information’ means audio, visual or audio-visual information which is artificially or algorithmically created, generated, modified or altered using a computer resource, in a manner that such information appears to be real, authentic or true and depicts or portrays any individual or event in a manner that is, or is likely to be perceived as indistinguishable from a natural person or real-world event;”

Importantly, the IT Amendment Rules 2026 clarify that certain activities undertaken in good faith and without the intention of creating false or misleading records shall not be treated as “synthetically generated information.” These exemptions are significant, as they preserve legitimate editorial, academic and accessibility-related uses of technology. The following activities are excluded from the scope of synthetically generated information:
  1. Routine or Good-Faith Editing and Technical Processing;
  2. Preparation of Documents and Educational or Research Materials; and
  3. Accessibility, Clarity and Discoverability Enhancements.
These exclusions demonstrate a conscious legislative effort to balance the regulation of deceptive synthetic media with the preservation of legitimate technological uses. The regulatory focus remains on deception and material misrepresentation, rather than routine digital processing.

III. Intermediary Obligations

The IT Amendment Rules 2026 significantly expand the due diligence obligations of intermediaries under the existing Rule 3 framework.

i. Quarterly User Disclosures: Intermediaries are now required to inform users at least once every three months, in a simple and effective manner and in English or any Eighth Schedule language, of the consequences of violating platform rules and applicable laws. These disclosures must clearly state that user access may be suspended or terminated, that unlawful conduct may attract civil or criminal liability under applicable statutes, and that certain offences may trigger mandatory reporting to law enforcement authorities. This replaces the earlier annual disclosure requirement and converts compliance communication into a recurring and structured regulatory obligation.

ii. Suo Motu Duty to Act on Unlawful Synthetic Content: The amendments introduce a proactive obligation requiring intermediaries to act where they become aware, whether independently (suo motu), upon actual knowledge, or through a grievance, of synthetically generated information that violates the Rules. In such cases, intermediaries must take expeditious and appropriate action, which may include removal or disabling of access to the offending content, suspension or termination of the user account without vitiating evidence, identification and disclosure of the violating user’s identity to a complainant who is a victim (or acting on behalf of a victim) in accordance with applicable law, and mandatory reporting where the conduct constitutes a reportable offence. This represents a clear departure from a purely notice-based regime and reflects a framework that expects active intervention and regulatory responsiveness from intermediaries.

iii. Mandatory Deployment of Preventive Technical Measures: Intermediaries are required to deploy reasonable and appropriate technical safeguards, including automated tools or equivalent mechanisms, to prevent users from creating or disseminating unlawful AI-generated content. This obligation specifically extends to the prevention of: child sexual exploitative material and non-consensual intimate imagery; obscene, pornographic, paedophilic, vulgar, indecent or sexually explicit content, or content invasive of bodily or personal privacy; false documents or false electronic records; content facilitating explosives, arms or ammunition; and deepfakes or synthetic media that deceptively misrepresent identity, voice, conduct or real-world events. The framework thus establishes a preventive compliance architecture, moving beyond a purely reactive enforcement model.

iv. Mandatory Labelling of Lawful Synthetic Content: Where synthetically generated content is not unlawful but falls within the regulatory definition, intermediaries are required to ensure clear and prominent disclosure. In particular, visual content must carry a prominently visible label, while audio content must include a prefixed disclosure indicating that it is AI-generated. The regulatory objective is therefore transparency rather than prohibition, permitting lawful synthetic content to circulate subject to appropriate contextual warning and user awareness.

v. Embedding of Metadata and Provenance Mechanisms: Intermediaries are required, to the extent technically feasible, to embed permanent metadata or equivalent provenance mechanisms within synthetically generated content. Such metadata must include a unique identifier and information identifying the intermediary’s computer resource used to create, generate, modify or alter the content. This requirement is intended to enhance traceability and accountability, particularly in instances of misuse or regulatory investigation. (An illustrative sketch of how the labelling and provenance duties might be operationalised appears after the timelines table in Section V below.)

vi. Additional Disclosure Obligations for AI/Synthetic Content-Enabling Intermediaries: Intermediaries that provide computer resources enabling or facilitating the creation, generation, modification, alteration, publication, transmission, sharing or dissemination of synthetically generated information are subject to enhanced disclosure requirements. Such intermediaries must inform users that directing or using these tools in contravention of applicable law may attract civil or criminal liability under the Information Technology Act, 2000 and other relevant statutes. They must further disclose that violations may result in immediate removal or disabling of access to content, suspension or termination of user accounts without compromising evidentiary integrity, identification and disclosure of the violating user’s identity to a complainant who is a victim (in accordance with applicable law), and mandatory reporting to competent authorities where the conduct constitutes a reportable offence.

IV. Additional Burdens on Significant Social Media Intermediaries

i. Mandatory User Declaration Prior to Publication: Significant Social Media Intermediaries (“SSMIs”) are required, prior to making any content publicly available, to obtain a declaration from users stating whether the content constitutes synthetically generated information. This requirement introduces a pre-publication compliance checkpoint within the content dissemination process and places an initial disclosure obligation on users.

ii. Verification of User Declarations: SSMIs must deploy appropriate technical measures, including automated tools where necessary, to verify the correctness of user declarations regarding synthetically generated information. The ultimate responsibility for due diligence remains with the SSMI, which must independently verify such declarations through reasonable and proportionate verification measures, having regard to the nature, format and source of the content.

iii. Labelling of Confirmed Synthetic Content: Where a user declaration or technical verification confirms that content constitutes synthetically generated information, SSMIs are required to ensure that such content is clearly and prominently labelled with an appropriate disclosure indicating its synthetic or AI-generated nature.

iv. Deemed Failure of Due Diligence: An SSMI may be deemed to have failed to exercise due diligence where it is established that the intermediary knowingly permitted, promoted or failed to act upon synthetically generated information in contravention of the Rules. This provision creates significant regulatory exposure, particularly where platforms are found to have actual or constructive knowledge of violations, and underscores the heightened compliance expectations applicable to SSMIs.

V. Drastically Reduced Timelines

The IT Amendment Rules 2026 introduce substantial reductions in statutory compliance timelines, significantly increasing operational pressure:
  S. No.   Obligation                    Earlier Timeline   New Timeline
  1.       Lawful takedown direction     36 hours           3 hours
  2.       Grievance disposal            15 days            7 days
  3.       Urgent complaint handling     72 hours           36 hours
  4.       Intimate image removal        24 hours           2 hours
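
For illustration only: the IT Amendment Rules 2026 do not prescribe any technical format for labels or provenance metadata, and actual deployments may rely on industry standards such as C2PA content credentials or watermarking. The minimal Python sketch below (assuming the Pillow imaging library; the function name, the resource_id parameter and the "synthetic-provenance" metadata key are all hypothetical) shows one way a platform might overlay a prominently visible "AI-generated" label and embed a unique identifier together with the generating computer resource, along the lines contemplated by Sections III(iv) and III(v) above.

# Illustrative sketch only -- not a compliance implementation, and not
# prescribed by the Rules. Assumes the Pillow library (pip install Pillow).
# The function name, resource_id parameter and metadata key are hypothetical.
import json
import uuid

from PIL import Image, ImageDraw, PngImagePlugin

def label_and_tag(src_path: str, dst_path: str, resource_id: str) -> str:
    """Overlay a visible 'AI-generated' banner and embed provenance
    metadata (a unique identifier plus the generating computer
    resource) into a PNG image."""
    img = Image.open(src_path).convert("RGB")

    # Visible label -- one possible reading of the "prominently
    # visible" labelling duty for visual content.
    draw = ImageDraw.Draw(img)
    draw.rectangle([(0, 0), (img.width, 28)], fill=(0, 0, 0))
    draw.text((8, 7), "AI-GENERATED CONTENT", fill=(255, 255, 255))

    # Provenance metadata carried in a PNG text chunk: a unique
    # identifier and information identifying the computer resource
    # used to generate the content.
    content_id = str(uuid.uuid4())
    provenance = {
        "content_id": content_id,
        "generating_resource": resource_id,  # hypothetical identifier
        "synthetic": True,
    }
    meta = PngImagePlugin.PngInfo()
    meta.add_text("synthetic-provenance", json.dumps(provenance))

    img.save(dst_path, "PNG", pnginfo=meta)
    return content_id

# Usage (hypothetical paths):
# label_and_tag("generated.png", "published.png", "platform-genai-v2")

A PNG text chunk is, of course, an easily stripped carrier; the Rules' reference to "permanent" metadata suggests that more robust mechanisms, such as cryptographically signed provenance manifests, may be expected in practice, a point the amendment leaves to platform-level implementation.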
VI. Analysis and Conclusion

The IT Amendment Rules 2026 mark a significant step toward structured AI governance in India’s digital ecosystem. By introducing proactive monitoring duties, mandatory labelling of lawful synthetic content, embedded provenance mechanisms and sharply reduced compliance timelines, the framework materially expands intermediary obligations.

While earlier draft iterations contemplated more granular technical parameters for AI disclosures, the final amendment adopts a broader, principles-based labelling standard, leaving implementation to platform-level execution and potentially resulting in varied compliance approaches. The compression of statutory timelines for takedown compliance and grievance redressal to a matter of hours is likely to impose considerable operational strain, particularly on platforms handling large volumes of user-generated content.

Although the amendments clarify that intermediaries will not lose safe harbour protection under Section 79 of the IT Act merely for removing content or deploying reasonable technical measures, the shift from requiring platforms to “endeavour” to deploy safeguards to mandating that they “deploy appropriate technical measures” significantly raises the compliance threshold and reinforces that intermediary immunity remains conditional upon strict adherence to the revised due diligence framework.

Disclaimer: LexCounsel provides this e-update on a complimentary basis solely for informational purposes. It is not intended to constitute, and should not be taken as, legal advice, or a communication intended to solicit or establish any attorney-client relationship between LexCounsel and the reader(s). LexCounsel shall not have any obligations or liabilities towards any acts or omission of any reader(s) consequent to any information contained in this e-newsletter. The readers are advised to consult competent professionals in their own judgment before acting on the basis of any information provided hereby.

HEAD OFFICE (DELHI): B-4/232, Safdarjung Enclave, New Delhi, 110029, India

CHANDIGARH OFFICE: House No-81, Sector-4, Mansa Devi complex, Panchkula Haryana, 134114, India

PRAYAGRAJ OFFICE: 18 MIG Flat, Lajpat Rai Road (Near Tripathi Crossing), Mumfordganj, Prayagraj. Pin – 211002, India 

ODISHA OFFICE: D-36, Defence (AWHO) Colony Padmapani Vihar, Niladri Vihar Bhubaneswar, Odisha -751 021, India

KOLKATA OFFICE: Saket Shree, 39A, Jorapukur Square Lane (Behind Girish Park), Room # 205, Kolkata- 700006, WB, India

AHMEDABAD OFFICE: 706, 7th Floor, Parkview Nexus, Behind AUDI Car Showroom, S.G. Highway, Sola, Ahmedabad – 380060, India

