Centre mandates social platforms remove harmful content within 3 hours
What's the story
The Ministry of Electronics and Information Technology (MeitY) has amended the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021. The new regulations bring "synthetically generated information" (SGI), including deepfakes, under their purview. They require intermediaries to ensure that AI-generated or altered content is labeled or identified through visible disclosures or embedded metadata.
Disclosure requirement
Users must be informed if content is synthetically created
The amended IT rules also mandate that users be informed when content has been synthetically created or modified. This comes after industry bodies such as IAMAI, Nasscom, and the Business Software Alliance raised concerns that the draft rules were too broad. They warned the definition could sweep in routine digital edits alongside harmful deepfakes and called for a harm-based approach to defining SGI.
Time limits
Takedown and grievance timelines tightened
The government has sharply reduced the time social media platforms have to act on illegal or harmful content. Earlier, companies had up to 36 hours to remove flagged material; they must now take it down within three hours. Grievance timelines have also been tightened: platforms must acknowledge user complaints within two hours, down from the previous 24-hour limit, and resolve them within seven days instead of 15.
Definition debate
Industry bodies call for harm-based approach in defining SGI
Industry bodies have urged MeitY to narrow the definition of SGI to cover only deceptive or misleading content. The push is part of a wider debate over how to regulate AI-generated content without stifling innovation or creativity in digital media. The new rules are a step toward that balance, but their implementation will be closely watched by stakeholders.