Cybersecurity – Delhi High Court Signals Relief for Shashi Tharoor in Deepfake Dispute
The Delhi High Court on Friday indicated that it would issue interim protection in favour of Congress MP Shashi Tharoor in a case related to alleged deepfake videos and manipulated online content falsely portraying him as making remarks supportive of Pakistan.

The matter was heard by a single-judge Bench of Justice Mini Pushkarna, which issued notices to multiple respondents, including the Central government and social media companies such as Meta and X. The court directed them to submit their responses within four weeks.
Court Considers Interim Safeguards
During the proceedings, the High Court observed that interim directions would be granted in line with several requests made by Tharoor in his injunction application. The Congress leader has sought immediate measures to prevent further circulation of the disputed content and to safeguard his personality and publicity rights.
Senior advocate Amit Sibal, representing Tharoor, argued before the court that fabricated videos using artificial intelligence technology had been circulated online with politically sensitive and misleading claims attached to the parliamentarian.
According to the submissions, the videos falsely showed Tharoor making statements praising Pakistan, which the legal team claimed posed a serious threat to both his public reputation and India’s international image.
Repeated Uploads Raise Concern
The court was informed that despite multiple complaints filed under the Information Technology Rules and repeated requests made to online platforms, the disputed material continued to reappear through newly created web links.
Sibal told the Bench that the same manipulated videos were repeatedly being uploaded under different URLs even after earlier versions had been flagged or removed. He also pointed out that independent fact-checking organisations had already identified the videos as fake, yet sections of the public continued to believe the clips were genuine.
The senior lawyer further argued that misuse of a prominent public figure’s identity through AI-generated material could create wider diplomatic and political consequences.
Concerns Over Reputation and Public Perception
While presenting arguments, Sibal stated that Tharoor’s public statements carry influence due to his political stature and previous experience as a Union minister. He argued that the alleged deepfake campaign had distorted the leader’s identity and public image in a manner that could negatively affect perceptions both within India and abroad.
The legal team also expressed concern that such manipulated material could potentially be exploited by foreign entities or governments as part of organised misinformation efforts designed to damage Tharoor’s patriotic credentials and shape public opinion through false narratives.
Social Media Platforms Respond
Counsel appearing for Meta, one of the social media intermediaries, informed the court that several URLs highlighted in the petition were no longer accessible on its platforms. However, Tharoor's side maintained that similar versions of the videos continued to surface online through fresh links and reposts.
The petition filed before the Delhi High Court seeks protection against the alleged unauthorised use of Tharoor’s name, image, likeness, and identity on digital platforms, including AI-generated and morphed content.
Growing Number of Celebrity Deepfake Cases
The case adds to a growing trend of public personalities approaching courts over misuse of identity and artificial intelligence-based impersonation. In recent months, several well-known figures from politics, cinema, sports, and entertainment have sought legal protection against unauthorised digital reproductions and fake online content.
Among those who have previously moved the Delhi High Court in similar matters are former cricketers Gautam Gambhir and Sunil Gavaskar, spiritual leader Sri Sri Ravi Shankar, actors Kajol and Aishwarya Rai Bachchan, and filmmaker Karan Johar.