Key Takeaways:
- Independent verification shows the viral clip is an AI-edited Yogi Adityanath deepfake rather than an authentic call for Prime Minister Modi to resign.
- Reverse image and video checks found the original footage was a UP assembly speech uploaded by ANI on 25 December 2025.
- Forensic checks found lip movements that did not match the audio track; multiple AI-detection tools flagged the audio as synthetic with roughly 98 percent probability.
- No credible news reports or primary sources corroborate the viral claim, underscoring the need to verify viral political content.
Yogi Adityanath deepfake exposed
A widely circulated video purports to show Uttar Pradesh Chief Minister Yogi Adityanath demanding that Prime Minister Narendra Modi resign if India did not attack Bangladesh. Fact-checking shows the clip is not authentic: it is an AI-edited deepfake that manipulates the audio and lip movements of an original assembly speech to produce a false claim.
Researchers and journalists traced the full, unaltered footage to a video published on the ANI Bharat YouTube channel on 25 December 2025. That recording is of a speech delivered by the chief minister during the winter session of the Uttar Pradesh legislative assembly. In the original speech, Mr Adityanath discusses political tensions and the posture of some groups towards neighbouring countries, but he does not call for military action or demand that the prime minister resign.
How the manipulation was detected
Forensic review of the viral clip highlighted several red flags. Analysts noted irregularities in lip movement that did not align with natural speech patterns seen in the verified assembly video. The audio track also failed to match the chief minister’s known voice profile. Independent AI-detection tools returned consistent results, indicating the audio was synthesised with approximately 98 percent probability.
Journalistic checks against contemporaneous media coverage found no reports quoting Mr Adityanath making the alleged remarks about Prime Minister Modi or advocating an attack on Bangladesh. Multiple credible news outlets that covered the UP assembly speech did not reproduce the viral claim, further supporting the conclusion that the viral clip was manipulated.
Why this matters
Political deepfakes can provoke public anger, inflame interstate tensions, and erode trust in democratic institutions. A doctored clip that appears to show a senior politician calling for violence or the resignation of a national leader can have immediate real-world consequences if amplified on social platforms without verification.
Media outlets, social platforms and the public have a role to play in slowing the spread of false material. Simple checks such as reverse image searches, checking official channels for the original footage, and consulting reliable fact-checkers can quickly reveal inconsistencies. Where technical uncertainty remains, forensic tools that analyse audio and video consistency are useful aids.
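To illustrate one of the checks mentioned above, the sketch below shows the perceptual-hashing idea that underlies frame-level reverse video matching: frames from the same footage hash to nearly identical bit strings even after re-encoding, while unrelated frames do not. This is a minimal, illustrative example only; real verification work uses services such as reverse image search or libraries like imagehash on actual decoded video frames, and the tiny "frames" here are made-up grayscale grids.

```python
def dhash_bits(pixels):
    """Difference hash: emit 1 where each pixel is brighter than its
    left neighbour. `pixels` is a 2D list of grayscale values (rows of
    equal length); a real dHash first resizes the frame to 9x8."""
    bits = []
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits.append(1 if right > left else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two equal-length hashes."""
    return sum(x != y for x, y in zip(a, b))

# Hypothetical frames: frame_b is frame_a with slight brightness noise,
# as re-uploading or re-encoding a clip would introduce; frame_c is
# unrelated footage.
frame_a = [[10, 20, 30, 40], [40, 30, 20, 10], [5, 50, 5, 50]]
frame_b = [[11, 21, 29, 41], [41, 29, 21, 11], [6, 49, 6, 51]]
frame_c = [[90, 10, 80, 20], [10, 90, 20, 80], [70, 70, 70, 70]]

h_a, h_b, h_c = (dhash_bits(f) for f in (frame_a, frame_b, frame_c))
print(hamming(h_a, h_b))  # -> 0: likely the same footage
print(hamming(h_a, h_c))  # -> 6: different footage
```

A small Hamming distance between a viral clip's frames and a verified upload is evidence they share the same source footage, which is exactly what tracing a clip back to an official channel establishes.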
What readers should do
If you encounter similar viral political videos, verify the source before sharing. Look for the original upload on official broadcaster or government channels, compare lip movements and audio against verified footage, and consult reputable fact-checking organisations. Report suspicious content to the platform hosting it and seek updates from established news outlets.
In this case, available evidence clearly shows the claim that Yogi Adityanath demanded Prime Minister Modi step down unless India attacked Bangladesh is false. The viral clip is an AI-generated edit of a legitimate assembly speech and should not be treated as an accurate record of the chief minister’s remarks.