FALSE
Multiple "cheapfake" videos - using simple editing techniques like clipping, speed manipulation, and recontextualization - misrepresented PM Modi's statements in 2025. Unlike sophisticated deepfakes, these required no AI, just basic video editing. BOOM Live and Alt News documented and debunked numerous examples throughout the year.
While attention often focuses on AI-generated deepfakes, 2025 saw a proliferation of "cheapfakes" targeting PM Modi - manipulated videos made with simple editing techniques. These included clips taken out of context, slowed-down footage suggesting impairment, sped-up clips that altered meaning, and audio dubbed over genuine footage. Indian fact-checkers documented dozens of such manipulations and found that the low-tech approach often spread more effectively than sophisticated AI fakes, partly because cheapfakes are faster to produce and harder to detect.
What Are Cheapfakes?
Unlike deepfakes requiring AI, cheapfakes use simple editing: clipping quotes, changing speed, adding false subtitles, or combining unrelated audio and video [1].
These techniques are accessible to anyone with basic video editing software, making them far more prevalent than AI deepfakes [9].
2025 Examples
BOOM Live documented clips that stripped context from Modi's speeches, making statements appear to mean the opposite of what was actually said [1].
Alt News exposed videos with manipulated audio, where different speech was dubbed over footage [2].
Detection Methods
Fact-checkers use reverse video search, comparing clips to original broadcasts and official recordings [3].
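One building block behind this kind of clip-to-original comparison is perceptual hashing, where frames that look alike produce similar fingerprints even after re-encoding or brightness changes. The sketch below is illustrative only - it is not any specific fact-checker's pipeline - and implements a difference hash (dHash) over a grayscale frame represented as a plain 2D list of pixel values:

```python
def dhash(gray, hash_w=8, hash_h=8):
    """Difference hash of a grayscale frame (2D list of pixel values).

    Downsamples to (hash_w+1) x hash_h by nearest neighbour, then sets one
    bit per adjacent-pixel comparison. Matching frames from a clip and the
    original broadcast should produce identical or near-identical hashes,
    even if one copy was re-encoded or uniformly brightened.
    """
    h, w = len(gray), len(gray[0])
    small = [
        [gray[y * h // hash_h][x * w // (hash_w + 1)] for x in range(hash_w + 1)]
        for y in range(hash_h)
    ]
    bits = 0
    for row in small:
        for a, b in zip(row, row[1:]):
            bits = (bits << 1) | (1 if a < b else 0)
    return bits


def hamming(a, b):
    """Number of differing bits between two hashes (0 = likely same frame)."""
    return bin(a ^ b).count("1")
```

Because dHash compares each pixel only to its neighbour, a uniformly brightened or darkened copy of the same frame hashes identically; a genuinely different frame yields a large Hamming distance. Real tools apply this per-frame across whole videos to locate which segment of an original broadcast a viral clip was cut from.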
Audio analysis can reveal dubbing, while metadata examination often shows editing traces [7].
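As one concrete example of metadata examination, an MP4 file's movie header ("mvhd" box) records the video's timescale and duration; a slowed-down or sped-up re-encode will rarely match the original's duration. The parser below is a minimal sketch for well-formed files, assuming the standard ISO BMFF mvhd layout, and is not a substitute for a full container parser:

```python
import struct


def mvhd_duration_seconds(data: bytes) -> float:
    """Extract playback duration from an MP4's 'mvhd' (movie header) box.

    Comparing this against the original recording's duration is a quick
    way to spot speed manipulation. Handles version 0 (32-bit fields)
    and version 1 (64-bit creation/modification/duration) headers.
    Sketch only: a naive byte search, not a full ISO BMFF parser.
    """
    idx = data.find(b"mvhd")
    if idx == -1:
        raise ValueError("no mvhd box found")
    payload = data[idx + 4:]
    version = payload[0]
    if version == 0:
        # version+flags(4) creation(4) modification(4) timescale(4) duration(4)
        timescale, duration = struct.unpack(">II", payload[12:20])
    else:
        # version+flags(4) creation(8) modification(8) timescale(4) duration(8)
        timescale = struct.unpack(">I", payload[20:24])[0]
        duration = struct.unpack(">Q", payload[24:32])[0]
    return duration / timescale
```

A clip slowed to half speed, for instance, would report roughly double the original's duration here, which is a red flag worth checking before any deeper frame-level analysis.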
Conclusion
Cheapfakes targeting Modi in 2025 demonstrate that low-tech manipulation remains a major misinformation vector. These techniques predate AI and remain effective because they are quick to produce and, since the underlying footage is genuine, can be harder to flag than obviously synthetic deepfakes. Always verify political clips against original sources before sharing.