WIRED found over 100 YouTube channels using AI to create lazy fan-fiction-style videos. The videos are obviously fake, but there's a psychological reason people are falling for them.
Except Mark Wahlberg hasn't been a guest on The View since 2015. The inevitable twist? None of this happened in reality, but rather unfolded over the course of a 25-minute-long fan-fiction-style video, made with the magic of artificial intelligence to potentially fool 460,000 drama-hungry viewers.

It entertains its audience simply with an AI voice-over narrating an LLM-written script laden with clichés as theatrical as "fist-clenching" and "jaw wobbling." It's cheap, lazy—the very definition of slop—but somehow the channel it's hosted on, Talk Show Gold, has managed to round up over 88,000 subscribers, many of whom express complete disbelief when eventually informed by other commenters that what they are watching is "fake news."

"These videos are what we might call 'cheapfakes' rather than deepfakes, as they're cobbled together from a motley selection of real images and video clips, with a basic AI voice-over and subtitles," explains Simon Clark, a cognitive psychologist at the University of Bristol who specializes in AI-generated misinformation.

It's obvious who the right-leaning, older audience is primed to relate to—who serves as the visual fic's Mary Sue. There's an undeniable political element at play in who is targeted, with videos focusing exclusively on political figures constituting their own subgenre.

For Clark, the talk-show videos are a clear appeal to moral outrage—an emotion that makes audiences more likely to engage with, and spread, misinformation. "It's a great emotion to trigger if you want engagement. If you make someone feel sad or hurt, then they'll likely keep that to themselves."