The Future of AI Filmmaking Is a Parody of the Apocalypse, Made by a Guy Named Josh
Briefly

"The filmmaker could not get Tiggy the alien to cooperate. He just needed the glistening brown creature to turn its head. But Tiggy, who was sitting in the passenger's seat of a cop car, kept disobeying. At first Tiggy rotated his gaze only slightly. Then he looked to the wrong side of the camera. Then his skin turned splotchy, like an overripe fruit."
"He'd used a different AI tool, Midjourney, to generate the very first image of Tiggy (prompt: "fat blob alien with a tiny mouth and tiny lips"); one called ElevenLabs to create the timbre of Tiggy's voice (the filmmaker's voice overlaid with a synthetic one, then pitch-shifted way up); and yet another called Runway to describe the precise shot he wanted in this scene."
An independent filmmaker used multiple AI tools to create and animate an alien character named Tiggy, generating the visuals, voice, and shots from home. Image models produced inconsistent results—incorrect gaze, distorted anatomy, splotchy textures, and unintended amphibian features—forcing numerous regenerations. The voice was built by layering the filmmaker's own voice with a synthetic one and pitch-shifting the result. Safety filters and content rules blocked requests like "short shirtless alien," triggering errors and workarounds. The creative process involved iterative prompting across platforms (Midjourney, ElevenLabs, Runway, FLUX Kontext) and highlighted both the playful experimentation and the technical limitations of current consumer AI tools.
Read at WIRED