Sora and the Sloppy Future of AI
Briefly

"In testing, it felt, at most, like a chance to try out recent clip-generation tech for free. Otherwise, the feed was unfocused and confusing, full of overcooked generations by strangers unified only by the drab meta-aesthetics that have come to define AI imagery over the last couple of years: CGI-ish scenery and characters; real places and people rendered as uncanny stock photography; the work of Thomas Kinkade if he got into microdosing mushrooms."
"In hindsight, though, the release of Vibes now makes a little more sense. Meta, which has been spending massive amounts of money to poach talent from other AI firms, probably knew a great deal about what OpenAI was about to release: a new version of its video-generation product, Sora, this time packaged as a TikTok-style app, dropped on Tuesday. In contrast with Vibes, Sora - an invite-only app with limited access - was an instant viral hit."
"What's the difference? Underlying models matter a little bit here. In an apparent rush to get the app out, and lacking better tech of its own, Meta ended up leaning on an outside image-and-video-generation company. Meanwhile, the latest OpenAI model is obviously more capable of producing what you ask it for, whether that's a fairly realistic clip of a real person doing something normal, a jokey visual mash-up - or something stranger or more illicit, despite OpenAI's attempts to include a wide range of guardrails."
Meta launched Vibes, a short-form AI-generated video feed inside the Meta AI app, under the oversight of Alexandr Wang. Early testing produced an unfocused, confusing feed dominated by overcooked generations: CGI-ish scenery, uncanny stock-photo people, and a drab meta-aesthetic. Lacking comparable tech of its own, Meta leaned on an outside image-and-video-generation company. OpenAI then released Sora, a TikTok-style video-generation app that became an instant viral hit. Its latest model produces more accurate, varied, and compelling clips, from realistic human actions to surreal mash-ups. OpenAI included extensive guardrails, but early examples circulating on social media still showed macabre or illicit outputs.
Read at Intelligencer