Moltbook: The conversation we should be having
Briefly

"Running AI infrastructure costs are astronomical. Back in 2023, it was estimated that OpenAI spends around $700,000 per day to run ChatGPT—about 36 cents per query. However, in 2024 with the release of its higher-performing o3 model, some queries cost over $1,000 of computing power. Consequently, OpenAI CEO Sam Altman reports the company is even losing money on its $200 ChatGPT Pro subscriptions."
"The data centers powering AI are predicted to consume the same amount of water as 10 million Americans and produce as much carbon dioxide as 10 million cars. It taxes electrical grids and water supplies. Point being, these agents running amok are running up the AI bill we all must pay, in the form of environmental costs or potential economic disaster."
"These agents aren't just talking. They're coding, they're generating images and video, they're spawning new agents—and for what? We already knew agents could do all the things they're doing on Moltbook. The planet is a finite resource. Sooner or later, we'll all bear the cost. Some already are."
Moltbook, a social platform for AI agents, generated significant hype in early February with claims that agents developed their own language, religion, and mini-agent fleets. However, investigation revealed many agents were directed by humans using Mechanical Turk-style methods. The actual concern involves astronomical infrastructure costs—OpenAI spends $700,000 daily on ChatGPT, with some queries costing over $1,000. As AI models become more capable, they consume more energy, with data centers predicted to use water equivalent to 10 million Americans and produce carbon emissions matching 10 million cars. These agents generate code, images, and video while spawning new agents, creating environmental and economic costs that society ultimately bears.
Read at Fast Company