OpenAI Rolls Back ChatGPT's Model Router System for Most Users

"On a low-profile blog that tracks product changes, the company said that it rolled back ChatGPT's model router-an automated system that sends complicated user questions to more advanced "reasoning" models-for users on its Free and $5-a-month Go tiers. Instead, those users will now default to GPT-5.2 Instant, the fastest and cheapest-to-serve version of OpenAI's new model series. Free and Go users will still be able to access reasoning models, but they will have to select them manually."
"The model router launched just four months ago as part of OpenAI's push to unify the user experience with the debut of GPT-5. The feature analyzes user questions before choosing whether ChatGPT answers them with a fast-responding, cheap-to-serve AI model or a slower, more expensive reasoning AI model. Ideally, the router is supposed to direct users to OpenAI's smartest AI models exactly when they need them."
"In practice, the router seemed to send many more free users to OpenAI's advanced reasoning models, which are more expensive for OpenAI to serve. Shortly after its launch, Altman said the router increased usage of reasoning models among free users from less than 1 percent to 7 percent. It was a costly bet aimed at improving ChatGPT's answers, but the model router was not as widely embraced as OpenAI expected."
OpenAI rolled back the model router for Free and $5 Go users, making GPT-5.2 Instant the default model while keeping manual access to reasoning models. The router had been introduced four months earlier with GPT-5 to route queries between fast, low-cost models and slower, more expensive reasoning models. It increased free-user routing to reasoning models from under 1 percent to 7 percent, raising serving costs. Reasoning models can take minutes to respond and require more compute, and that latency appears to have reduced daily active users, as many consumers prefer speed over longer, costlier answers.
Read at WIRED