
OpenAI's latest upgrade makes its Codex agent think longer, fix better
What's the story
OpenAI has announced an upgrade to its AI coding agent, Codex, built around a new version of GPT-5. The updated model, dubbed GPT-5-Codex, is designed to spend its "thinking" time more efficiently than previous iterations: depending on the complexity of a coding task, it can spend anywhere from a few seconds to as long as seven hours working on it. OpenAI says this dynamic approach has led to improved performance on agentic coding benchmarks.
Accessibility
How to access the new Codex model
The upgraded model is rolling out across Codex products, which are accessible via the terminal, IDEs, GitHub, and ChatGPT. All ChatGPT Plus, Pro, Business, Edu, and Enterprise users can access the new version, and OpenAI says it plans to make the model available to API customers in the future. The move is part of OpenAI's push to compete with rival AI coding products such as Anthropic's Claude Code and Microsoft's GitHub Copilot.
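For developers planning ahead of that API release, the sketch below shows what a call might look like through OpenAI's existing Python SDK once the model is exposed there. This is a minimal, hypothetical example: the model identifier "gpt-5-codex" is an assumption, since OpenAI has not published an API name, and the model is not yet available to API customers per the announcement.

```python
# Hypothetical sketch only: API access is still planned, and the model
# identifier "gpt-5-codex" is an assumption, not a confirmed API name.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Ask the model to review a small snippet, mirroring the code-review
# use case OpenAI describes for Codex.
response = client.responses.create(
    model="gpt-5-codex",  # assumed identifier; not yet available per the article
    input=(
        "Review this change and flag any bugs:\n"
        "def mean(xs):\n"
        "    return sum(xs) / len(xs)"
    ),
)

print(response.output_text)
```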
Performance
Improvements in code reviews and refactoring tasks
OpenAI claims that GPT-5-Codex outperforms its predecessor, GPT-5, on SWE-bench Verified, a benchmark that assesses agentic coding abilities, as well as on a separate benchmark that measures performance on code refactoring tasks drawn from large, established repositories. The company also trained GPT-5-Codex to conduct code reviews and asked experienced software engineers to evaluate the model's review comments. The engineers reportedly found that the new model made fewer incorrect comments while adding more "high-impact comments."
Innovation
The dynamic 'thinking' abilities of GPT-5-Codex
Alexander Embiricos, OpenAI's Codex product lead, said that much of the improved performance comes from GPT-5-Codex's dynamic "thinking" abilities. Unlike a router that decides at the outset how much computing power and time to spend on a problem, GPT-5-Codex can adjust its approach in real time, meaning it can determine five minutes into a problem that it needs to spend more time on it.