OpenAI is pushing harder into agentic work, not just chat. On the company's own evals, GPT-5.5 reaches 82.7% on Terminal-Bench 2.0, beats GPT-5.4 by 7.6 points, and uses fewer tokens in Codex.
#codex
OpenAI is pitching GPT-5.5 as more than a routine model refresh. With 82.7% on Terminal-Bench 2.0, 58.6% on SWE-Bench Pro, and a claim that it keeps GPT-5.4-level latency, the company is resetting expectations for long-running coding agents.
This is a distribution story, not just a usage milestone. OpenAI says Codex grew from more than 3 million weekly developers in early April to more than 4 million two weeks later, and it is pairing that demand with Codex Labs plus seven global systems integrators to turn pilots into production rollouts.
The bottleneck moved from GPUs to the API layer, and OpenAI changed the transport to keep up. By adding WebSocket mode and connection-scoped caching to the Responses API, the company says agentic workflows improved by up to 40% end-to-end and GPT-5.3-Codex-Spark reached 1,000 tokens per second with bursts up to 4,000.
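The summary doesn't detail how connection-scoped caching works, but the general idea behind the technique is that a long-lived connection lets the server retain already-sent context, so each agent turn transmits only the new delta instead of the full history. A minimal sketch of that idea, with every name hypothetical and no relation to the actual Responses API:

```python
# Illustrative sketch of connection-scoped caching (all names hypothetical,
# not the real Responses API): context already sent on a connection is
# cached under that connection's id, so later turns send only new messages.

class ConnectionCache:
    """Caches the conversation prefix already sent per connection id."""

    def __init__(self):
        self._store = {}  # connection_id -> list of messages already sent

    def delta(self, connection_id, messages):
        """Return only the messages not yet sent on this connection."""
        cached = self._store.get(connection_id, [])
        # The cached prefix must match exactly; otherwise fall back to
        # resending the full history (e.g. after an edit or reconnect).
        if messages[: len(cached)] == cached:
            new = messages[len(cached):]
        else:
            new = messages
        self._store[connection_id] = list(messages)
        return new


cache = ConnectionCache()
history = [{"role": "user", "content": "list files"}]
first = cache.delta("conn-1", history)   # first turn: full history goes out
history.append({"role": "assistant", "content": "src/ tests/ README.md"})
history.append({"role": "user", "content": "now run the tests"})
second = cache.delta("conn-1", history)  # later turn: only the 2 new messages
```

Pairing a cache like this with a persistent WebSocket is what makes the savings compound: the per-turn payload stays proportional to new tokens rather than to total conversation length, which matters most for exactly the long-running agent loops the item describes.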
OpenAI says more than 3 million developers use Codex each week, and the desktop app is now moving beyond code edits. The update adds background computer use on macOS, an in-app browser, gpt-image-1.5 image generation, 90+ new plugins, PR review workflows, SSH devboxes in alpha, automations, and memory preview.
HN read Codex less as a feature list and more as a permission problem. The thread kept circling desktop agents, non-developer workflows, sensitive files, and whether users really want an AI operating their computer.
OpenAI is turning Codex from a coding workspace into a broader desktop agent. The thread says Codex can use Mac apps, create images, remember work preferences, and connect through 90+ plugins.
GitHub is unpinning third-party coding agents from a single fixed model: Claude and Codex users on github.com can now choose among 4 Anthropic models and 3 OpenAI models when they launch a task. That matters because model choice changes latency, spend, and code quality far more than a small UI toggle suggests.
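To see why a model picker is more than cosmetic, consider per-task spend. The arithmetic below uses entirely made-up prices and token counts (no real model's rates), but it shows how the same task can differ in cost by an order of magnitude depending on which model the toggle selects:

```python
# Illustrative only: per-task cost under hypothetical per-million-token
# prices. None of these numbers are real model rates.

def task_cost(input_tokens, output_tokens, in_price, out_price):
    """Dollar cost of one task, given prices per million tokens."""
    return (input_tokens * in_price + output_tokens * out_price) / 1_000_000


# One coding-agent task: 50k tokens of repo context in, 8k tokens out.
large = task_cost(50_000, 8_000, in_price=15.0, out_price=60.0)  # frontier-tier
small = task_cost(50_000, 8_000, in_price=1.0, out_price=4.0)    # budget-tier
ratio = large / small  # ~15x spend difference from one dropdown choice
```

Multiply that ratio across hundreds of agent launches a day and the dropdown starts to look like a budget line, not a preference.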
Enterprise AI teams are discovering that model quality is only half the problem. OpenAI's Cloudflare Agent Cloud tie-up is about collapsing model access, state, storage, and tool execution into one production path instead of another demo pipeline.
OpenAI said on March 31, 2026 that it closed a $122 billion funding round at an $852 billion post-money valuation. The company tied the raise to faster compute expansion, enterprise growth, and a unified AI superapp strategy spanning ChatGPT, Codex, and broader agent workflows.
OpenAI said on April 10, 2026 that a compromised Axios package touched a GitHub Actions workflow used in its macOS app-signing pipeline. The company says no user data, systems, or software were compromised, but macOS users need updated builds signed with a new certificate before May 8, 2026.