OpenAIDevs Says GPT-5.3-Codex Is Now Available to All Developers in the Responses API

Original: "GPT-5.3-Codex is now available for all developers in the Responses API."

LLM · Feb 26, 2026 · By Insights AI · 1 min read

What was announced on X

In an official post on 2026-02-24, OpenAIDevs said: "GPT-5.3-Codex is now available for all developers in the Responses API." The post links directly to the model documentation page at developers.openai.com/api/docs/models/gpt-5.3-codex, signaling that this is a production availability update rather than only a preview note.

The timing matters. In OpenAI's earlier launch write-up for GPT-5.3-Codex, the company described API access as something it was working to enable safely. This new X post is therefore a concrete lifecycle milestone: API availability has moved from "soon" to "available now" for all developers using the Responses API.
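For developers, "available in the Responses API" means a single `POST /v1/responses` request with the new model id. The sketch below builds that request shape with only the standard library; the endpoint and field names follow OpenAI's published Responses API, the model id comes from the announcement, and the helper names are illustrative:

```python
import json
import os
import urllib.request

API_URL = "https://api.openai.com/v1/responses"  # Responses API endpoint


def build_request(model: str, prompt: str) -> dict:
    # Minimal Responses API body: the model id from the announcement
    # plus a plain-string `input`.
    return {"model": model, "input": prompt}


def call_responses_api(body: dict) -> str:
    # Sends the request; requires OPENAI_API_KEY in the environment.
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(body).encode(),
        headers={
            "Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode()


body = build_request("gpt-5.3-codex", "Reverse a linked list in Python.")
print(json.dumps(body))
# call_responses_api(body)  # uncomment with a valid API key
```

Teams already on the official SDKs get the same effect by passing the new model id to their existing Responses API call; nothing else in the request surface changes.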

How OpenAI framed GPT-5.3-Codex

OpenAI's product page for GPT-5.3-Codex describes the model as a stronger agentic coding system that combines frontier coding performance with broader reasoning and professional knowledge capabilities. The same write-up states that the model is 25% faster than GPT-5.2-Codex and reports benchmark gains across coding and computer-use evaluations:

  • SWE-Bench Pro (Public): 56.8%
  • Terminal-Bench 2.0: 77.3%
  • OSWorld-Verified: 64.7%

Those numbers are vendor-reported figures from OpenAI's launch material, but they provide useful context for why the API rollout is meaningful for engineering teams choosing a default coding model.

Why this rollout is high-signal for developers

For teams already standardized on the Responses API, this announcement reduces integration friction: the model called out in OpenAI's own benchmark and launch narrative is now directly available through the same API surface many teams already use in production. That usually translates into simpler migration planning, faster A/B evaluation cycles, and clearer cost-performance testing against existing coding-agent stacks.
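Because both the incumbent and the new model sit behind the same API surface, an A/B pass reduces to swapping model ids in one loop. A minimal harness sketch; everything here (`ab_eval`, the stub runner, the pass/fail check) is illustrative, with only the model ids taken from OpenAI's naming:

```python
from dataclasses import dataclass


@dataclass
class ABResult:
    model: str
    passed: int
    total: int


def ab_eval(models: list, prompts: list, run, check) -> list:
    # For each candidate model, run every prompt through `run`
    # (any callable hitting the Responses API, or a stub) and
    # score the outputs with `check`.
    results = []
    for model in models:
        passed = sum(1 for p in prompts if check(p, run(model, p)))
        results.append(ABResult(model, passed, len(prompts)))
    return results


# Stub runner for illustration; swap in a real Responses API call.
stub = lambda model, prompt: prompt.upper()
print(
    ab_eval(
        ["gpt-5.2-codex", "gpt-5.3-codex"],
        ["hello"],
        stub,
        lambda prompt, output: output == prompt.upper(),
    )
)
```

Keeping the runner as an injected callable means the same harness scores a production model, a stub, or a cached transcript, which is what makes side-by-side cost-performance testing cheap to set up.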

Primary sources: X post, model documentation, OpenAI launch write-up.


Related Articles


OpenAI Developers published a March 11, 2026 engineering write-up explaining how the Responses API uses a hosted computer environment for long-running agent workflows. The post centers on shell execution, hosted containers, controlled network access, reusable skills, and native compaction for context management.


© 2026 Insights. All rights reserved.