OpenAI has launched GPT-5.3-Codex-Spark, an ultra-fast model designed specifically for real-time coding. It is currently available as a research preview to Pro users through the Codex app, the Codex CLI, and the IDE extension.
Key Features:
- Faster Performance: GPT-5.3-Codex-Spark runs on a shared production stack that complements GPUs for latency-critical workloads.
- Optimized Infrastructure: Improvements include faster response streaming, quicker session initialization, and a restructured inference stack; these changes will roll out to all Codex models in the coming weeks.
- Contextual Capabilities: Codex-Spark is currently a text-only model with a 128k context window; larger models and multimodal input are planned for the future.
- Early Access Program: A select group of API customers has been granted early access to Codex-Spark to experiment with it in their products and provide optimization feedback (see the sketch after this list).
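
For the early-access API group, a request would presumably look like any other model call through the OpenAI Python SDK. The sketch below uses the Responses API; the model identifier `gpt-5.3-codex-spark` is an assumption for illustration, since the announcement does not specify the exact API slug, and access is limited to the early-access cohort.

```python
# Hedged sketch: how an early-access API customer might call Codex-Spark.
# Assumptions (not confirmed by the announcement):
#   - the API model slug is "gpt-5.3-codex-spark"
#   - the account has been granted early access to the model
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.responses.create(
    model="gpt-5.3-codex-spark",  # assumed identifier for illustration
    input="Write a Python function that reverses the words in a sentence.",
)

# The model is text-only with a 128k context window, so prompt plus
# output must fit within that token budget.
print(response.output_text)
```

Since the model targets low-latency, interactive use, streaming the response (passing `stream=True` to the same call) would be a natural fit, though that is a design choice rather than anything the announcement specifies.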
This launch matters for developers who need low-latency, real-time coding assistance. As access expands beyond the research preview and the early-access API group, the latency and infrastructure improvements should carry over into everyday developer workflows.