GPT-5.3-Codex being routed to GPT-5.2


What version of Codex CLI is running?

codex-cli 0.99.0

What subscription do you have?

ChatGPT Pro

Which model were you using?

gpt-5.3-codex

What platform is your computer?

Linux 6.6.87.2-microsoft-standard-WSL2 x86_64 x86_64

What terminal emulator and version are you using (if applicable)?

No response

What issue are you seeing?

Both config.toml and the TUI are set to gpt-5.3-codex, but the output and SSE captures show that the model actually serving responses is gpt-5.2-2025-12-11.

What steps can reproduce the bug?

1. Set both config.toml and the TUI to gpt-5.3-codex
2. Run RUST_LOG='codex_tui::chatwidget=info,codex_api::sse::responses=trace' codex
3. Send a prompt
4. Check log/codex-tui.log: the response.created event reports response.model as gpt-5.2-2025-12-11

What is the expected behavior?

To be served by gpt-5.3-codex, and to be notified when I am being rerouted to another model!

Additional information

I passed the verification process here: https://chatgpt.com/cyber
Thread ID: 019c50de-5329-7261-b30c-ac212608cab7

This is seriously affecting my productivity.
I'm paying for the Pro plan, which includes access to 5.3; my config is set to 5.3 and the UI shows 5.3, but I'm getting 5.2.
I've already lost time working under the wrong model without knowing it.
If I'm not notified, I have no agency over my own workflow.
I'm making decisions, reviewing code, and building on output that I believe comes from 5.3, when it's actually 5.2.
By the time I discover the switch, I've already committed time and work I can't get back.
If you need to reroute me, at minimum tell me so I can decide whether to continue, pause, or escalate to support.