Describe the feature or problem you'd like to solve
It would be helpful if /model showed the context window size for each model. Currently, we have to run /model and then /context to see the context window size.
Proposed solution
Example output of /model:
Claude Sonnet 4.6 ✓                                        1x     200K
Claude Sonnet 4.5                                          1x     200K
Claude Haiku 4.5                                           0.33x  200K
Claude Opus 4.6                                            3x     200K
Claude Opus 4.6 (1M context) (Internal only) (default)     6x     1M
Claude Opus 4.5                                            3x     200K
Claude Sonnet 4                                            1x     200K
Goldeneye (Internal Only)                                  1x     200K
GPT-5.4                                                    1x     200K
GPT-5.3-Codex                                              1x     200K
GPT-5.2-Codex                                              1x     200K
GPT-5.2                                                    1x     200K
GPT-5.1                                                    1x     200K
GPT-5.4 mini                                               0.33x  200K
GPT-5 mini                                                 0x     200K
GPT-4.1                                                    0x     200K
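As a rough illustration only (this is not the actual CLI code, and the data structure is hypothetical), the proposed output could be produced by formatting each model entry with its cost multiplier and a human-readable context window label:

```python
def format_models(models, current):
    """Render model entries like the proposed /model output.

    models: list of (name, multiplier, context_window_in_tokens) tuples
    current: name of the currently selected model (marked with a check)
    """
    width = max(len(name) for name, _, _ in models)
    lines = []
    for name, mult, ctx in models:
        mark = "✓" if name == current else " "
        # Show 1M-class windows as "NM", everything else as "NK"
        ctx_label = (f"{ctx // 1_000_000}M" if ctx >= 1_000_000
                     else f"{ctx // 1000}K")
        lines.append(f"{name:<{width}} {mark} {mult}x {ctx_label}")
    return "\n".join(lines)
```

Names, fields, and the marker placement here are assumptions for illustration; the point is simply that the multiplier and context window can be rendered on the same line as the model name.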
Example prompts or workflows
No response
Additional context
No response