
Building upon Mistral Small 3.1 (2503), Mistral Small 3.2 (2506) adds state-of-the-art vision understanding and enhances long-context capabilities up to 128k tokens without compromising text performance. With 24 billion parameters, this model achieves top-tier performance in both text and vision tasks.

Capabilities: vision, tools

ollama run SimonPu/Mistral-Small-3.2:Q4_K_XL
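Beyond the interactive `ollama run` command above, the model can be called over Ollama's local REST API. The sketch below only builds the `/api/chat` request payload, including a base64-encoded image for the vision capability; it assumes a default Ollama install (server on localhost:11434) and uses the same model tag as above. Nothing here is specific to this model beyond the tag.

```python
import base64
import json

# Model tag from the `ollama run` command on this page.
MODEL = "SimonPu/Mistral-Small-3.2:Q4_K_XL"

def build_vision_request(prompt: str, image_bytes: bytes) -> dict:
    """Build a payload for Ollama's /api/chat endpoint.

    Ollama expects images as base64 strings in the message's "images" list;
    the model's vision encoder receives them alongside the text prompt.
    """
    return {
        "model": MODEL,
        "stream": False,
        "messages": [
            {
                "role": "user",
                "content": prompt,
                # base64-encode the raw image bytes (PNG, JPEG, ...)
                "images": [base64.b64encode(image_bytes).decode("ascii")],
            }
        ],
    }

payload = build_vision_request("Describe this image.", b"<raw image bytes>")
print(json.dumps(payload, indent=2))
```

Sending it is then a single POST to `http://localhost:11434/api/chat`, e.g. with `urllib.request` or the `ollama` Python package.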

Applications

Claude Code:
ollama launch claude --model SimonPu/Mistral-Small-3.2:Q4_K_XL

Codex:
ollama launch codex --model SimonPu/Mistral-Small-3.2:Q4_K_XL

OpenCode:
ollama launch opencode --model SimonPu/Mistral-Small-3.2:Q4_K_XL

OpenClaw:
ollama launch openclaw --model SimonPu/Mistral-Small-3.2:Q4_K_XL


Readme

Mistral-Small-3.2-24B-Instruct-2506

Mistral-Small-3.2-24B-Instruct-2506 is a minor update of Mistral-Small-3.1-24B-Instruct-2503.

Small-3.2 improves in the following categories:

- Instruction following: Small-3.2 is better at following precise instructions
- Repetition errors: Small-3.2 produces fewer infinite generations or repetitive answers
- Function calling: Small-3.2's function calling template is more robust (see here and examples)

In all other categories Small-3.2 should match or slightly improve compared to Mistral-Small-3.1-24B-Instruct-2503.
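Since more robust function calling is one of the headline improvements, here is a sketch of what a tool-call request looks like through Ollama's chat API. The `get_weather` tool and its schema are made-up illustrations, not part of the model card; Ollama passes the JSON schema through to the model's function-calling template.

```python
import json

# Model tag from the `ollama run` command on this page.
MODEL = "SimonPu/Mistral-Small-3.2:Q4_K_XL"

def build_tool_request(prompt: str) -> dict:
    """Build a /api/chat payload that advertises one (illustrative) tool."""
    return {
        "model": MODEL,
        "stream": False,
        "messages": [{"role": "user", "content": prompt}],
        "tools": [
            {
                "type": "function",
                "function": {
                    # Hypothetical example tool, not provided by the model.
                    "name": "get_weather",
                    "description": "Look up current weather for a city.",
                    "parameters": {
                        "type": "object",
                        "properties": {
                            "city": {"type": "string"},
                        },
                        "required": ["city"],
                    },
                },
            }
        ],
    }

payload = build_tool_request("What's the weather in Paris?")
print(json.dumps(payload, indent=2))
```

When the model decides to call the tool, the response message carries a `tool_calls` list; you execute the call and feed the result back as a `role: "tool"` message in the next request.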

Key Features