mixtral:8x7b-instruct-v0.1-q4_K_M
1.3M Downloads · Updated 8 months ago
A set of Mixture of Experts (MoE) models with open weights by Mistral AI, available in 8x7b and 8x22b parameter sizes.
tools · 8x7b · 8x22b
mixtral:8x7b-instruct-v0.1-q4_K_M
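This tag pins a specific quantization (q4_K_M) of the 8x7b instruct build. Below is a minimal sketch of requesting a completion from it through the local Ollama HTTP API; it assumes Ollama is already running on its default port (11434) and that the tag has been pulled, and the prompt text is illustrative only.

package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

// Sketch: ask the pinned tag for a completion via the local Ollama HTTP API.
// Assumes the Ollama server is listening on localhost:11434 and the model
// has already been pulled.
func main() {
	payload, err := json.Marshal(map[string]any{
		"model":  "mixtral:8x7b-instruct-v0.1-q4_K_M",
		"prompt": "Why is the sky blue?",
		"stream": false, // return one JSON object instead of a token stream
	})
	if err != nil {
		panic(err)
	}

	resp, err := http.Post("http://localhost:11434/api/generate",
		"application/json", bytes.NewReader(payload))
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	var out struct {
		Response string `json:"response"`
	}
	if err := json.NewDecoder(resp.Body).Decode(&out); err != nil {
		panic(err)
	}
	fmt.Println(out.Response)
}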
template · c43332387573 · 67B
[INST] {{ if .System }}{{ .System }} {{ end }}{{ .Prompt }} [/INST]
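The template layer uses Go text/template syntax: when a system message is present it is prepended inside the same [INST] ... [/INST] span as the user prompt. The sketch below renders the template as a Go text/template engine would, assuming only .System and .Prompt are supplied (the Ollama server fills these from the request and handles details this snippet omits).

package main

import (
	"os"
	"text/template"
)

// Sketch: render the mixtral prompt template with sample values to show
// the final string sent to the model. Only .System and .Prompt are modeled.
func main() {
	tmpl := template.Must(template.New("mixtral").Parse(
		"[INST] {{ if .System }}{{ .System }} {{ end }}{{ .Prompt }} [/INST]"))

	data := struct {
		System string // optional system message
		Prompt string // user prompt
	}{
		System: "You are a concise assistant.",
		Prompt: "Why is the sky blue?",
	}

	// Prints: [INST] You are a concise assistant. Why is the sky blue? [/INST]
	if err := tmpl.Execute(os.Stdout, data); err != nil {
		panic(err)
	}
}

If .System is empty, the conditional drops it and the rendered prompt is simply "[INST] Why is the sky blue? [/INST]".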