Note: this model requires Ollama 0.1.40.
DeepSeek-V2 is a strong Mixture-of-Experts (MoE) language model characterized by economical training and efficient inference.
Note: this model is bilingual in English and Chinese.
The model comes in two sizes:

- 16B: `ollama run deepseek-v2:16b`
- 236B: `ollama run deepseek-v2:236b`
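Beyond the CLI, a minimal sketch of querying the model through Ollama's local REST API (assuming the default port 11434 and that the 16b tag has already been pulled):

```
# Sketch: ask the 16B model a question via the local Ollama server.
# Assumes `ollama run deepseek-v2:16b` (or `ollama pull`) has fetched the model.
curl http://localhost:11434/api/generate -d '{
  "model": "deepseek-v2:16b",
  "prompt": "Why is the sky blue?",
  "stream": false
}'
```

With `"stream": false`, the server returns a single JSON object whose `response` field holds the full completion; omit it to receive the tokens as a stream of JSON lines instead.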