298 Downloads · Updated 1 year ago
722127437881 · 6.7GB
The Mixtral-7Bx2 Large Language Model (LLM) is a pretrained generative Sparse Mixture of Experts model.
@HuggingFace https://huggingface.co/ManniX-ITA/Mixtral_7Bx2_MoE-GGUF
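
A minimal usage sketch with llama-cpp-python, assuming a quantized GGUF file has already been downloaded from the HuggingFace repo above; the local filename and quantization level shown here are assumptions, so adjust them to whichever file you actually pulled.

```python
# Minimal sketch: load the GGUF model locally with llama-cpp-python
# and run a single completion. Path and settings are assumptions.
from llama_cpp import Llama

llm = Llama(
    model_path="./mixtral_7bx2_moe.Q4_K_M.gguf",  # hypothetical local filename
    n_ctx=4096,        # context window; the model's actual limit may differ
    n_gpu_layers=-1,   # offload all layers to GPU if one is available
)

out = llm(
    "Explain what a Sparse Mixture of Experts model is in one sentence.",
    max_tokens=128,
)
print(out["choices"][0]["text"])
```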