notux:8x7b-v1-q4_K_M
25.3K Downloads · Updated 1 year ago
A top-performing mixture of experts model, fine-tuned with high-quality data.
3b16f6f4e58b · 26GB
MIT License
Readme
This model is a fine-tuned version of Mixtral, trained on a high-quality, curated dataset. As of December 26, 2023, it is the top-ranked MoE (Mixture of Experts) model on the Hugging Face Open LLM Leaderboard.
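The tag and page layout suggest this model is distributed through the Ollama registry. Assuming that, and assuming the tag shown above has already been pulled locally (e.g. with `ollama pull notux:8x7b-v1-q4_K_M`), a minimal sketch of querying it with the `ollama` Python client might look like this; the prompt is illustrative:

```python
# Minimal sketch: chat with the quantized notux model through a local Ollama server.
# Assumes the `ollama` Python package is installed and an Ollama server is running.
import ollama

response = ollama.chat(
    model="notux:8x7b-v1-q4_K_M",  # tag from this page; assumed to be pulled locally
    messages=[
        {"role": "user", "content": "Summarize what a mixture-of-experts model is."},
    ],
)

# The chat response exposes the assistant message under "message" -> "content".
print(response["message"]["content"])
```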