notux:8x7b-v1-q3_K_M

25.3K pulls · Updated 1 year ago

A top-performing mixture of experts model, fine-tuned with high-quality data.

8x7b · 1 year ago

20bdbdb43962 · 20GB

model
llama · 46.7B · Q3_K_M

template
[INST] {{ .System }} {{ .Prompt }} [/INST]

license
MIT License Copyright (c) [year] [fullname] Permission is hereby granted, free of charge, to any p…

params
{ "stop": [ "[INST]", "[/INST]" ] }

Readme

This model is a fine-tuned version of Mixtral using a high-quality, curated dataset. As of December 26, 2023, it is the top-ranked MoE (Mixture of Experts) model on the Hugging Face Open LLM Leaderboard.
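
To try the model locally, it can be pulled with `ollama pull notux:8x7b-v1-q3_K_M` and queried through Ollama's HTTP API. The sketch below assumes an Ollama server listening on the default port 11434; the template and stop parameters shown above are applied by Ollama automatically.

```python
import requests

# Assumes `ollama pull notux:8x7b-v1-q3_K_M` has already been run and the
# Ollama server is listening on the default port 11434.
resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "notux:8x7b-v1-q3_K_M",
        "prompt": "Explain what a mixture-of-experts model is.",
        "stream": False,
    },
    timeout=300,
)
resp.raise_for_status()
print(resp.json()["response"])
```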

References

HuggingFace

Argilla