tinyllama:1.1b-chat-v1-fp16

2.8M pulls · Updated 1 year ago

The TinyLlama project is an open endeavor to train a compact 1.1B Llama model on 3 trillion tokens.

Tag: 1.1b · Updated 1 year ago

Digest: 71c2f9b69b52 · Size: 2.2GB

Architecture: llama · Parameters: 1.1B · Quantization: F16
{ "stop": [ "<|system|>", "<|user|>", "<|assistant|>", "</s>"
You are a helpful AI assistant.
<|system|> {{ .System }}</s> <|user|> {{ .Prompt }}</s> <|assistant|>
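
Below is a minimal sketch, assuming a local Ollama server on its default port (11434) and the requests package, of how the template and stop parameters above shape a request. The prompt is rendered by hand to mirror the template, so the call uses raw mode; the endpoint and field names are Ollama's standard /api/generate interface, and the example question is purely illustrative.

# Sketch: render the chat template by hand and query the model via Ollama's HTTP API.
import requests

def render_prompt(system: str, prompt: str) -> str:
    # Mirrors the template above: <|system|>, <|user|>, <|assistant|> turns, each closed with </s>.
    return f"<|system|>\n{system}</s>\n<|user|>\n{prompt}</s>\n<|assistant|>\n"

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "tinyllama:1.1b-chat-v1-fp16",
        "prompt": render_prompt("You are a helpful AI assistant.", "Why is the sky blue?"),
        "raw": True,     # the prompt is already templated, so skip server-side templating
        "stream": False,
        "options": {"stop": ["<|system|>", "<|user|>", "<|assistant|>", "</s>"]},
    },
)
print(resp.json()["response"])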

Readme

TinyLlama is a compact model with only 1.1B parameters. Its small size makes it a good fit for applications with a restricted computation and memory footprint.
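
For ordinary chat use there is no need to render the template yourself: Ollama applies it server-side. A minimal sketch follows, assuming the model has already been pulled (ollama pull tinyllama:1.1b-chat-v1-fp16) and a local server is running, using the standard /api/chat endpoint; the question text is illustrative.

# Sketch: chat with the model through Ollama's chat endpoint.
import requests

resp = requests.post(
    "http://localhost:11434/api/chat",
    json={
        "model": "tinyllama:1.1b-chat-v1-fp16",
        "messages": [
            {"role": "system", "content": "You are a helpful AI assistant."},
            {"role": "user", "content": "Summarize TinyLlama in one sentence."},
        ],
        "stream": False,
    },
)
print(resp.json()["message"]["content"])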

References

Hugging Face

GitHub