52.3K downloads · Updated 1 year ago
The Everything Language Model is a Llama 2-based model with a 16k context released by Totally Not An LLM (Kai Howard). It was trained with the EverythingLM Dataset and is uncensored.
ollama run everythinglm
Once loaded, change the context size to 16k:
/set parameter num_ctx 16384
Example:
curl -X POST http://localhost:11434/api/generate -d '{
  "model": "everythinglm",
  "prompt": "Why is the sky blue?",
  "options": {
    "num_ctx": 16384
  }
}'
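The same request can be made from Python using only the standard library. This is a minimal sketch; `build_generate_request` is a hypothetical helper written for this example, not part of Ollama.

```python
import json
from urllib import request

def build_generate_request(model, prompt, num_ctx=16384):
    """Build the JSON body for Ollama's /api/generate endpoint,
    setting num_ctx so the model uses its full 16k context."""
    return json.dumps({
        "model": model,
        "prompt": prompt,
        "options": {"num_ctx": num_ctx},
    })

body = build_generate_request("everythinglm", "Why is the sky blue?")

# Sending the request requires a running Ollama server on localhost:11434;
# the response is streamed back as one JSON object per line.
# req = request.Request(
#     "http://localhost:11434/api/generate",
#     data=body.encode("utf-8"),
#     headers={"Content-Type": "application/json"},
# )
# with request.urlopen(req) as resp:
#     for line in resp:
#         print(json.loads(line).get("response", ""), end="")
```

Passing `num_ctx` in `options` per request has the same effect as the `/set parameter num_ctx 16384` command in the interactive session above.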
13b parameters · Original source: Totally Not An LLM