
This model extends Llama-3 70B's context length from 8k to over 1M tokens. [I-Quants]