granite3.1-dense:2b-instruct-q3_K_M


The IBM Granite 2B and 8B models are text-only dense LLMs trained on over 12 trillion tokens of data, demonstrating significant improvements over their predecessors in performance and speed in IBM’s initial testing.



d65eb61a1e65 · 1.3GB · granite · 2.53B parameters · Q3_K_M
System prompt: Knowledge Cutoff Date: April 2024. You are Granite, developed by IBM.
License: Apache License, Version 2.0, January 2004 (http://www.apache.org/licenses/)

Readme

Granite dense models

The IBM Granite 2B and 8B models are text-only dense LLMs trained on over 12 trillion tokens of data, demonstrating significant improvements over their predecessors in performance and speed in IBM’s initial testing.

They are designed to support tool-based use cases and retrieval-augmented generation (RAG), streamlining code generation, translation, and bug fixing.
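In practice, RAG with a model like this usually means prepending retrieved passages to the prompt before sending it to the model. A minimal sketch in Python, with hard-coded stand-in passages where a real pipeline would query a retriever (the function name and prompt wording are illustrative, not part of Granite or Ollama):

```python
def build_rag_prompt(question: str, passages: list[str]) -> str:
    """Prepend retrieved context passages to the user question (simple RAG prompting)."""
    context = "\n\n".join(f"[{i + 1}] {p}" for i, p in enumerate(passages))
    return (
        "Answer the question using only the context below.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}"
    )

# Stand-in passages; a real pipeline would fetch these from a vector store.
docs = [
    "Granite models are trained on over 12 trillion tokens.",
    "Granite 3.1 dense models come in 2B and 8B parameter sizes.",
]
prompt = build_rag_prompt("How many tokens were the Granite models trained on?", docs)
```

The resulting `prompt` can then be passed to the model as an ordinary generation request.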

Parameter Sizes

2B:

ollama run granite3.1-dense:2b

8B:

ollama run granite3.1-dense:8b
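Beyond the interactive CLI, either size can be queried programmatically through Ollama's local HTTP API (served on port 11434 by default). A minimal sketch in Python using only the standard library; the helper just builds the request payload, so only the commented-out `urlopen` call needs a running Ollama server with the model pulled:

```python
import json
from urllib import request

OLLAMA_URL = "http://localhost:11434/api/generate"  # default local Ollama endpoint

def build_generate_request(model: str, prompt: str) -> dict:
    """Build a JSON payload for Ollama's /api/generate endpoint."""
    return {
        "model": model,
        "prompt": prompt,
        "stream": False,  # return the full response as one JSON object
    }

payload = build_generate_request(
    "granite3.1-dense:2b",
    "Summarize in one sentence: Granite is a family of LLMs from IBM.",
)

# Requires a running Ollama server with the model pulled:
# resp = request.urlopen(OLLAMA_URL, data=json.dumps(payload).encode())
# print(json.load(resp)["response"])
```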

Supported Languages

English, German, Spanish, French, Japanese, Portuguese, Arabic, Czech, Italian, Korean, Dutch, Chinese (Simplified)

Capabilities

  • Summarization
  • Text classification
  • Text extraction
  • Question-answering
  • Retrieval Augmented Generation (RAG)
  • Code related tasks
  • Function-calling tasks
  • Multilingual dialog use cases
  • Long-context tasks including long document/meeting summarization, long document QA, etc.

Granite mixture of experts models

The Granite mixture of experts models are available in 1B and 3B parameter sizes and are designed for low-latency usage.

See model page

Learn more