88 Downloads Updated 9 months ago
57ac56b71f56 · 2.4GB
A new model fine-tuned from the Qwen2.5-3B-Instruct model.
SmallThinker is designed for the following use cases:

- Edge deployment: its small size makes it well suited to resource-constrained devices.
- Draft model for QwQ-32B-Preview: it can serve as a draft model for speculative decoding with the larger QwQ-32B-Preview model.
To achieve strong reasoning capabilities, it is crucial to generate long chain-of-thought (CoT) traces. Therefore, starting from QwQ-32B-Preview, the authors used various synthetic-data techniques (such as PersonaHub) to create the QWQ-LONGCOT-500K dataset. Compared with other similar datasets, over 75% of its samples have outputs exceeding 8K tokens. To encourage research in the open-source community, the dataset was also made publicly available.
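The kind of length filtering described above can be sketched as follows. This is a minimal illustration, not the authors' pipeline: the field names and the whitespace-based token count are assumptions, and a real pipeline would count tokens with the model's tokenizer.

```python
# Sketch: keep only samples whose output exceeds a token threshold,
# in the spirit of the long-CoT selection described above.
# ASSUMPTIONS: samples are dicts with an "output" field; token counting
# is approximated by whitespace splitting (a real pipeline would use
# the model tokenizer instead).

MIN_OUTPUT_TOKENS = 8192  # "exceeding 8K output tokens"

def count_tokens(text: str) -> int:
    # Crude stand-in for a real tokenizer.
    return len(text.split())

def filter_long_cot(samples: list[dict]) -> list[dict]:
    """Return only the samples whose output length meets the threshold."""
    return [s for s in samples if count_tokens(s["output"]) >= MIN_OUTPUT_TOKENS]

if __name__ == "__main__":
    demo = [
        {"prompt": "short question", "output": "brief answer"},
        {"prompt": "hard question", "output": "step " * 9000},  # long CoT trace
    ]
    kept = filter_long_cot(demo)
    print(len(kept))  # only the long-output sample survives
```

With a real tokenizer swapped in for `count_tokens`, the same filter expresses the dataset's "over 75% of samples exceed 8K output tokens" property as a selection criterion.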