---
license: mit
datasets:
- Nickyang/ConciseR-Data
language:
- en
metrics:
- accuracy
base_model:
- Qwen/Qwen2.5-Math-7B
pipeline_tag: text-generation
---
<div align='center'>
<h2>Walk Before You Run! <br/>Concise LLM Reasoning via Reinforcement Learning</h2>

[![Paper](https://img.shields.io/badge/arXiv-2505.21178-b31b1b?style=for-the-badge)](https://arxiv.org/abs/2505.21178)
<a href="https://huggingface.co/collections/Nickyang/conciser-6827718942b90a6390db50c1" target="_blank"><img alt="Hugging Face"
src="https://img.shields.io/badge/HuggingFace-fcd022?style=for-the-badge&logo=huggingface&logoColor=000&labelColor"/></a>
</div>

## 🎉 News

- **[2025/05/27]** 🎉 We release [**ConciseR-Zero-7B**](https://huggingface.co/Nickyang/ConciseR-Zero-7B) and [**ConciseR-Zero-7B-Preview**](https://huggingface.co/Nickyang/ConciseR-Zero-7B-Preview).
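
## 🚀 Quick Start

The model can be loaded with the standard 🤗 Transformers text-generation API. The snippet below is a minimal sketch: the prompt wording and the sampling parameters are illustrative assumptions, not an official recipe.

```python
# Minimal sketch: load ConciseR-Zero-7B and sample one solution.
# Prompt format and generation settings here are assumptions for illustration.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Nickyang/ConciseR-Zero-7B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto", device_map="auto")

prompt = (
    "Find the sum of all integer solutions of x^2 - 5x + 6 = 0. "
    "Please reason step by step, and put your final answer within \\boxed{}."
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(
    **inputs,
    max_new_tokens=2048,
    do_sample=True,
    temperature=0.6,
    top_p=0.95,
)
# Print only the newly generated tokens, not the echoed prompt.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```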

## ✨ Key Results

We report Pass@1 accuracy averaged over 32 samples for each problem; a short sketch of how this metric is computed follows the table.

| Model | AIME 2024 | MATH-500 | AMC 2023 | Minerva | Olympiad | Avg. Score |
|-------|-----------|----------|----------|---------|----------|------------|
| Qwen2.5-1.5B-Base | 0.0 | 3.3 | 2.5 | 1.8 | 1.5 | 1.82 |
| Qwen2.5-1.5B-Instruct | 1.3 | 57.5 | 26.2 | 19.4 | 20.3 | 24.9 |
| Qwen2.5-Math-1.5B-Base | 11.3 | 51.7 | 44.0 | 11.3 | 26.0 | 28.9 |
| Qwen2.5-Math-1.5B-Instruct | 12.0 | 74.7 | 26.7 | 35.0 | 37.9 | 37.3 |
| DeepSeek-R1-Distill-Qwen-1.5B | 28.8 | 82.8 | 62.9 | 26.5 | 43.3 | 48.9 |
| DeepScaleR-1.5B-Preview | 43.1 | 87.8 | 73.6 | 30.2 | 50.0 | 56.9 |
| FastCuRL-1.5B-Preview | 43.1 | 88.0 | 74.2 | 31.6 | 50.4 | 57.5 |
| FastCuRL-1.5B-V3 | 49.6 | 90.5 | 78.5 | 34.7 | 54.5 | 61.6 |
| | | | | | | |
| Qwen2.5-7B-Base | 3.3 | 64.6 | 30.0 | 25.7 | 29.0 | 30.5 |
| Qwen2.5-7B-Instruct | 12.3 | 77.1 | 52.8 | 34.9 | 38.7 | 43.2 |
| Qwen2.5-Math-7B-Base | 20.7 | 64.3 | 56.2 | 17.3 | 29.0 | 37.5 |
| Qwen2.5-Math-7B-Instruct | 15.7 | 82.9 | 67.0 | 35.0 | 41.3 | 48.4 |
| Eurus-2-7B-PRIME | 17.8 | 80.1 | 63.0 | 37.5 | 43.9 | 48.5 |
| Open-Reasoner-Zero-7B | 19.7 | 83.9 | 59.5 | 31.6 | 47.6 | 48.5 |
| SimpleRL-Zero-7B | 14.0 | 77.9 | 58.0 | 33.0 | 39.0 | 44.4 |
| SimpleRL-Zero-Math-7B | 22.7 | 76.9 | 62.2 | 30.1 | 39.3 | 46.2 |
| Oat-Zero-7B | 28.0 | 79.4 | 66.2 | 34.4 | 43.8 | 50.4 |
| ConciseR-Zero-7B-Preview (Stage-1) | 42.8 | 83.0 | 73.9 | 31.8 | 45.1 | 55.3 |
| ConciseR-Zero-7B (Stage-2) | 43.3 | 83.0 | 76.7 | 31.5 | 46.0 | 56.1 |
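
The avg@32 Pass@1 reported above is straightforward to compute; here is a minimal sketch (function and variable names are illustrative, not taken from the ConciseR codebase):

```python
# avg@k Pass@1: for each problem, draw k samples (k = 32 above), score each
# sample 0/1 for correctness, average per problem, then average across problems.
def pass_at_1_avg_k(per_problem_scores: list[list[int]]) -> float:
    """per_problem_scores[i] holds 0/1 correctness for the k samples of problem i."""
    per_problem = [sum(scores) / len(scores) for scores in per_problem_scores]
    return 100.0 * sum(per_problem) / len(per_problem)  # reported as a percentage

# Toy example with k = 4 samples on 2 problems: (3/4 + 1/4) / 2 = 50.0
print(pass_at_1_avg_k([[1, 1, 0, 1], [0, 0, 1, 0]]))
```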