# Vikhr 0.5 Release

Added significantly more data to SFT; JSON and multiturn now work more stably on long contexts and hard prompts, and the model's pretraining parameters were slightly tuned.

- [Google Colab](https://colab.research.google.com/drive/1-_BWsJycBm3rEyjpBx2_ejshpemQYHbe?usp=sharing)
- [GGUF](https://huggingface.co/Vikhrmodels/it-5.2-fp16-cp-GGUF)
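
A minimal multiturn usage sketch via the `transformers` chat-template API. The model id below is an assumption inferred from the GGUF repo name above and is not confirmed by this release note:

```python
# Hedged usage sketch: the model id is inferred from the GGUF repo name
# (Vikhrmodels/it-5.2-fp16-cp-GGUF) and may differ from the actual checkpoint.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Vikhrmodels/it-5.2-fp16-cp"  # assumption, see note above

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Multiturn chat goes through the tokenizer's chat template.
messages = [
    {"role": "user", "content": "Who are you?"},
    {"role": "assistant", "content": "I am Vikhr, a Russian-language assistant."},
    {"role": "user", "content": "Answer in JSON with keys 'name' and 'release'."},
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```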

```bibtex
@article{nikolich2024vikhr,
title={Vikhr: The Family of Open-Source Instruction-Tuned Large Language Models for Russian},
author={Aleksandr Nikolich and Konstantin Korolev and Artem Shelmanov},
journal={arXiv preprint arXiv:2405.13929},
year={2024},
url={https://arxiv.org/pdf/2405.13929}
}
```