Update README.md
README.md

base_model:
- mistralai/Mistral-7B-Instruct-v0.2
---

# MoEstral-2x7B

<img class="center" src="https://target-is-new.ghost.io/content/images/2023/03/230-iskandr_twin_computing_predicting_the_future_7f522425-fb9c-4b17-9eae-82f135f3b90c.png" width="800" />

#### _Are 2 models better than 1?_

MoEstral-2x7B is a Mixture of Experts (MoE) made with the following models using mergekit:
* [mistralai/Mistral-7B-Instruct-v0.2](https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.2)
* [mistralai/Mistral-7B-Instruct-v0.2](https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.2)
## 🧩 Configuration

```yaml
experts:
  # (earlier config lines and the first expert entry are not reproduced here;
  # the source_model line below follows mergekit's MoE expert schema)
  - source_model: mistralai/Mistral-7B-Instruct-v0.2
    positive_prompts: ["reasoning, numbers, abstract"]
```
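
Merges like this are typically produced by pointing mergekit's `mergekit-moe` entry point at a config such as the one above, which writes out a Mixtral-style checkpoint. As a quick sanity check, the expert layout can be read back from the merged model's config; a minimal sketch, with the model id left as a placeholder:

```python
from transformers import AutoConfig

# Placeholder id for the merged model; substitute the actual Hub repo or local path.
config = AutoConfig.from_pretrained("MoEstral-2x7B")

print(config.model_type)           # "mixtral" for a mergekit MoE merge
print(config.num_local_experts)    # 2: one expert per source model listed above
print(config.num_experts_per_tok)  # how many experts the router activates per token
```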

## 💻 Usage
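
A minimal usage sketch, assuming the merge loads as a standard transformers causal LM; the model id below is a placeholder for the actual repo or local path:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

# Placeholder id for the merged model; substitute the actual Hub repo or local path.
model_id = "MoEstral-2x7B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # two fp16 experts; half precision keeps memory manageable
    device_map="auto",
)

# Mistral-Instruct checkpoints ship a chat template, so build the prompt through it.
messages = [{"role": "user", "content": "Are 2 models better than 1?"}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=256, do_sample=True, temperature=0.7)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

Since both experts share the Mistral-Instruct chat template, prompts are formatted through `apply_chat_template` rather than passed as raw strings.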