Use MGM to represent Mini-Gemini
README.md (changed)

- yi
- generation
datasets:
- YanweiLi/MGM-Instruction
---

# MGM-34B-HD Model Card
<a href='https://github.com/dvlab-research/MGM'><img src='https://img.shields.io/badge/Project-Code-violet'></a>
<a href='https://mini-gemini.github.io/'><img src='https://img.shields.io/badge/Project-Page-Green'></a>
<a href='https://arxiv.org/pdf/2403.18814.pdf'><img src='https://img.shields.io/badge/Paper-Arxiv-red'></a>

## Model details
The framework supports a series of dense and MoE Large Language Models (LLMs) from 2B to 34B, and enables HD image understanding, reasoning, and generation simultaneously.
You can also try our other MGM series models:

Normal resolution setting: [MGM-2B](https://huggingface.co/YanweiLi/MGM-2B), [MGM-7B](https://huggingface.co/YanweiLi/MGM-7B), [MGM-13B](https://huggingface.co/YanweiLi/MGM-13B), [MGM-8x7B](https://huggingface.co/YanweiLi/MGM-8x7B), [MGM-34B](https://huggingface.co/YanweiLi/MGM-34B)

High resolution setting: [MGM-7B-HD](https://huggingface.co/YanweiLi/MGM-7B-HD), [MGM-13B-HD](https://huggingface.co/YanweiLi/MGM-13B-HD), [MGM-8x7B-HD](https://huggingface.co/YanweiLi/MGM-8x7B-HD)
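
To fetch the weights for this card locally, the checkpoints listed above can be pulled with `huggingface_hub`. The snippet below is a minimal sketch rather than part of the original card: the repo id `YanweiLi/MGM-34B-HD` is inferred from the naming of the sibling checkpoints, and the target directory is arbitrary.

```python
# Minimal download sketch using huggingface_hub (install it separately if needed).
# Assumptions: the checkpoint lives at "YanweiLi/MGM-34B-HD" (inferred, not stated
# in this card) and "./checkpoints/MGM-34B-HD" is just an example local path.
from huggingface_hub import snapshot_download

local_dir = snapshot_download(
    repo_id="YanweiLi/MGM-34B-HD",
    local_dir="./checkpoints/MGM-34B-HD",
)
print(f"Checkpoint files placed in {local_dir}")
```

The training and inference code that consumes these weights is maintained in the GitHub repository linked at the top of this card.
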
**Model type:**
MGM is an open-source chatbot trained by fine-tuning Nous-Hermes-2-Yi-34B on GPT-generated multimodal instruction-following data.

It empowers existing frameworks to support HD image understanding, reasoning, and generation simultaneously.
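
A quick way to sanity-check a downloaded checkpoint against the model type described above is to read its configuration file. This is a minimal sketch, assuming the repo follows the usual `transformers` layout with a top-level `config.json`; the exact fields present depend on how the checkpoint was exported.

```python
# Peek at the checkpoint configuration without downloading the full weights.
# Assumptions: repo id "YanweiLi/MGM-34B-HD" (inferred from the sibling models above)
# and a standard transformers-style "config.json" at the repo root.
import json
from huggingface_hub import hf_hub_download

cfg_path = hf_hub_download(repo_id="YanweiLi/MGM-34B-HD", filename="config.json")
with open(cfg_path) as f:
    cfg = json.load(f)

# Typical fields of interest; missing keys simply print as None.
print("model_type:", cfg.get("model_type"))
print("hidden_size:", cfg.get("hidden_size"))
print("vocab_size:", cfg.get("vocab_size"))
```

For end-to-end image-text inference, follow the instructions in the GitHub repository linked above.
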
**Model version:**
MGM HD version with Nous-Hermes-2-Yi-34B as the LLM.

**Model date:**
MGM-34B-HD was trained in March 2024.

## License
Nous-Hermes-2-Yi-34B is licensed under the Apache-2.0 license.

**Where to send questions or comments about the model:**
https://github.com/dvlab-research/MGM/issues

## Intended use
**Primary intended uses:**
The primary use is research on large multimodal models and chatbots.

**Primary intended users:**
The primary intended users of the model are researchers and hobbyists in computer vision, natural language processing, machine learning, and artificial intelligence.

## Training data
This model is trained on the [MGM-Instruction](https://huggingface.co/datasets/YanweiLi/MGM-Instruction) dataset; please refer to the [GitHub repository](https://github.com/dvlab-research/MGM) for more details.
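
For anyone who wants to inspect the finetuning data, the dataset repo can be fetched the same way as the weights. This is a minimal sketch, not from the original card: the target directory is an arbitrary example, and the file layout inside the dataset repo is documented on GitHub rather than here.

```python
# Minimal sketch: download the MGM-Instruction dataset repo for local inspection.
# Assumption: "./data/MGM-Instruction" is just an example destination path.
from huggingface_hub import snapshot_download

data_dir = snapshot_download(
    repo_id="YanweiLi/MGM-Instruction",
    repo_type="dataset",
    local_dir="./data/MGM-Instruction",
)
print(f"Instruction data placed in {data_dir}")
```
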
## Acknowledgement
This project is not affiliated with Google LLC.