Add library name, pipeline tag and link paper

#12
by nielsr (HF Staff), opened
Files changed (1)
  1. README.md +8 -5
README.md CHANGED
@@ -6,9 +6,10 @@ tags:
  - MiniCPM
  - ModelBest
  - THUNLP
+ library_name: transformers
+ pipeline_tag: text-generation
  ---

-
  <div align="center">
  <h1>
  MiniCPM
@@ -17,10 +18,12 @@ tags:

  <p align="center">
  <a href="https://shengdinghu.notion.site/MiniCPM-c805a17c5c8046398914e47f0542095a?pvs=4" target="_blank">MiniCPM Technical Report (Chinese)</a><a href="https://shengdinghu.notion.site/MiniCPM-Unveiling-the-Potential-of-End-side-Large-Language-Models-d4d3a8c426424654a4e80e42a711cb20?pvs=4" target="_blank"> Technical Report (English)</a> |
- <a href="https://github.com/OpenBMB/OmniLMM/" target="_blank">OmniLMM Multi-modal Model</a> |
- <a href="https://luca.cn/" target="_blank">CPM-C ~100B Model Trial</a>
+ <a href="https://github.com/OpenBMB/OmniLMM/\" target="_blank">OmniLMM Multi-modal Model</a> |
+ <a href="https://luca.cn/\" target="_blank">CPM-C ~100B Model Trial</a>
  </p>

+ The MiniCPM4 model, presented in the paper [MiniCPM4: Ultra-Efficient LLMs on End Devices](https://huggingface.co/papers/2506.07900), is a highly efficient large language model (LLM) designed explicitly for end-side devices.
+
  MiniCPM is a series of open-source end-side large language models jointly released by ModelBest and THUNLP (the Natural Language Processing Lab of Tsinghua University); the flagship language model MiniCPM-2B has only 2.4B non-embedding parameters.
  - After SFT, MiniCPM performs on par with Mistral-7B on public comprehensive benchmarks (with stronger Chinese, math, and coding ability) and overall surpasses models such as Llama2-13B, MPT-30B, and Falcon-40B.
  - After DPO, MiniCPM-2B also outperforms many representative open-source models, including Llama2-70B-Chat, Vicuna-33B, Mistral-7B-Instruct-v0.1, and Zephyr-7B-alpha, on MT-Bench, the benchmark currently closest to real user experience.
@@ -120,7 +123,7 @@ print(responds)
  * To use the model for commercial purposes, please contact [email protected] to obtain written authorization; free commercial use is also permitted after registration.

  * This repository is released under the [Apache-2.0](https://github.com/OpenBMB/MiniCPM/blob/main/LICENSE) License.
- * The usage of MiniCPM model weights must strictly follow [the General Model License (GML)](https://github.com/OpenBMB/General-Model-License/blob/main/%E9%80%9A%E7%94%A8%E6%A8%A1%E5%9E%8B%E8%AE%B8%E5%8F%AF%E5%8D%8F%E8%AE%AE-%E6%9D%A5%E6%BA%90%E8%AF%B4%E6%98%8E-%E5%AE%A3%E4%BC%A0%E9%99%90%E5%88%B6-%E5%95%86%E4%B8%9A%E6%8E%88%E6%9D%83.md).
+ * The usage of MiniCPM model weights must strictly follow [the General Model License (GML)](https://github.com/OpenBMB/General-Model-License/blob/main/%E9%80%9A%E7%94%A8%E6%A8%A1%E5%9E%8B%E8%AE%B8%E5%8F%AF%E5%8D%8F%E8%AE%AE-%E6%9D%A5%E6%BA%90%E8%AF%B4%E6%98%8E-%E5%AE%A3%E4%BC%A0%E9%99%90%E5%88%B6-%E5%95%86%E4%B8%9A%E6%8E%88%E6%9D%83.md).\
  * The models and weights of MiniCPM are completely free for academic research.
  * If you intend to utilize the model for commercial purposes, please reach out to [email protected] to obtain the certificate of authorization.

@@ -148,4 +151,4 @@ print(responds)
  booktitle={OpenBMB Blog},
  year={2024}
  }
- ```
+ ```
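
For context on what this change does on the Hub: `library_name: transformers` tells the model page to offer a Transformers loading snippet, and `pipeline_tag: text-generation` files the model under the text-generation task filter. Below is a minimal sketch of what such a snippet looks like; the repo id `openbmb/MiniCPM-2B-sft-bf16` is an assumption for illustration (substitute this repository's id), and `model.chat` is the helper defined in the repo's custom modeling code, consistent with the `print(responds)` call visible in the hunk context above.

```python
# Minimal sketch, assuming the repo id below; MiniCPM loads custom modeling
# code from the Hub, hence trust_remote_code=True.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "openbmb/MiniCPM-2B-sft-bf16"  # assumed repo id for illustration

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # bf16 checkpoint; use float32 on CPU
    trust_remote_code=True,
)

# model.chat() is provided by the repo's remote code, not core transformers;
# it returns the reply plus the running conversation history.
responds, history = model.chat(
    tokenizer,
    "Which is the highest mountain in Shandong province?",
    temperature=0.8,
    top_p=0.8,
)
print(responds)
```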