Update README.md
README.md CHANGED

@@ -39,7 +39,7 @@ The JPharmatron-7B is continually pre-trained using 8.8B tokens from Japanese an
 <!-- Provide the basic links for the model. -->
 
 - **Repository:** https://github.com/EQUES-Inc/pharma-LLM-eval
-- **Paper [optional]:** [A Japanese Language Model and Three New Evaluation Benchmarks for Pharmaceutical NLP](https://arxiv.org/abs/2505.16661)
+- **Paper [optional]:** [A Japanese Language Model and Three New Evaluation Benchmarks for Pharmaceutical NLP](https://arxiv.org/abs/2505.16661) (IJCNLP-AACL 2025)
 
 ## Uses
 
@@ -70,6 +70,8 @@ Compared to Meditron3-Qwen2.5-7B and Llama3.1-Swallow-8B-Instruct-v0.3, JPharmat
 
 <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
 
+**This paper has been accepted to IJCNLP-AACL 2025. We will update the bibtex info below soon.**
+
 **BibTeX:**
 
 ```