arXiv:2311.02945

PhoGPT: Generative Pre-training for Vietnamese

Published on Nov 6, 2023
Abstract

We open-source a state-of-the-art 7.5B-parameter generative model series named PhoGPT for Vietnamese, which includes the base pre-trained monolingual model PhoGPT-7B5 and its instruction-following variant, PhoGPT-7B5-Instruct. We also demonstrate its superior performance over previous open-source models through a human evaluation experiment. GitHub: https://github.com/VinAIResearch/PhoGPT
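The released checkpoints can be loaded with the Hugging Face transformers library. Below is a minimal sketch of running the instruction-following variant, assuming the model is published on the Hub as vinai/PhoGPT-7B5-Instruct and uses the "### Câu hỏi: ... ### Trả lời:" prompt template described in the GitHub repository; both the repo ID and the template are assumptions to verify against the repo.

```python
# Minimal sketch: loading and prompting PhoGPT-7B5-Instruct with transformers.
# The repo ID and the prompt template below are assumptions based on the
# project's GitHub page; check the repo README for the exact identifiers.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "vinai/PhoGPT-7B5-Instruct"  # assumed Hub repo ID

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # 7.5B parameters; bf16 roughly halves memory
    trust_remote_code=True,      # the model ships custom modeling code
)
model.eval()

# Assumed instruction format: "### Câu hỏi: {instruction}\n### Trả lời:"
prompt = "### Câu hỏi: Viết bài văn nghị luận xã hội về an toàn giao thông\n### Trả lời:"
inputs = tokenizer(prompt, return_tensors="pt")
with torch.no_grad():
    output = model.generate(
        **inputs,
        max_new_tokens=256,
        do_sample=True,
        temperature=0.7,
    )
print(tokenizer.decode(output[0], skip_special_tokens=True))
```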


Models citing this paper: 4


Spaces citing this paper: 9
