---
language:
- ko
base_model:
- EleutherAI/polyglot-ko-1.3b
pipeline_tag: text-generation
tags:
- ingredient
- chatbot
- review
- usage
license: apache-2.0
---

## 🐥 Base Model

**EleutherAI/polyglot-ko-1.3b**

This model is fine-tuned from Polyglot-Ko-1.3B, an open-source Korean language model released by EleutherAI.  
The base model was pre-trained on a large-scale Korean corpus and is designed for general-purpose Korean language understanding and generation tasks.

---

## Training Procedure

### Training Hyperparameters

The following hyperparameters were used during training:

- `output_dir`: `./qlora_model_eleutherai`  
- `per_device_train_batch_size`: `2`  
- `gradient_accumulation_steps`: `4`  
- `total_batch_size`: `8` (`per_device_train_batch_size` × `gradient_accumulation_steps` = 2 × 4)  
- `learning_rate`: `2e-5`  
- `num_train_epochs`: `2`  
- `fp16`: `True`  
- `logging_dir`: `./logs`  
- `logging_steps`: `5`  
- `save_steps`: `100`  
- `save_total_limit`: `1`  
- `load_best_model_at_end`: `True`  
- `metric_for_best_model`: `loss`
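
The names above mirror fields of `transformers.TrainingArguments`; in particular, the total batch size is not set directly but follows from the per-device batch size times the gradient accumulation steps. A minimal sketch (a plain dict standing in for the actual `TrainingArguments` object, which is an assumption about how the run was configured):

```python
# Hyperparameters as listed above; keys mirror transformers.TrainingArguments fields.
training_args = {
    "output_dir": "./qlora_model_eleutherai",
    "per_device_train_batch_size": 2,
    "gradient_accumulation_steps": 4,
    "learning_rate": 2e-5,
    "num_train_epochs": 2,
    "fp16": True,
    "logging_dir": "./logs",
    "logging_steps": 5,
    "save_steps": 100,
    "save_total_limit": 1,
    "load_best_model_at_end": True,
    "metric_for_best_model": "loss",
}

# Effective (total) batch size per optimizer step:
# gradients from 4 micro-batches of 2 are accumulated before each update.
total_batch_size = (
    training_args["per_device_train_batch_size"]
    * training_args["gradient_accumulation_steps"]
)
print(total_batch_size)  # → 8
```

Accumulating gradients this way keeps per-step GPU memory at the micro-batch size (2) while training with the statistics of a batch of 8, which is the usual motivation for QLoRA-style fine-tuning on limited hardware.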