xzuyn committed · Commit a0808e0 · verified · Parent(s): 84e56fc

Fixes these issues:
```
`rope_scaling`'s beta_fast field must be a float, got 32

`rope_scaling`'s beta_slow field must be a float, got 1

ValueError: GenerationConfig is invalid:
- `temperature`: `do_sample` is set to `False`. However, `temperature` is set to `0.6` -- this flag is only used in sample-based generation modes. You should set `do_sample=True` or unset `temperature`.
- `top_p`: `do_sample` is set to `False`. However, `top_p` is set to `0.95` -- this flag is only used in sample-based generation modes. You should set `do_sample=True` or unset `top_p`.
If you're using a pretrained model, note that some of these attributes may be set through the model's `generation_config.json` file.
```
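Both errors surface when the configs are parsed by transformers, so the fix can be sanity-checked by simply loading them again. A minimal sketch, assuming a placeholder repo id (`user/model`) rather than this model's actual path:

```
# Minimal sketch: confirm config.json and generation_config.json load cleanly.
# "user/model" is a placeholder repo id, not taken from this commit.
from transformers import AutoConfig, GenerationConfig

# Instantiating the config runs rope_scaling validation, which flags
# beta_fast / beta_slow when they are ints instead of floats.
config = AutoConfig.from_pretrained("user/model")

# Depending on the transformers version, validate() warns or raises the
# "GenerationConfig is invalid" ValueError when do_sample is False but
# temperature / top_p are still set.
gen_config = GenerationConfig.from_pretrained("user/model")
gen_config.validate()
```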

Files changed (2)
  1. config.json +2 -2
  2. generation_config.json +1 -0
config.json CHANGED
@@ -53,8 +53,8 @@
   "rms_norm_eps": 1e-06,
   "rope_scaling": {
     "attention_factor": 1.2079441541679836,
-    "beta_fast": 32,
-    "beta_slow": 1,
+    "beta_fast": 32.0,
+    "beta_slow": 1.0,
     "factor": 8.0,
     "original_max_position_embeddings": 8192,
     "rope_type": "yarn"
generation_config.json CHANGED
@@ -1,5 +1,6 @@
 {
   "_from_model_config": true,
+  "do_sample": true,
   "eos_token_id": [
     100265,
     100257