fix #5

opened by xzuyn
Fixes these issues:

```
`rope_scaling`'s beta_fast field must be a float, got 32
`rope_scaling`'s beta_slow field must be a float, got 1

ValueError: GenerationConfig is invalid:
- `temperature`: `do_sample` is set to `False`. However, `temperature` is set to `0.6` -- this flag is only used in sample-based generation modes. You should set `do_sample=True` or unset `temperature`.
- `top_p`: `do_sample` is set to `False`. However, `top_p` is set to `0.95` -- this flag is only used in sample-based generation modes. You should set `do_sample=True` or unset `top_p`.
If you're using a pretrained model, note that some of these attributes may be set through the model's `generation_config.json` file.
```
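For context, here is a minimal sketch of the kind of edits that resolve both errors. The field names (`rope_scaling`, `beta_fast`, `beta_slow`, `do_sample`, `temperature`, `top_p`) and values come from the error messages above; the dict layout is an assumption mirroring the usual transformers `config.json` / `generation_config.json` conventions, not the exact diff in this PR:

```python
# Values as reported by the validator in the errors above.
config = {
    "rope_scaling": {
        "beta_fast": 32,  # int -> rejected: "must be a float, got 32"
        "beta_slow": 1,   # int -> rejected: "must be a float, got 1"
    }
}

generation_config = {
    "do_sample": False,
    "temperature": 0.6,  # only used in sample-based generation modes
    "top_p": 0.95,       # only used in sample-based generation modes
}

# Fix 1: store the rope_scaling betas as floats, as the validator requires.
for key in ("beta_fast", "beta_slow"):
    config["rope_scaling"][key] = float(config["rope_scaling"][key])

# Fix 2: temperature/top_p are set, so enable sampling (the alternative the
# error message offers is to unset both keys and keep greedy decoding).
generation_config["do_sample"] = True

print(config["rope_scaling"], generation_config)
```

Either resolution of the GenerationConfig error (enabling sampling, or dropping `temperature`/`top_p`) silences the validator; which one is correct depends on the intended decoding behavior.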
Also, is the `bos_token` meant to be set? The pretrained model doesn't have one set, and these finetunes don't use one when tokenizing either.
natolambert changed pull request status to merged