GiulioZizzo committed
Commit 5832dfc · 1 Parent(s): c263be0

Make example script consistent

Signed-off-by: GiulioZizzo <[email protected]>

granite-3.3-8b-instruct-lora-adv-scoping/README.md CHANGED
@@ -65,7 +65,7 @@ from transformers import AutoTokenizer, AutoModelForCausalLM
 from peft import PeftModel
 
 BASE_NAME = "ibm-granite/granite-3.3-8b-instruct"
-LORA_NAME = "ibm-granite/granite-3.3-8b-instruct-lora-adv-scoping"
+LORA_NAME = "intrinsics/granite-3.3-8b-instruct-lora-adv-scoping" # LoRA download location. We assume the directory shown in the top level README.md example for the lib was followed.
 device=torch.device('cuda' if torch.cuda.is_available() else 'cpu')
 
 # Load model
@@ -112,4 +112,4 @@ Granite 3.3 8B Instruct LoRA maintains strong boundary adherence and improves re
 
 
 ## Contact
-Mariam Hakobyan-Sobirov, Andreas Wespi
+Mariam Hakobyan-Sobirov, Andreas Wespi
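
For context, the sketch below shows how the updated `LORA_NAME` is typically consumed by the surrounding example script: the Granite base model is loaded from the Hub and the adversarial-scoping LoRA adapter is attached with PEFT. The calls outside the lines visible in this diff (tokenizer construction, base model loading, and `PeftModel.from_pretrained`) are assumptions based on standard transformers/peft usage, not lines taken from this commit.

```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM
from peft import PeftModel

BASE_NAME = "ibm-granite/granite-3.3-8b-instruct"
# Local adapter directory, matching the path introduced in this commit.
LORA_NAME = "intrinsics/granite-3.3-8b-instruct-lora-adv-scoping"
device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')

# Load tokenizer and base model, then attach the LoRA adapter
# (assumed standard peft usage; not part of the visible diff).
tokenizer = AutoTokenizer.from_pretrained(BASE_NAME)
base_model = AutoModelForCausalLM.from_pretrained(BASE_NAME).to(device)
model = PeftModel.from_pretrained(base_model, LORA_NAME)
model.eval()
```

The only substantive change in this commit is the adapter path: it now points to a local `intrinsics/` checkout rather than the Hub repo ID, which keeps the example consistent with the top-level README's download layout.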