Splend1dchan committed · verified
Commit 7b22c9c · Parent(s): 6652ccd

Update README.md

Files changed (1): README.md (+8 −1)
README.md CHANGED
@@ -89,8 +89,14 @@ pip install flash-attn
 ```
 Then load the model in transformers:
 ```python
->>> from transformers import AutoTokenizer
+>>> from transformers import AutoModelForCausalLM, AutoTokenizer
 >>> tokenizer = AutoTokenizer.from_pretrained("MediaTek-Research/Breeze-7B-32k-Instruct-v1_0/")
+>>> model = AutoModelForCausalLM.from_pretrained(
+    "MediaTek-Research/Breeze-7B-Instruct-v0_1",
+    device_map="auto",
+    torch_dtype=torch.bfloat16,
+    attn_implementation="flash_attention_2"
+)
 >>> chat = [
 ... {"role": "user", "content": "你好,請問你可以完成什麼任務?"},
 ... {"role": "assistant", "content": "你好,我可以幫助您解決各種問題、提供資訊和協助您完成許多不同的任務。例如:回答技術問題、提供建議、翻譯文字、尋找資料或協助您安排行程等。請告訴我如何能幫助您。"},
@@ -105,6 +111,7 @@ Then load the model in transformers:
 ```
 
 
+
 ## Citation
 
 ```
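
For reference, a minimal sketch of the snippet as it reads after this change. Two things are assumed because they are not shown in the hunk: an `import torch` (the added lines use `torch.bfloat16`) and a final `apply_chat_template` call to turn the chat list into model inputs. Repository IDs are kept exactly as they appear in the diff.

```python
# Sketch of the README snippet after this change, with assumptions noted below.
import torch  # assumed: needed for torch.bfloat16, not shown in the hunk
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("MediaTek-Research/Breeze-7B-32k-Instruct-v1_0/")
model = AutoModelForCausalLM.from_pretrained(
    "MediaTek-Research/Breeze-7B-Instruct-v0_1",
    device_map="auto",                        # place layers on available devices automatically
    torch_dtype=torch.bfloat16,               # load weights in bfloat16
    attn_implementation="flash_attention_2",  # requires the flash-attn package installed above
)

chat = [
    {"role": "user", "content": "你好,請問你可以完成什麼任務?"},
    {"role": "assistant", "content": "你好,我可以幫助您解決各種問題、提供資訊和協助您完成許多不同的任務。例如:回答技術問題、提供建議、翻譯文字、尋找資料或協助您安排行程等。請告訴我如何能幫助您。"},
]

# Assumed follow-on step (the diff hunk cuts off after the chat list):
# apply the model's chat template and tokenize the conversation for generation.
inputs = tokenizer.apply_chat_template(
    chat, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
```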