Removed forward slashes from model id
When using from_pretrained, the model id should not end in a forward slash, such as "andrijdavid/Llama-3-2B-Base/", because the transformers library then interprets it as a path to a locally stored model. We must use "andrijdavid/Llama-3-2B-Base" without the trailing slash.
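The fix boils down to stripping the trailing slash from the model id. As a minimal sketch (the helper below is hypothetical and not part of this commit), one could normalize an id defensively before passing it to from_pretrained:

```python
# Hypothetical helper (not part of the commit): normalize a Hugging Face
# Hub model id so transformers does not mistake it for a local directory.
def normalize_model_id(model_id: str) -> str:
    """Strip any trailing slashes from a hub model id."""
    return model_id.rstrip("/")

# A trailing slash would make from_pretrained treat the id as a local path:
print(normalize_model_id("andrijdavid/Llama-3-2B-Base/"))
# → andrijdavid/Llama-3-2B-Base
```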
README.md
CHANGED
````diff
@@ -37,6 +37,6 @@ To use Llama3-2b, you can load the model using the Hugging Face Transformers lib
 ```python
 from transformers import AutoTokenizer, AutoModelForCausalLM
 
-tokenizer = AutoTokenizer.from_pretrained("andrijdavid/Llama-3-2B-Base/")
-model = AutoModelForCausalLM.from_pretrained("andrijdavid/Llama-3-2B-Base/")
+tokenizer = AutoTokenizer.from_pretrained("andrijdavid/Llama-3-2B-Base")
+model = AutoModelForCausalLM.from_pretrained("andrijdavid/Llama-3-2B-Base")
 ```
````