Update README.md
README.md (changed):
````diff
@@ -24,7 +24,7 @@ poetry install
 
 Load the model weights from HuggingFace:
 ```python
-from transformers import AutoModelForCausalLM
+from transformers import AutoModelForCausalLM, AutoTokenizer
 
 SCAR = AutoModelForCausalLM.from_pretrained(
     "AIML-TUDA/SCAR",
@@ -35,7 +35,7 @@ SCAR = AutoModelForCausalLM.from_pretrained(
 The model loaded model is based on LLama3-8B base. So we can use the tokenizer from it:
 
 ```python
-tokenizer =
+tokenizer = AutoTokenizer.from_pretrained(
     "meta-llama/Meta-Llama-3-8B", padding_side="left"
 )
 tokenizer.pad_token = tokenizer.eos_token
````
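For reference, the two snippets patched by this commit assemble into one script. This is a minimal sketch, not the authoritative README: it assumes `AIML-TUDA/SCAR` loads through the stock `AutoModelForCausalLM` API, that you have access to the gated `meta-llama/Meta-Llama-3-8B` repo, and it omits any `from_pretrained` arguments truncated out of the diff.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the SCAR model weights from the Hugging Face Hub.
# (Any further from_pretrained arguments are cut off in the diff above,
# so only the repo id shown there is used here.)
SCAR = AutoModelForCausalLM.from_pretrained(
    "AIML-TUDA/SCAR",
)

# SCAR is based on the Llama3-8B base model, so its tokenizer is reused.
# Left padding is the usual choice for batched decoder-only generation.
tokenizer = AutoTokenizer.from_pretrained(
    "meta-llama/Meta-Llama-3-8B", padding_side="left"
)

# Llama3 ships without a dedicated pad token; reuse EOS for padding,
# exactly as the README does.
tokenizer.pad_token = tokenizer.eos_token
```

Running this requires a Hugging Face login with access to the gated Llama3 repository, which is why no output is shown here.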