felfri committed (verified) · Commit de14ea6 · 1 Parent(s): 3d58c24

Update README.md

Files changed (1): README.md +2 -2
README.md CHANGED
@@ -24,7 +24,7 @@ poetry install
 
 Load the model weights from HuggingFace:
 ```python
-from transformers import AutoModelForCausalLM
+from transformers import AutoModelForCausalLM, AutoTokenizer
 
 SCAR = AutoModelForCausalLM.from_pretrained(
     "AIML-TUDA/SCAR",
@@ -35,7 +35,7 @@ SCAR = AutoModelForCausalLM.from_pretrained(
 The model loaded model is based on LLama3-8B base. So we can use the tokenizer from it:
 
 ```python
-tokenizer = transformers.AutoTokenizer.from_pretrained(
+tokenizer = AutoTokenizer.from_pretrained(
     "meta-llama/Meta-Llama-3-8B", padding_side="left"
 )
 tokenizer.pad_token = tokenizer.eos_token
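
Taken together, the patched README amounts to the following loading pattern. This is a minimal sketch, not part of the commit: `load_scar` is a hypothetical helper name, and `trust_remote_code=True` is an assumption based on the repo's custom model code (the actual README snippet may pass different arguments). The `transformers` import is deferred so the helper can be defined without the library installed; nothing downloads until the function is called.

```python
def load_scar(
    model_id: str = "AIML-TUDA/SCAR",
    tokenizer_id: str = "meta-llama/Meta-Llama-3-8B",
):
    """Load the SCAR model and the Llama-3-8B tokenizer it is based on.

    Downloads weights from the Hugging Face Hub on first call; access to the
    gated meta-llama repository (and a logged-in HF token) may be required.
    """
    # Deferred import: defining this helper does not require transformers.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    # Assumption: the repo ships custom model code, so remote code must be trusted.
    model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

    # Left padding and pad_token = eos_token, as in the README snippet.
    tokenizer = AutoTokenizer.from_pretrained(tokenizer_id, padding_side="left")
    tokenizer.pad_token = tokenizer.eos_token
    return model, tokenizer
```

Bundling the tokenizer setup with the model load avoids the mismatch the commit fixes: the old README referenced `transformers.AutoTokenizer` without importing the `transformers` module itself.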