jonsaadfalcon committed
Commit e291f68
Parent: 4f27f17

Update README.md

Files changed (1): README.md (+3, -3)
README.md CHANGED

@@ -17,7 +17,7 @@ Check out our [GitHub](https://github.com/HazyResearch/m2/tree/main) for instruc
 You can load this model using Hugging Face `AutoModel`:
 ```python
 from transformers import AutoModelForMaskedLM
-model = AutoModelForMaskedLM.from_pretrained("jonsaadfalcon/M2-BERT-32K-Retrieval-Encoder-V1", trust_remote_code=True)
+model = AutoModelForMaskedLM.from_pretrained("hazyresearch/M2-BERT-32K-Retrieval-Encoder-V1", trust_remote_code=True)
 ```
 
 This model uses the Hugging Face `bert-base-uncased` tokenizer:
@@ -34,7 +34,7 @@ from transformers import AutoTokenizer, AutoModelForMaskedLM
 
 max_seq_length = 32768
 testing_string = "Every morning, I make a cup of coffee to start my day."
-model = AutoModelForMaskedLM.from_pretrained("jonsaadfalcon/M2-BERT-32K-Retrieval-Encoder-V1", trust_remote_code=True)
+model = AutoModelForMaskedLM.from_pretrained("hazyresearch/M2-BERT-32K-Retrieval-Encoder-V1", trust_remote_code=True)
 
 tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased", model_max_length=max_seq_length)
 input_ids = tokenizer([testing_string], return_tensors="pt", padding="max_length", return_token_type_ids=False, truncation=True, max_length=max_seq_length)
@@ -49,7 +49,7 @@ This model requires `trust_remote_code=True` to be passed to the `from_pretraine
 
 ```python
 mlm = AutoModelForMaskedLM.from_pretrained(
-    "jonsaadfalcon/M2-BERT-32K-Retrieval-Encoder-V1",
+    "hazyresearch/M2-BERT-32K-Retrieval-Encoder-V1",
     trust_remote_code=True,
 )
 ```
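
Putting the changed lines together, here is a minimal end-to-end sketch of loading the model under its new `hazyresearch` namespace, assembled from the snippets in this diff. The `sentence_embeddings` output key is an assumption (not shown in this diff); check the repository's remote modeling code for the exact output format.

```python
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

max_seq_length = 32768
testing_string = "Every morning, I make a cup of coffee to start my day."

# Load from the new namespace; trust_remote_code=True is required because
# M2-BERT ships custom modeling code alongside the checkpoint.
model = AutoModelForMaskedLM.from_pretrained(
    "hazyresearch/M2-BERT-32K-Retrieval-Encoder-V1",
    trust_remote_code=True,
)

# The model reuses the stock bert-base-uncased tokenizer, padded to 32k tokens.
tokenizer = AutoTokenizer.from_pretrained(
    "bert-base-uncased", model_max_length=max_seq_length
)
input_ids = tokenizer(
    [testing_string],
    return_tensors="pt",
    padding="max_length",
    return_token_type_ids=False,
    truncation=True,
    max_length=max_seq_length,
)

with torch.no_grad():
    outputs = model(**input_ids)

# Assumption: the retrieval encoder's remote code exposes pooled embeddings
# under "sentence_embeddings"; verify against the repo's modeling code.
embeddings = outputs["sentence_embeddings"]
print(embeddings.shape)
```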