alycialee committed
Commit ecb4a4a
1 Parent(s): 457a689

rename 341m to 341M

Files changed (2):
  1. README.md +3 -3
  2. config.json +1 -1
README.md CHANGED
````diff
@@ -18,7 +18,7 @@ Check out our [GitHub](https://github.com/HazyResearch/m2/tree/main) for instruc
 You can load this model using Hugging Face `AutoModel`:
 ```python
 from transformers import AutoModelForMaskedLM
-mlm = AutoModelForMaskedLM.from_pretrained('alycialee/m2-bert-341m', trust_remote_code=True)
+mlm = AutoModelForMaskedLM.from_pretrained('alycialee/m2-bert-341M', trust_remote_code=True)
 ```
 
 This model uses the Hugging Face `bert-base-uncased tokenizer`:
@@ -32,7 +32,7 @@ You can use this model with a pipeline for masked language modeling:
 from transformers import AutoModelForMaskedLM, BertTokenizer, pipeline
 
 tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')
-mlm = AutoModelForMaskedLM.from_pretrained('alycialee/m2-bert-341m', trust_remote_code=True)
+mlm = AutoModelForMaskedLM.from_pretrained('alycialee/m2-bert-341M', trust_remote_code=True)
 
 unmasker = pipeline('fill-mask', model=mlm, tokenizer=tokenizer)
 unmasker('Every morning, I enjoy a cup of [MASK] to start my day.')
@@ -44,7 +44,7 @@ This model requires `trust_remote_code=True` to be passed to the `from_pretraine
 
 ```python
 mlm = AutoModelForMaskedLM.from_pretrained(
-    'alycialee/m2-bert-341m',
+    'alycialee/m2-bert-341M',
     trust_remote_code=True,
     revision='2d9dbaa',
 )
````
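For reference, a minimal usage sketch based on the updated README snippets above (the repo id and the pinned revision hash are taken from the diff; this assumes network access and an installed `transformers`):

```python
from transformers import AutoModelForMaskedLM, BertTokenizer, pipeline

# Load the renamed checkpoint; trust_remote_code=True is required because the
# model code is shipped inside the repo rather than in transformers itself.
mlm = AutoModelForMaskedLM.from_pretrained(
    'alycialee/m2-bert-341M',
    trust_remote_code=True,
    revision='2d9dbaa',  # revision hash as pinned in the README
)

# Standard bert-base-uncased tokenizer, as stated in the README.
tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')

# Fill-mask pipeline example from the README.
unmasker = pipeline('fill-mask', model=mlm, tokenizer=tokenizer)
print(unmasker('Every morning, I enjoy a cup of [MASK] to start my day.'))
```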
config.json CHANGED
```diff
@@ -1,5 +1,5 @@
 {
-  "_name_or_path": "alycialee/m2-bert-260m",
+  "_name_or_path": "alycialee/m2-bert-341M",
   "alibi_starting_size": 512,
   "architectures": [
     "BertForMaskedLM"
```