
Tags: Fill-Mask · Transformers · ONNX · English · bert · exbert · Inference Endpoints
chainyo committed
Commit 522ff9e
1 Parent(s): cec52bf

Update README.md

Files changed (1): README.md (+4, -2)
README.md CHANGED
@@ -10,16 +10,18 @@ datasets:
 
 # BERT base model (uncased)
 
+## Model description
+
 Pretrained model on English language using a masked language modeling (MLM) objective. It was introduced in
 [this paper](https://arxiv.org/abs/1810.04805) and first released in
 [this repository](https://github.com/google-research/bert). This model is uncased: it does not make a difference
 between english and English.
 
-# Original implementation
+## Original implementation
 
 Follow [this link](https://huggingface.co/bert-base-uncased) to see the original implementation.
 
-# How to use
+## How to use
 
 Download the model by cloning the repository via `git clone https://huggingface.co/OWG/bert-base-uncased`.
 
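The card's "How to use" section stops at the `git clone` step in this diff. As a rough sketch that is not part of this commit, the cloned ONNX export could be run for fill-mask along these lines; the `model.onnx` file name and the graph input names are assumptions about the repository's contents, and the tokenizer is taken from the original `bert-base-uncased` card.

```python
# Hedged sketch: run the cloned ONNX export for fill-mask with ONNX Runtime.
# Assumes the cloned folder contains a file named "model.onnx" (not confirmed by the diff).
import numpy as np
import onnxruntime as ort
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
session = ort.InferenceSession("bert-base-uncased/model.onnx")  # folder created by `git clone`

inputs = tokenizer("The goal of life is [MASK].", return_tensors="np")

# ONNX Runtime expects a dict of NumPy arrays keyed by the graph's input names;
# keep only the keys the exported graph actually declares.
graph_input_names = {i.name for i in session.get_inputs()}
ort_inputs = {k: v for k, v in inputs.items() if k in graph_input_names}

logits = session.run(None, ort_inputs)[0]  # (batch, seq_len, vocab_size)

# Decode the highest-scoring token at the [MASK] position.
mask_pos = int(np.where(inputs["input_ids"][0] == tokenizer.mask_token_id)[0][0])
predicted_id = int(logits[0, mask_pos].argmax())
print(tokenizer.decode([predicted_id]))
```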