---
license: apache-2.0
language:
- en
library_name: transformers
pipeline_tag: fill-mask
datasets:
- wikipedia
---
|
|
|
# BERT base model (uncased) |
|
|
|
Pretrained model on the English language using a masked language modeling (MLM) objective. It was introduced in
[this paper](https://arxiv.org/abs/1810.04805) and first released in
[this repository](https://github.com/google-research/bert). This model is uncased: it makes no distinction
between english and English.
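
"Uncased" means the tokenizer lowercases text (and strips accents) before tokenization, so `English` and `english` map to the same tokens. A minimal standard-library sketch of that normalization step (an illustration, not the actual `BertTokenizer` implementation):

```python
import unicodedata

def basic_uncase(text: str) -> str:
    """Lowercase and strip accents, mirroring BERT's "uncased" preprocessing."""
    text = text.lower()
    # NFD decomposition separates base characters from combining accent marks,
    # which can then be dropped (Unicode category "Mn" = nonspacing mark).
    text = unicodedata.normalize("NFD", text)
    return "".join(ch for ch in text if unicodedata.category(ch) != "Mn")

print(basic_uncase("English"))  # -> "english"
print(basic_uncase("Café"))    # -> "cafe"
```

After this step, capitalization carries no information for the model, which is why the uncased checkpoint is a good default unless case is meaningful for your task (e.g. named-entity recognition).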
|
|
|
Disclaimer: The team releasing BERT did not write a model card for this model, so this model card has been written by
the Hugging Face team.