---
language:
  - en
license: mit
tags:
  - fill-mask
---

# MedBERT Model

MedBERT is a transformer-based language model pre-trained for biomedical named entity recognition. It is initialised from Bio_ClinicalBERT and further pre-trained on the N2C2, BioNLP, and CRAFT community datasets.

## How to use the model

Load the tokenizer and model via the `transformers` library:

```python
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("charangan/MedBERT")
model = AutoModel.from_pretrained("charangan/MedBERT")
```
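
Since the model card lists the `fill-mask` tag, the checkpoint can also be used for masked-token prediction. Below is a minimal sketch using the `fill-mask` pipeline; the example sentence is illustrative only and not taken from the original card.

```python
from transformers import pipeline

# Masked-token prediction with MedBERT (sketch; example text is hypothetical)
fill_mask = pipeline("fill-mask", model="charangan/MedBERT")
predictions = fill_mask("The patient was prescribed [MASK] for hypertension.")

# Each prediction contains the proposed token and its score
for p in predictions:
    print(p["token_str"], p["score"])
```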