---
language:
  - "en"
license: mit
tags:
  - fill-mask
---


# MedBERT Model

MedBERT is a transformer-based language model pre-trained for biomedical named entity recognition. It is initialised from Bio_ClinicalBERT and further pre-trained on the N2C2, BioNLP, and CRAFT community datasets.


## How to use

```python
from transformers import AutoTokenizer, AutoModel

# Load the tokenizer and encoder weights from the Hugging Face Hub
tokenizer = AutoTokenizer.from_pretrained("charangan/MedBERT")
model = AutoModel.from_pretrained("charangan/MedBERT")
```
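
Since the card is tagged `fill-mask`, a quick way to sanity-check the checkpoint is the `fill-mask` pipeline. The snippet below is a minimal sketch, assuming the checkpoint includes a masked-language-modelling head and uses the standard BERT `[MASK]` token; the example sentence is purely illustrative.

```python
from transformers import pipeline

# Minimal sketch: masked-token prediction with the fill-mask pipeline.
# Assumes the checkpoint ships an MLM head and uses the BERT [MASK] token.
unmasker = pipeline("fill-mask", model="charangan/MedBERT")

# Illustrative biomedical sentence; replace with your own text.
for prediction in unmasker("The patient was prescribed [MASK] for hypertension."):
    print(prediction["token_str"], round(prediction["score"], 4))
```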