We replicate the PubMedBERT model using the same data, hardware, and code as our new [BioVocabBERT](https://huggingface.co/osunlp/BioVocabBERT) model to ensure a fair comparison between the two.
Details about our pre-training procedure and downstream results can be found in our [BioNLP @ ACL 2023 paper](https://arxiv.org/abs/2306.17649).
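
For reference, here is a minimal sketch of loading the linked BioVocabBERT checkpoint with the Hugging Face `transformers` library; this snippet is not part of the original card, and it assumes the checkpoint is a BERT-style masked language model (the replicated PubMedBERT checkpoint, wherever it is published, would load the same way):

```python
# Minimal usage sketch (assumed, not from the original card):
# load the linked BioVocabBERT checkpoint as a masked language model.
from transformers import AutoTokenizer, AutoModelForMaskedLM

model_id = "osunlp/BioVocabBERT"  # repo ID taken from the link above
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# Fill in a masked biomedical token as a quick sanity check.
text = f"Aspirin is commonly used to treat {tokenizer.mask_token}."
inputs = tokenizer(text, return_tensors="pt")
outputs = model(**inputs)

# Locate the [MASK] position and decode the top prediction.
mask_index = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero()[0, 1]
predicted_id = int(outputs.logits[0, mask_index].argmax(dim=-1))
print(tokenizer.decode([predicted_id]))
```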
---
license: apache-2.0
---