BERT for Chemical Industry

A BERT-based language model further pre-trained from the SciBERT checkpoint. The training corpus combined over 40,000 technical documents from the chemical industry domain (including Safety Data Sheets and Product Information Documents) with 13,000 Wikipedia chemistry articles, covering a chemistry-specific vocabulary of 250,000+ tokens. The model was pre-trained with masked language modeling (MLM) on over 9.2 million paragraphs.

  • Masked language modeling (MLM): given a sentence, the model randomly masks 15% of the words in the input, runs the entire masked sentence through the model, and has to predict the masked words. This differs from traditional recurrent neural networks (RNNs), which usually see the words one after the other, and from autoregressive models such as GPT, which internally mask the future tokens. It allows the model to learn a bidirectional representation of the sentence.
```python
from transformers import pipeline

# Load the model and its tokenizer into a fill-mask pipeline
fill_mask = pipeline(
    "fill-mask",
    model="recobo/chemical-bert-uncased",
    tokenizer="recobo/chemical-bert-uncased",
)
fill_mask("we create [MASK]")
```
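The 15% masking step described above can be sketched in plain Python. This is a minimal illustration of the masking procedure only, not the actual Hugging Face data-collator implementation; the example sentence and the `mask_words` helper are hypothetical, while the `[MASK]` token and the 15% rate come from the description above.

```python
import random

MASK_TOKEN = "[MASK]"
MASK_RATE = 0.15  # BERT-style MLM masks 15% of the input tokens

def mask_words(words, rate=MASK_RATE, seed=0):
    """Replace ~15% of words with [MASK]; return the masked sequence
    and a mapping from masked positions to the original words
    (the prediction targets)."""
    rng = random.Random(seed)
    n = max(1, round(len(words) * rate))
    positions = rng.sample(range(len(words)), n)
    masked = list(words)
    targets = {}
    for i in positions:
        targets[i] = masked[i]
        masked[i] = MASK_TOKEN
    return masked, targets

sentence = "sodium chloride dissolves readily in water at room temperature".split()
masked, targets = mask_words(sentence)
print(" ".join(masked))
```

During pre-training, the model receives the masked sequence and is trained to predict the original word at each masked position; because the full (masked) sentence is visible at once, the learned representations are bidirectional.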
Model size: 110M parameters (Safetensors; tensor types I64, F32)