Mathematical Structure Aware BERT

A pretrained model based on bert-base-cased that has undergone further mathematical pre-training.

Compared to bert-base-cased, 300 additional mathematical LaTeX tokens were added to the tokenizer's vocabulary before the mathematical pre-training.
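
The model can be loaded like any BERT checkpoint via the Hugging Face transformers library. Below is a minimal sketch, assuming standard AutoTokenizer/AutoModel support; the example sentence and the specific LaTeX commands shown are illustrative only and may or may not correspond to the added mathematical tokens.

```python
# Minimal usage sketch, assuming the checkpoint follows the standard
# BERT encoder layout supported by AutoTokenizer/AutoModel.
from transformers import AutoTokenizer, AutoModel

model_name = "ddrg/math_structure_bert"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)

# Encode a sentence containing a LaTeX formula. Whether a given LaTeX
# command maps to one of the added mathematical tokens depends on the
# extended vocabulary (the command used here is an assumption).
text = "The derivative of x^2 is \\frac{d}{dx} x^2 = 2x."
inputs = tokenizer(text, return_tensors="pt")
outputs = model(**inputs)

print(tokenizer.tokenize(text))          # inspect how the LaTeX is tokenized
print(outputs.last_hidden_state.shape)   # (batch, sequence_length, hidden_size)
```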

