
K-12BERT model

K-12BERT is a model obtained by continued pretraining of BERT on the K-12Corpus. Since BERT-like models have shown great progress on domain-adaptive tasks, we noticed the lack of such a model for the education domain (especially K-12 education). To that end, we present K-12BERT, a BERT-based model trained on our custom curated dataset, extracted from both open and proprietary education resources.

The model was trained with an MLM objective in a continued-pretraining fashion, owing to the lack of resources to train it from the ground up. This also allowed us to save considerable computational resources and to leverage the existing knowledge of BERT. To that end, we also preserved the original BERT vocabulary in order to evaluate performance under those conditions.

Intended uses

We hope that the community, especially researchers and professionals engaged in the education domain, can use this model to advance AI in education. With its manifold uses for online education platforms, we hope to contribute towards advancing education resources for the upcoming generation.

Here is how to use this model to get the features of a given text in PyTorch:

```python
from transformers import BertTokenizer, BertModel, AutoTokenizer, AutoModelForMaskedLM

# Load the tokenizer and the base encoder from the Hub
tokenizer = BertTokenizer.from_pretrained('vasugoel/K-12BERT')  # or AutoTokenizer.from_pretrained('vasugoel/K-12BERT')
model = BertModel.from_pretrained('vasugoel/K-12BERT')  # or AutoModelForMaskedLM.from_pretrained('vasugoel/K-12BERT')

text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
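Since the model was pretrained with an MLM objective, its masked-language-modeling head can also be queried directly, for example via the `fill-mask` pipeline. A minimal sketch (the example sentence is arbitrary and not from the K-12Corpus):

```python
from transformers import pipeline

# Load K-12BERT with its MLM head (downloads weights from the Hub)
fill_mask = pipeline('fill-mask', model='vasugoel/K-12BERT')

# [MASK] is BERT's mask token; the pipeline returns the top candidate fills
predictions = fill_mask("Photosynthesis takes place in the [MASK] of plant cells.")
for p in predictions:
    print(p['token_str'], round(p['score'], 4))
```

Each prediction is a dict containing the candidate token (`token_str`), its probability (`score`), and the filled-in sequence.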

BibTeX entry and citation info

```bibtex
@misc{goel2022k12bert,
  doi = {10.48550/arXiv.2205.12335},
  url = {https://arxiv.org/abs/2205.12335},
  author = {Goel, Vasu and Sahnan, Dhruv and V, Venktesh and Sharma, Gaurav and Dwivedi, Deep and Mohania, Mukesh},
  keywords = {Computation and Language (cs.CL), Machine Learning (cs.LG), FOS: Computer and information sciences},
  title = {K-12BERT: BERT for K-12 education},
  publisher = {arXiv},
  year = {2022},
  copyright = {arXiv.org perpetual, non-exclusive license}
}
```
