Model Card for dappyx/QazDistilbertFast-tokenizer

A DistilBERT fast tokenizer trained on the KazQAD dataset for Kazakh.

Model Details

Model Description

  • Model type: DistilBERT (fast tokenizer)
  • Language(s) (NLP): Kazakh
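
The tokenizer can be loaded through the standard transformers API. Below is a minimal sketch, assuming the files are published on the Hugging Face Hub under the repository ID shown on this card; the example sentence is illustrative and not taken from KazQAD.

```python
from transformers import AutoTokenizer

# Load the fast tokenizer from the Hub (repository ID assumed from this card).
tokenizer = AutoTokenizer.from_pretrained("dappyx/QazDistilbertFast-tokenizer")

# Tokenize a Kazakh sentence ("Kazakhstan is located in Central Asia.").
encoding = tokenizer("Қазақстан Орталық Азияда орналасқан.")
print(encoding.tokens())      # subword tokens produced by the trained vocabulary
print(encoding["input_ids"])  # corresponding vocabulary indices
```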

Training Details

Training Data

KazQAD (Kazakh Open-Domain Question Answering Dataset): https://github.com/IS2AI/KazQAD/
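
The card does not spell out the training procedure. One plausible recipe, sketched below, is to retrain the base DistilBERT tokenizer's vocabulary on KazQAD passages with `train_new_from_iterator`. The base checkpoint, the local file name `kazqad.json`, and the SQuAD-style `data`/`paragraphs`/`context` layout are all assumptions, not confirmed by the KazQAD repository.

```python
import json

from transformers import AutoTokenizer

# Start from a base DistilBERT fast tokenizer (checkpoint choice assumed).
base = AutoTokenizer.from_pretrained("distilbert-base-uncased")

def kazqad_passages(path):
    # Yield passage texts from a local KazQAD file; a SQuAD-style
    # layout ("data" -> "paragraphs" -> "context") is assumed here.
    with open(path, encoding="utf-8") as f:
        data = json.load(f)
    for article in data["data"]:
        for paragraph in article["paragraphs"]:
            yield paragraph["context"]

# Learn a new subword vocabulary of the same size from Kazakh text.
new_tokenizer = base.train_new_from_iterator(
    kazqad_passages("kazqad.json"),  # hypothetical local copy of the dataset
    vocab_size=base.vocab_size,
)
new_tokenizer.save_pretrained("QazDistilbertFast-tokenizer")
```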

Environmental Impact

  • Hardware Type: TPU v2
  • Hours used: less than one minute
  • Cloud Provider: Google Colab