---
language:
- en
widget:
- text: uber for today
- text: airtime and data
- text: breakfast meeting with client
metrics:
- accuracy
pipeline_tag: text-classification
tags:
- finance
- text-classification
- business
---

### Model Description
This model is a fine-tuned version of distilbert-base-uncased from the Hugging Face Hub. It is trained to classify payment notes written by business owners into predefined expense categories.
DistilBERT is a transformers model, smaller and faster than BERT, which was pretrained on the same corpus in a self-supervised fashion, using the BERT base model as a teacher. This means it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of publicly available data) with an automatic process to generate inputs and labels from those texts using the BERT base model.
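At inference time, a text classifier like this one emits one logit per category, and the predicted category is the argmax after a softmax. A minimal sketch of that post-processing step, using only the standard library; the label names below are hypothetical placeholders for illustration, not the model's actual categories (those live in the fine-tuned model's `id2label` config):

```python
import math

# Hypothetical label set for illustration only; the real categories
# come from the fine-tuned model's config (id2label).
LABELS = ["transport", "utilities", "meals"]

def predict_label(logits):
    """Map raw classifier logits to (label, probability) via softmax + argmax."""
    # Numerically stable softmax: subtract the max logit before exponentiating.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    probs = [e / total for e in exps]
    best = max(range(len(probs)), key=probs.__getitem__)
    return LABELS[best], probs[best]

label, confidence = predict_label([2.1, 0.3, -1.0])
print(label, round(confidence, 3))  # transport 0.826
```

In practice the `transformers` `pipeline("text-classification", ...)` helper performs this step for you; the sketch just shows what happens under the hood.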
### Training results

| Epoch | Training Loss | Validation Loss | Accuracy |
|---|---|---|---|
| 0 | No log | 0.263793 | 0.916230 |
| 1 | No log | 0.185122 | 0.937173 |
| 2 | 0.318300 | 0.191695 | 0.937173 |
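The accuracy column above is plain classification accuracy: the fraction of validation examples whose predicted label matches the reference. A minimal sketch of that computation, using made-up labels purely for illustration:

```python
def accuracy(predictions, references):
    """Fraction of predictions that exactly match the reference labels."""
    correct = sum(p == r for p, r in zip(predictions, references))
    return correct / len(references)

# Toy example with made-up labels, just to show the metric.
preds = ["transport", "utilities", "meals", "meals"]
refs = ["transport", "utilities", "meals", "transport"]
print(accuracy(preds, refs))  # 0.75
```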
Check out the training code in this GitHub repo.
### Framework versions