---
datasets:
- albertvillanova/legal_contracts
---
# bert-tiny-finetuned-legal-contracts-longer
This model is a fine-tuned version of [google/bert_uncased_L-2_H-128_A-2](https://huggingface.co/google/bert_uncased_L-2_H-128_A-2) on a portion of the `albertvillanova/legal_contracts` dataset, trained for a larger number of epochs than the base variant.
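A minimal usage sketch with the `fill-mask` pipeline; the repository id below is a placeholder, since the card does not state the namespace the model is published under.

```python
from transformers import pipeline

# Placeholder id -- replace <namespace> with the actual owner of this model card.
fill_mask = pipeline(
    "fill-mask",
    model="<namespace>/bert-tiny-finetuned-legal-contracts-longer",
)
# Example prompt (illustrative only); BERT models use the [MASK] token.
print(fill_mask("This Agreement shall be governed by the laws of the [MASK] of Delaware."))
```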
# Note
The model was not trained on the whole dataset, which is around 9.5 GB, but only on:
## The first 10% of `train` + the last 10% of `train`
```python
from datasets import load_dataset

datasets_train = load_dataset('albertvillanova/legal_contracts', split='train[:10%]')
datasets_validation = load_dataset('albertvillanova/legal_contracts', split='train[-10%:]')
```
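For context, a hedged sketch of how these two slices could feed masked-language-model fine-tuning with the Hugging Face `Trainer`. The column name, sequence length, masking probability, and epoch count below are illustrative assumptions, not the values used to produce this checkpoint.

```python
from datasets import load_dataset
from transformers import (AutoTokenizer, AutoModelForMaskedLM,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)

checkpoint = "google/bert_uncased_L-2_H-128_A-2"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForMaskedLM.from_pretrained(checkpoint)

train_ds = load_dataset("albertvillanova/legal_contracts", split="train[:10%]")
eval_ds = load_dataset("albertvillanova/legal_contracts", split="train[-10%:]")

# Assumes the dataset exposes a "text" column with the raw contract text.
def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

train_ds = train_ds.map(tokenize, batched=True, remove_columns=train_ds.column_names)
eval_ds = eval_ds.map(tokenize, batched=True, remove_columns=eval_ds.column_names)

# Standard MLM collator: dynamically masks 15% of tokens in each batch.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="bert-tiny-finetuned-legal-contracts-longer",
        num_train_epochs=10,  # "longer" training; the exact epoch count is an assumption
    ),
    train_dataset=train_ds,
    eval_dataset=eval_ds,
    data_collator=collator,
)
trainer.train()
```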