---
license: apache-2.0
datasets:
- HuggingFaceTB/smollm-corpus
- mittagessen/oscar_subset
base_model:
- mittagessen/bytellama_random
---
This is a [ByteLlama](https://github.com/mittagessen/bytellama) 101M model pretrained for 2 epochs on the Cosmopedia v2 portion of the SmolLM corpus, followed by one additional epoch on a subset of OSCAR.