# GPT-2 124M trained on FineWeb-Edu 10B
A 124M-parameter GPT-2 model trained on the 10B-token sample of FineWeb-Edu. Training took nearly 20 hours on two NVIDIA A40 GPUs.
This model has been pushed to the Hub using the `PyTorchModelHubMixin` integration from `huggingface_hub` (see the loading sketch after this list):
- Library: [More Information Needed]
- Docs: [More Information Needed]
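Below is a minimal sketch of how a checkpoint saved this way can be reloaded. `PyTorchModelHubMixin` is the real `huggingface_hub` class; the `GPT` class body and the repo id `<user>/gpt2-124m-fineweb-edu` are placeholders, since the actual 124M architecture is defined in the training code:

```python
# Minimal sketch: loading a checkpoint saved via PyTorchModelHubMixin.
# The GPT class body and the repo id below are hypothetical placeholders;
# the real 124M GPT-2 architecture lives in the training repository.
import torch
import torch.nn as nn
from huggingface_hub import PyTorchModelHubMixin


class GPT(nn.Module, PyTorchModelHubMixin):
    def __init__(self, vocab_size: int = 50257, n_embd: int = 768):
        super().__init__()
        # ... the full GPT-2 stack (blocks, attention, LM head) goes here ...
        self.wte = nn.Embedding(vocab_size, n_embd)

    def forward(self, idx: torch.Tensor) -> torch.Tensor:
        # Stand-in for the real forward pass.
        return self.wte(idx)


# Mixing in PyTorchModelHubMixin gives the class from_pretrained (and
# push_to_hub): it downloads the config and weights from the Hub and
# instantiates GPT with the saved init kwargs.
model = GPT.from_pretrained("<user>/gpt2-124m-fineweb-edu")
model.eval()
```

Because the mixin serializes the JSON-serializable `__init__` kwargs to `config.json` at save time, `from_pretrained` can rebuild the module without extra glue code.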