GPT-2 124M trained on FineWeb-Edu 10B

A 124M-parameter GPT-2 model trained on the FineWeb-Edu 10B-token sample. Training took nearly 20 hours on two NVIDIA A40 GPUs.

This model has been pushed to the Hub using the PyTorchModelHubMixin integration:

  • Library: [More Information Needed]
  • Docs: [More Information Needed]
Safetensors weights: 124M parameters, F32 tensors.