The Ultra-Scale Playbook 🌌 The ultimate guide to training LLMs on large GPU clusters
🪐 SmolLM Collection A series of smol LLMs: 135M, 360M, and 1.7B. We release base and Instruct models as well as the training corpus and some WebGPU demos