Paper: Your Mixture-of-Experts LLM Is Secretly an Embedding Model For Free • arXiv:2410.10814 • Published Oct 14
Collection: Datasets for Pretrained Thai LLM • Datasets for pretraining Thai LLMs, curated by PyThaiNLP • 23 items • Updated Sep 12
Collection: Llama 3.2 • Hosts the Transformers-format and original repos of the Llama 3.2 and Llama Guard 3 models • 15 items • Updated Oct 24
Article: Illustrated LLM OS: An Implementational Perspective • By shivance • Dec 3, 2023
Article: Rank-Stabilized LoRA: Unlocking the Potential of LoRA Fine-Tuning • By damjan-k • Feb 20
Article: Llama-3.1-Storm-8B: Improved SLM with Self-Curation + Model Merging • By akjindal53244 • Aug 19
Article: Perspectives for first principles prompt engineering • By KnutJaegersberg • Aug 18
Paper: BAM! Just Like That: Simple and Efficient Parameter Upcycling for Mixture of Experts • arXiv:2408.08274 • Published Aug 15