Smaller Language Models Are Better Instruction Evolvers Paper • 2412.11231 • Published 7 days ago • 24
<10B Math Collection Sub-10-billion-parameter models that perform well at mathematical & logical reasoning in my tests. • 9 items • Updated Aug 8
Post 2085 They should make something like Google Colab, but with unlimited free access to a whole datacenter. That would be cool. Like if you agree. 5 replies · ❤️ 6 · ➕ 3
The Era of 1-bit LLMs: All Large Language Models are in 1.58 Bits Paper • 2402.17764 • Published Feb 27 • 603