OPT Collection OPT (Open Pretrained Transformer) is a suite of open-sourced large causal language models comparable in performance to GPT-3. • 12 items • Updated Nov 21 • 4
Sapiens Collection Foundation models for human tasks. Code: https://github.com/facebookresearch/sapiens • 72 items • Updated Sep 18 • 48
Chameleon Collection Repository for Meta Chameleon, a mixed-modal early-fusion foundation model from FAIR. • 2 items • Updated Jul 9 • 27
MelodyFlow Collection MelodyFlow: High Fidelity Text-Guided Music Generation and Editing via Single-Stage Flow Matching • 7 items • Updated Oct 23 • 16
LayerSkip Collection Models continually pretrained using LayerSkip - https://arxiv.org/abs/2404.16710 • 8 items • Updated Nov 21 • 45
NVLM 1.0 Collection A family of frontier-class multimodal large language models (LLMs) that achieve state-of-the-art results on vision-language tasks and text-only tasks. • 2 items • Updated 2 days ago • 49
Zebra Logic Bench Collection ZebraLogic Bench: Testing the Limits of LLMs in Logical Reasoning • 4 items • Updated 25 days ago • 4
AI2 Safety Toolkit Collection Safety data, moderation tools and safe LLMs. • 6 items • Updated 25 days ago • 3