RichardForests's Collections
Language Models
CV
RL
Diffusion models
3D/4D Gaussian Splatting
Multimodal
Mamba
NeRF
Transformers & MoE
(3D) Foundation Models
SSL
DL & Software Data Structures
Gemma & MoE
Dora
Flash Attention in Triton
Lora variations
Parameter Efficient - LLMs
Robotics - Cross Attention
LLM Agents OS
DMs - Lighting Conditions
Flash Attention in Triton
Updated Mar 19, 2024
mosaicml/mpt-7b-instruct
Text Generation • Updated Mar 5, 2024 • 7.41k downloads • 468 likes