Edit model card
Downloads last month
14
Inference API
Model is too large to load in Inference API (serverless). To try the model, launch it on Inference Endpoints (dedicated) instead.

Space using lamini/instruct-tuned-2.8b 1