Luminum-v0.1-123B, quantized to the q4f16_1 format with MLC-LLM.
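Weights in the q4f16_1-MLC format can be loaded directly with the MLC-LLM engine. Below is a minimal sketch using the MLC-LLM Python API, assuming a standard `mlc-llm` installation with a runtime matching your GPU; the `HF://` path is this repo's Hugging Face ID, and the weights are fetched and cached on first use.

```python
# Minimal sketch: running this q4f16_1 build with the MLC-LLM Python API.
# Assumes `pip install mlc-llm` plus a platform-appropriate TVM runtime;
# the HF:// path below points at this repo and is downloaded on first use.
from mlc_llm import MLCEngine

model = "HF://TNT3530/Luminum-v0.1-123B-q4f16_1-MLC"
engine = MLCEngine(model)

# Stream a chat completion through the OpenAI-style interface.
for response in engine.chat.completions.create(
    messages=[{"role": "user", "content": "Hello!"}],
    model=model,
    stream=True,
):
    for choice in response.choices:
        print(choice.delta.content or "", end="", flush=True)
print()

engine.terminate()
```

Note that a 123B model at 4-bit quantization still requires on the order of 70 GB of memory for the weights alone, so multi-GPU or high-memory hardware is assumed.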
Model: TNT3530/Luminum-v0.1-123B-q4f16_1-MLC
Base model: FluffyKaeloky/Luminum-v0.1-123B