nllb-200-distilled-600M-int8
This model is an int8-quantized version of facebook/nllb-200-distilled-600M.
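To illustrate what int8 quantization does, here is a minimal sketch using PyTorch dynamic quantization on a toy feed-forward block. This is not the exact procedure used to produce this checkpoint (which is not stated here); the layer sizes and helper names are illustrative only.

```python
import io
import torch
import torch.nn as nn

# Toy stand-in for one transformer feed-forward block; the real
# nllb-200-distilled-600M has ~600M parameters across many such layers.
model = nn.Sequential(nn.Linear(512, 2048), nn.ReLU(), nn.Linear(2048, 512))

# Dynamic int8 quantization: Linear weights are stored as int8 and
# dequantized on the fly; activations remain in floating point.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

x = torch.randn(1, 512)
y = quantized(x)
print(y.shape)  # torch.Size([1, 512])

def serialized_mb(m: nn.Module) -> float:
    """Approximate on-disk size of a module's state dict, in MB."""
    buf = io.BytesIO()
    torch.save(m.state_dict(), buf)
    return buf.getbuffer().nbytes / 1e6

# int8 weights take roughly a quarter of the float32 storage.
print(f"fp32: {serialized_mb(model):.1f} MB, int8: {serialized_mb(quantized):.1f} MB")
```

The same idea applied to the full 600M-parameter checkpoint is what shrinks its memory footprint by roughly 4x while keeping translation quality close to the float32 original.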
Inference Providers
This model is not currently available via any of the supported Inference Providers, and it cannot be deployed to the HF Inference API: the API does not support translation models from the fairseq library.
Model tree for Adeptschneider/nllb-200-distilled-600M-int8
- Base model: facebook/nllb-200-distilled-600M