# nllb-200-distilled-1.3B-ct2-int8

This is an int8-quantised CTranslate2 conversion of `facebook/nllb-200-distilled-1.3B`, used in nllb-api.

## Generation

The model was generated with the following command.

```shell
ct2-transformers-converter --model facebook/nllb-200-distilled-1.3B --quantization int8 --output_dir converted/nllb-200-distilled-1.3B-ct2-int8
```
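Once converted, the model directory can be loaded with CTranslate2 directly. The sketch below is a minimal example, assuming the output directory from the command above and the original Hugging Face tokenizer; the `translate` helper and language codes are illustrative, not part of this repository.

```python
import os

# Output directory produced by ct2-transformers-converter (assumption:
# the command above was run from the current working directory).
MODEL_DIR = "converted/nllb-200-distilled-1.3B-ct2-int8"


def translate(text: str, source_lang: str = "eng_Latn", target_lang: str = "fra_Latn") -> str:
    """Translate `text` with the converted model (hypothetical helper).

    Imports are deferred so the sketch can be read without ctranslate2
    or transformers installed.
    """
    import ctranslate2
    from transformers import AutoTokenizer

    # The converted model has no tokenizer; reuse the original one.
    tokenizer = AutoTokenizer.from_pretrained(
        "facebook/nllb-200-distilled-1.3B", src_lang=source_lang
    )
    translator = ctranslate2.Translator(MODEL_DIR, device="cpu")

    # CTranslate2 consumes token strings, not ids.
    tokens = tokenizer.convert_ids_to_tokens(tokenizer.encode(text))
    results = translator.translate_batch([tokens], target_prefix=[[target_lang]])

    # Drop the leading target-language token before decoding.
    target_tokens = results[0].hypotheses[0][1:]
    return tokenizer.decode(tokenizer.convert_tokens_to_ids(target_tokens))


if __name__ == "__main__":
    if os.path.isdir(MODEL_DIR):
        print(translate("Hello, world!"))
    else:
        print("model not converted yet; run ct2-transformers-converter first")
```

Inference runs on CPU here; passing `device="cuda"` to `ctranslate2.Translator` would use a GPU instead.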