MADLAD-400-10B-MT, converted to CTranslate2
The google/madlad400-10b-mt model, converted to the CTranslate2 format for fast inference.
Usage:
import ctranslate2
import torch
from huggingface_hub import snapshot_download
from sentencepiece import SentencePieceProcessor

# Download the converted model and load the SentencePiece tokenizer that ships with it
model_path = snapshot_download("santhosh/madlad400-3b-ct2")
tokenizer = SentencePieceProcessor()
tokenizer.load(f"{model_path}/sentencepiece.model")
translator = ctranslate2.Translator(model_path, device="cuda" if torch.cuda.is_available() else "cpu")

# The <2xx> prefix selects the target language (here Portuguese)
text = "<2pt> I love pizza!"
input_ids = tokenizer.encode(text, out_type=str)
results = translator.translate_batch(
    [input_ids],
    batch_type="tokens",
    beam_size=2,
    no_repeat_ngram_size=1,
)
tokenizer.decode(results[0].hypotheses[0])
# Eu adoro pizza!
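The `<2pt>` prefix in the example above is how MADLAD-400 selects the target language: a `<2xx>` token (ISO language code) is prepended to the source text before tokenization. A tiny sketch of building such inputs for several target languages; the helper name `with_target_lang` is ours, not part of the model or library:

```python
def with_target_lang(text: str, lang_code: str) -> str:
    """Prepend the MADLAD-400 target-language token, e.g. <2pt> for Portuguese."""
    return f"<2{lang_code}> {text}"

# Build inputs for a few target languages from the same source sentence
for code in ("pt", "de", "hi"):
    print(with_target_lang("I love pizza!", code))
# <2pt> I love pizza!
# <2de> I love pizza!
# <2hi> I love pizza!
```

Each prefixed string can then be encoded with the tokenizer and passed to `translate_batch` exactly as in the usage example.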