| Model description | |
|---|---|
| Model type | A 2B-parameter, GPT-like model fine-tuned on 100,000 samples, split evenly between English and German. |
| Language(s) | Bilingual: English and German. |
| License | Google Gemma Terms of Use |
| Finetuned from model | Samvardhan777/gemma-2b-mt-German-to-English |
| Training precision | bfloat16 |
| Training hardware | Free Google Colab |
Model card metadata: `license: mit`, `pipeline_tag: translation`
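
Since the card tags the model for translation but the base is a Gemma-style causal LM, the most likely way to use it is prompt-based generation with `transformers`. The sketch below is a minimal, unverified example: the repository id and the prompt template are placeholders (the card does not state the exact prompt format used during fine-tuning), and it assumes the checkpoint loads as a standard causal LM in bfloat16.

```python
# Minimal usage sketch -- assumptions, not confirmed by this card:
#   * the repo id below is a placeholder for this model's actual Hub id
#   * the checkpoint loads as a standard Gemma causal LM
#   * translation is done by prompting; the real prompt template may differ
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-username/gemma-2b-en-de-translation"  # hypothetical id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the stated training precision
    device_map="auto",           # place on a GPU if one is available
)

prompt = "Translate from English to German:\nHow is the weather today?\n"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)

# Decode only the generated continuation, dropping the prompt tokens.
generated = outputs[0][inputs["input_ids"].shape[-1]:]
print(tokenizer.decode(generated, skip_special_tokens=True))
```

If the checkpoint were instead exported as a seq2seq model, `pipeline("translation", model=...)` would be the simpler entry point; the prompt-based approach above is assumed here because Gemma is a decoder-only architecture.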