
DOLLOOM: Dolly 🐑 + BLOOMz 💮

Adapter Description

This adapter was created with the PEFT library. It fine-tunes the base model BigScience BLOOMz 7B1 on the Dolly dataset (translated to Spanish) using the LoRA method.
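The exact LoRA hyperparameters are not published in this card, so the configuration below is only a sketch: the values for `r`, `lora_alpha`, and `lora_dropout` are common defaults, not the ones actually used, and `query_key_value` is assumed as the target module because it is BLOOM's fused attention projection.

```python
from peft import LoraConfig, TaskType

# Hypothetical LoRA configuration -- the hyperparameters actually used for
# this adapter are not published; these are common defaults for BLOOM-family
# models, shown only to illustrate the PEFT + LoRA setup.
lora_config = LoraConfig(
    task_type=TaskType.CAUSAL_LM,        # causal language modeling
    r=16,                                # rank of the low-rank update (assumed)
    lora_alpha=32,                       # scaling factor (assumed)
    lora_dropout=0.05,                   # dropout on LoRA layers (assumed)
    target_modules=["query_key_value"],  # BLOOM's fused attention projection
)
```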

Model Description

Instruction-tuned version of BLOOMz 7B1 MT (BigScience Large Open-science Open-access Multilingual).

Training data

TBA

Supported Tasks and Leaderboards

TBA

Training procedure

TBA
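Since the training procedure is still to be announced, the following is only a sketch of how a LoRA adapter is typically trained with PEFT and the `transformers` Trainer. The base-model id, all hyperparameters, and the one-line toy dataset (standing in for the Spanish-translated Dolly data) are assumptions, not the author's actual setup.

```python
import torch
from datasets import Dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)
from peft import LoraConfig, TaskType, get_peft_model

BASE = "bigscience/bloomz-7b1-mt"  # assumed Hub id of the base model

tokenizer = AutoTokenizer.from_pretrained(BASE)
model = AutoModelForCausalLM.from_pretrained(
    BASE, torch_dtype=torch.float16, device_map="auto")

# Attach LoRA adapters; only these low-rank matrices are trained,
# the 7B base weights stay frozen. Hyperparameters are assumptions.
model = get_peft_model(model, LoraConfig(
    task_type=TaskType.CAUSAL_LM, r=16, lora_alpha=32,
    lora_dropout=0.05, target_modules=["query_key_value"]))
model.print_trainable_parameters()

# Toy stand-in for the Spanish-translated Dolly data (the real dataset is TBA).
records = ["Instrucción: Saluda.\nRespuesta: ¡Hola!"]
ds = Dataset.from_dict({"text": records}).map(
    lambda ex: tokenizer(ex["text"], truncation=True, max_length=512))

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="dolloom-lora",
                           per_device_train_batch_size=1,
                           num_train_epochs=1,
                           learning_rate=2e-4,
                           fp16=True),
    train_dataset=ds,
    # mlm=False gives causal-LM labels (inputs shifted by one).
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
model.save_pretrained("dolloom-lora")  # writes only the adapter weights
```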

How to use

TBA
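Until official usage instructions are added, a PEFT adapter like this one is normally loaded on top of its base model with `PeftModel.from_pretrained`. In the sketch below, the adapter id `mrm8488/dolloom` comes from the citation URL; the base-model id and the Spanish prompt template are assumptions.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

BASE = "bigscience/bloomz-7b1-mt"  # assumed base model (BLOOMz 7B1 MT)
ADAPTER = "mrm8488/dolloom"        # this adapter (id from the citation URL)

tokenizer = AutoTokenizer.from_pretrained(BASE)
model = AutoModelForCausalLM.from_pretrained(
    BASE, torch_dtype=torch.float16, device_map="auto")
model = PeftModel.from_pretrained(model, ADAPTER)  # attach the LoRA weights
model.eval()

# Prompt format is an assumption; the card does not specify a template.
prompt = "Instrucción: Explica qué es un modelo de lenguaje.\nRespuesta:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    out = model.generate(**inputs, max_new_tokens=128,
                         do_sample=True, temperature=0.7)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```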

Citation

@misc{manuel_romero_2023,
    author    = {Manuel Romero},
    title     = {dolloom (Revision 599b95a)},
    year      = {2023},
    url       = {https://huggingface.co/mrm8488/dolloom},
    doi       = {10.57967/hf/0540},
    publisher = {Hugging Face}
}