---
license: apache-2.0
language:
  - en
pipeline_tag: text-generation
library_name: transformers
tags:
  - alpaca
  - bloom
  - LLM
---

# AlpacOOM: Alpaca + BLOOM

## Adapter Description

This adapter was created with the PEFT library by fine-tuning the base model BigScience/BLOOM-7B1 on Stanford's Alpaca dataset using the LoRA method.
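
The exact training configuration is not listed on this card (see Training procedure below), but a minimal sketch of how a LoRA adapter for BLOOM-7B1 can be set up with PEFT looks like the following; the rank, alpha, and dropout values are illustrative assumptions, not the ones actually used for this adapter.

```python
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model, TaskType

# Load the base model that the LoRA adapter will be attached to.
base_model = AutoModelForCausalLM.from_pretrained("bigscience/bloom-7b1")

# Illustrative LoRA hyperparameters; the actual values for this adapter are TBA.
lora_config = LoraConfig(
    task_type=TaskType.CAUSAL_LM,
    r=8,                                 # rank of the low-rank update matrices
    lora_alpha=16,                       # scaling factor applied to the update
    lora_dropout=0.05,
    target_modules=["query_key_value"],  # BLOOM's fused attention projection
)

# Wrap the base model so that only the small LoRA matrices are trainable.
model = get_peft_model(base_model, lora_config)
model.print_trainable_parameters()
```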

## Model Description

BLOOM-7B1 is a 7.1-billion-parameter autoregressive language model created by the BigScience workshop. It was trained on the ROOTS corpus, which spans 46 natural languages and 13 programming languages, making it one of the largest openly released multilingual language models.

## Training data

Alpaca is a dataset of 52,000 instructions and demonstrations generated by OpenAI's text-davinci-003 engine. This instruction data can be used to conduct instruction tuning of language models, i.e., to make them follow instructions better.

The authors built on the data generation pipeline from the Self-Instruct framework and made the following modifications:

- The text-davinci-003 engine was used to generate the instruction data instead of davinci.
- A new prompt was written that explicitly gave the requirement of instruction generation to text-davinci-003.
- Much more aggressive batch decoding was used, i.e., generating 20 instructions at once, which significantly reduced the cost of data generation.
- The data generation pipeline was simplified by discarding the difference between classification and non-classification instructions.
- Only a single instance was generated for each instruction, instead of 2 to 3 instances as in Self-Instruct.

This produced an instruction-following dataset of 52K examples at a much lower cost (less than $500). In a preliminary study, the authors also found the 52K generated examples to be much more diverse than the data released by Self-Instruct.
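
For reference, the generated data can be inspected directly. A minimal sketch, assuming the commonly used Hub mirror `tatsu-lab/alpaca` (this card does not name a specific copy of the dataset):

```python
from datasets import load_dataset

# Repo id assumed; the Stanford release is mirrored on the Hub as "tatsu-lab/alpaca".
dataset = load_dataset("tatsu-lab/alpaca", split="train")

example = dataset[0]
print(example["instruction"])  # the generated instruction
print(example["input"])        # optional context; empty for many examples
print(example["output"])       # the single response generated per instruction
```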

## Supported Tasks and Leaderboards

The Alpaca dataset is designed for instruction tuning of pretrained language models.

## Training procedure

TBA

## How to use
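
A minimal inference sketch with transformers and PEFT. The adapter repo id (`mrm8488/Alpacoom`) is an assumption inferred from this card's path, and the prompt template follows the standard Alpaca format:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

BASE_MODEL = "bigscience/bloom-7b1"
ADAPTER = "mrm8488/Alpacoom"  # assumed repo id, inferred from this card's path

tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)
model = AutoModelForCausalLM.from_pretrained(
    BASE_MODEL, torch_dtype=torch.float16, device_map="auto"
)
# Attach the LoRA adapter on top of the frozen base model.
model = PeftModel.from_pretrained(model, ADAPTER)
model.eval()

# Standard Alpaca prompt template (instruction-only variant).
prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\nWhat is the capital of France?\n\n"
    "### Response:\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    output = model.generate(
        **inputs, max_new_tokens=128, do_sample=True, temperature=0.7
    )
print(tokenizer.decode(output[0], skip_special_tokens=True))
```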