---
inference: false
license: openrail
language:
- it
---
# ExtremITA Camoscio 7 billion parameters adapters: ExtremITLLaMA
This is ExtremITLLaMA, the set of adapters for the instruction-tuned Italian LLaMA model that participated in all the tasks of EVALITA 2023, winning 41% of the tasks and reaching the top three in 64% of them. It requires the base model from `sag-uniroma2/extremITA-Camoscio-7b`.
## Usage
Check out the GitHub repository for more insights and code: https://github.com/crux82/ExtremITA
```python
import torch
from peft import PeftModel
from transformers import LlamaTokenizer, LlamaForCausalLM

tokenizer = LlamaTokenizer.from_pretrained("yahma/llama-7b-hf")

# Load the base model in 8-bit to reduce memory usage
model = LlamaForCausalLM.from_pretrained(
    "sag-uniroma2/extremITA-Camoscio-7b",
    load_in_8bit=True,
    torch_dtype=torch.float16,
    device_map="auto",
)

# Apply the ExtremITLLaMA adapters on top of the base model
model = PeftModel.from_pretrained(
    model,
    "sag-uniroma2/extremITA-Camoscio-7b-adapters",
    torch_dtype=torch.float16,
    device_map="auto",
)
```
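Once the adapters are loaded, the model can be queried like any causal LM. Below is a minimal generation sketch; the Italian prompt is an illustrative placeholder, and the exact instruction template used for fine-tuning is documented in the GitHub repository above.

```python
# Minimal generation sketch. The prompt below is an illustrative
# placeholder; see the GitHub repository for the official
# instruction template used during fine-tuning.
prompt = "Scrivi una breve descrizione della città di Roma."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=128)

print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```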
## Citation
```bibtex
@inproceedings{hromei2023extremita,
  author    = {Claudiu Daniel Hromei and
               Danilo Croce and
               Valerio Basile and
               Roberto Basili},
  title     = {ExtremITA at EVALITA 2023: Multi-Task Sustainable Scaling to Large Language Models at its Extreme},
  booktitle = {Proceedings of the Eighth Evaluation Campaign of Natural Language
               Processing and Speech Tools for Italian. Final Workshop (EVALITA 2023)},
  publisher = {CEUR.org},
  year      = {2023},
  month     = {September},
  address   = {Parma, Italy}
}
```