---
tags:
- generated_from_trainer
- lora
datasets:
- samsum
metrics:
- rouge
model-index:
- name: flan-t5-xxl-lora-peft-samsum
  results:
  - task:
      name: Sequence-to-sequence Language Modeling
      type: text2text-generation
    dataset:
      name: samsum
      type: samsum
      config: samsum
      split: train
      args: samsum
    metrics:
    - name: Rouge1
      type: rouge
      value: 51.2543
library_name: adapter-transformers
---
# flan-t5-xxl-lora-peft-samsum

This model is a LoRA (PEFT) fine-tuned version of [google/flan-t5-xxl](https://huggingface.co/google/flan-t5-xxl) on the samsum dataset (cc-by-nc-nd-4.0).
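
## Usage

The adapter can be loaded on top of the base model with the `peft` library. Below is a minimal inference sketch; the hub id `flan-t5-xxl-lora-peft-samsum` is taken from the card name and may need the owning namespace prepended, and the dialogue, dtype, and generation settings are illustrative assumptions.

```python
import torch
from peft import PeftModel
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

base_model_id = "google/flan-t5-xxl"
adapter_id = "flan-t5-xxl-lora-peft-samsum"  # assumed repo id; prepend your namespace if needed

# Load tokenizer and base model, then attach the LoRA adapter weights.
tokenizer = AutoTokenizer.from_pretrained(base_model_id)
base_model = AutoModelForSeq2SeqLM.from_pretrained(
    base_model_id, torch_dtype=torch.float16, device_map="auto"
)
model = PeftModel.from_pretrained(base_model, adapter_id)

# Example dialogue in the samsum style (made up for illustration).
dialogue = (
    "Anna: Are we still meeting tomorrow?\n"
    "Tom: Yes, 10am at the office.\n"
    "Anna: Great, see you then."
)
inputs = tokenizer(dialogue, return_tensors="pt").to(base_model.device)
outputs = model.generate(**inputs, max_new_tokens=60)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```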

### Framework versions

- Transformers 4.27.2
- PyTorch 1.13.1+cu117
- Datasets 2.9.0
- PEFT (LoRA)
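
For reference, a LoRA adapter like this one is typically created by wrapping the base model with a `LoraConfig` from `peft`. The sketch below only illustrates the mechanism; the rank, alpha, dropout, and target modules are assumptions, not the values used to train this checkpoint.

```python
from peft import LoraConfig, TaskType, get_peft_model
from transformers import AutoModelForSeq2SeqLM

base = AutoModelForSeq2SeqLM.from_pretrained("google/flan-t5-xxl")

# Hypothetical LoRA settings for illustration only.
lora_config = LoraConfig(
    task_type=TaskType.SEQ_2_SEQ_LM,
    r=16,                        # rank of the low-rank update matrices (assumed)
    lora_alpha=32,               # scaling factor (assumed)
    lora_dropout=0.05,           # dropout on the LoRA layers (assumed)
    target_modules=["q", "v"],   # T5 attention projections commonly targeted (assumed)
)

# Only the injected LoRA weights are trainable; the 11B base model stays frozen.
model = get_peft_model(base, lora_config)
model.print_trainable_parameters()
```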