---
license: llama2
datasets:
- ACE05
- bc5cdr
- conll2003
- ncbi_disease
- conll2012_ontonotesv5
- rams
- tacred
- wnut_17
language:
- en
metrics:
- f1
pipeline_tag: text-generation
tags:
- code
- text-generation-inference
- Information Extraction
- IE
- Named Entity Recognition
- Event Extraction
- Relation Extraction
- LLaMA
---

<p align="center">
    <br>
    <img src="https://github.com/hitz-zentroa/GoLLIE/raw/main/assets/GoLLIE.png" style="height: 250px;">
    <br>
    <h2 align="center"><b>G</b>uideline f<b>o</b>llowing <b>L</b>arge <b>L</b>anguage Model for <b>I</b>nformation <b>E</b>xtraction</h2>
</p>


# Model Card for GoLLIE 34B


<p align="justify">
We present GoLLIE, a Large Language Model trained to follow annotation guidelines. GoLLIE outperforms previous approaches on zero-shot Information Extraction and allows the user to perform inferences with annotation schemas defined on the fly. Unlike previous approaches, GoLLIE is able to follow detailed definitions and does not rely solely on the knowledge already encoded in the LLM. Code and models are publicly available.

- 💻 Code: [https://github.com/osainz59/CoLLIE/](https://github.com/osainz59/CoLLIE/)
- 📒 Blog Post: [GoLLIE: Guideline-following Large Language Model for Information Extraction](docs/index.md)
- 📖 Paper: [GoLLIE: Annotation Guidelines improve Zero-Shot Information-Extraction]()
- GoLLIE Collection in the 🤗 HuggingFace Hub: [HiTZ/gollie](https://huggingface.co/collections/HiTZ/gollie-651bf19ee315e8a224aacc4f)
- 🚀 Example Jupyter Notebooks: [GoLLIE Notebooks](notebooks/)
</p>

<p align="center">
    <img src="https://github.com/hitz-zentroa/GoLLIE/raw/main/assets/zero_shot_results.png">
</p>


### Model Description

- **Developed by:** [Oscar Sainz](https://osainz59.github.io/), [Iker García-Ferrero](https://ikergarcia1996.github.io/Iker-Garcia-Ferrero/), [Rodrigo Agerri](https://ragerri.github.io/), [Oier Lopez de Lacalle](https://oierldl.github.io/), [German Rigau](https://adimen.si.ehu.es/~rigau/) and [Eneko Agirre](https://eagirre.github.io/)
- **Institution:** [HiTZ Basque Center for Language Technology](http://www.hitz.eus/) - [Ixa](https://www.ixa.eus/node/2?language=en), [University of the Basque Country UPV/EHU](https://www.ehu.eus/en/en-home)
- **Model type:** Text Generation
- **Language(s) (NLP):** English
- **License:** LLaMA2 License for the base and merged model. Apache 2.0 for pre-trained LoRA Adapters
- **Finetuned from model:** CODE-LLaMA2


## Schema definition and inference example

The labels are represented as Python classes, and the guidelines or instructions are introduced as docstrings. The model starts generating after the `result = [` line.

```python
# Entity definitions

@dataclass
class Launcher(Template):
    """Refers to a vehicle designed primarily to transport payloads from the Earth's
    surface to space. Launchers can carry various payloads, including satellites,
    crewed spacecraft, and cargo, into various orbits or even beyond Earth's orbit.
    They are usually multi-stage vehicles that use rocket engines for propulsion."""

    mention: str
    """
    The name of the launcher vehicle.
    Such as: "Saturn V", "Atlas V", "Soyuz", "Ariane 5"
    """
    space_company: str  # The company that operates the launcher. Such as: "Blue origin", "ESA", "Boeing", "ISRO", "Northrop Grumman", "Arianespace"
    crew: List[str]  # Names of the crew members boarding the Launcher. Such as: "Neil Armstrong", "Michael Collins", "Buzz Aldrin"


@dataclass
class Mission(Template):
    """Any planned or accomplished journey beyond Earth's atmosphere with specific objectives,
    either crewed or uncrewed. It includes missions to satellites, the International
    Space Station (ISS), other celestial bodies, and deep space."""

    mention: str
    """
    The name of the mission.
    Such as: "Apollo 11", "Artemis", "Mercury"
    """
    date: str  # The start date of the mission
    departure: str  # The place from which the vehicle will be launched. Such as: "Florida", "Houston", "French Guiana"
    destination: str  # The place or planet to which the launcher will be sent. Such as: "Moon", "low-orbit", "Saturn"


# This is the text to analyze
text = (
    "The Ares 3 mission to Mars is scheduled for 2032. The Starship rocket built by SpaceX will take off from Boca Chica, "
    "carrying the astronauts Max Rutherford, Elena Soto, and Jake Martinez."
)

# The annotation instances that take place in the text above are listed here
result = [
    Mission(mention='Ares 3', date='2032', departure='Boca Chica', destination='Mars'),
    Launcher(mention='Starship', space_company='SpaceX', crew=['Max Rutherford', 'Elena Soto', 'Jake Martinez'])
]
```
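
If you just want to see how the pieces above fit together, here is a minimal sketch (not the official GoLLIE API; `build_prompt` and `parse_result` are hypothetical helpers written for illustration) of how a schema, the input text, and the `result = [` trigger can be concatenated into a prompt, and how a well-formed completion can be parsed back into dataclass instances. The example notebooks provide the supported utilities for this.

```python
# Illustrative sketch only: the GoLLIE notebooks ship the supported
# prompt-building and parsing code. The helpers below are hypothetical.
import inspect

def build_prompt(schema_classes, text: str) -> str:
    """Concatenate the class definitions (guidelines as docstrings), the text
    to analyze, and the `result = [` line that the model will complete."""
    schema_code = "\n\n".join(inspect.getsource(cls) for cls in schema_classes)
    return (
        f"{schema_code}\n\n"
        f"# This is the text to analyze\n"
        f"text = {text!r}\n\n"
        f"# The annotation instances that take place in the text above are listed here\n"
        f"result = ["
    )

def parse_result(completion: str, schema_classes) -> list:
    """Evaluate the completed list with the schema classes in scope, assuming
    the completion is well formed and closes the list, e.g. 'Mission(...), Launcher(...)]'."""
    namespace = {cls.__name__: cls for cls in schema_classes}
    return eval("[" + completion, namespace)

# Example usage (assuming Launcher and Mission are defined as above):
# prompt = build_prompt([Launcher, Mission], text)
# annotations = parse_result(model_completion, [Launcher, Mission])
```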

## How to Get Started with the Model

Please read our [🚀 Example Jupyter Notebooks](https://github.com/hitz-zentroa/GoLLIE/tree/main/notebooks) to get started with GoLLIE.

The best way to load the model is using our custom `load_model` function. However, you can also load it using the `AutoModelForCausalLM` class.

**Important**: Our flash attention implementation has small numerical differences compared to the attention implementation in Hugging Face.
You must use the flag `trust_remote_code=True` or you will get inferior results. Flash attention requires an available CUDA GPU. Running GoLLIE
pre-trained models on a CPU is not supported. We plan to address this in future releases. First, install flash attention 2:
```bash
pip install flash-attn --no-build-isolation
pip install git+https://github.com/HazyResearch/flash-attention.git#subdirectory=csrc/rotary
```

Then you can load the model using:

```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("HiTZ/GoLLIE-34B")
model = AutoModelForCausalLM.from_pretrained("HiTZ/GoLLIE-34B", trust_remote_code=True, torch_dtype=torch.bfloat16)
model.to("cuda")
```
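
As a rough inference sketch (the exact prompt string and generation parameters below are illustrative assumptions, not prescribed values), you feed the model a prompt that ends at `result = [` and decode only the newly generated tokens:

```python
# Illustrative only: complete a schema-plus-text prompt that ends with "result = [".
prompt = "...schema definitions and text to analyze...\nresult = ["  # build as shown above

model_inputs = tokenizer(prompt, add_special_tokens=True, return_tensors="pt").to("cuda")
model_outputs = model.generate(
    **model_inputs,
    max_new_tokens=128,       # room for a short annotation list
    do_sample=False,          # greedy decoding; sampling is also possible
    pad_token_id=tokenizer.eos_token_id,
)
# Keep only the newly generated tokens and decode them.
completion = tokenizer.decode(
    model_outputs[0][model_inputs["input_ids"].shape[1]:],
    skip_special_tokens=True,
)
print("result = [" + completion)
```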

Read our [🚀 Example Jupyter Notebooks](https://github.com/hitz-zentroa/GoLLIE/tree/main/notebooks) to learn how to easily define guidelines, generate model inputs and parse the output!



### Training Data

This is the list of tasks used for training and evaluating GoLLIE. However, as demonstrated in the 🚀 [Create Custom Task notebook](https://github.com/hitz-zentroa/GoLLIE/blob/main/notebooks/Create%20Custom%20Task.ipynb), GoLLIE can perform a wide range of unseen tasks.
For more info, read our [📖Paper]().

<p align="center">
    <img src="https://github.com/hitz-zentroa/GoLLIE/raw/main/assets/datasets.png">
</p>


## Evaluation

| Model      | Supervised average F1 | Zero-shot average F1 | 🤗 HuggingFace Hub                                        |
|------------|:---------------------:|:--------------------:|:---------------------------------------------------------:|
| GoLLIE-7B  | 73.0                  | 55.3                 | [HiTZ/GoLLIE-7B](https://huggingface.co/HiTZ/GoLLIE-7B)   |
| GoLLIE-13B | 73.9                  | 56.0                 | [HiTZ/GoLLIE-13B](https://huggingface.co/HiTZ/GoLLIE-13B) |
| GoLLIE-34B | **75.0**              | **57.2**             | [HiTZ/GoLLIE-34B](https://huggingface.co/HiTZ/GoLLIE-34B) |


## Environmental Impact

| Model      | Hardware | FLOPs                      | Time (h) | CO<sub>2</sub>eq (kg) |
|------------|----------|----------------------------|----------|-----------------------|
| GoLLIE 7B  | 1xA100   | 11.9 × 10<sup>18</sup>     | 44.5     | 1.57                  |
| GoLLIE 13B | 1xA100   | 22.7 × 10<sup>18</sup>     | 79.5     | 2.80                  |
| GoLLIE 34B | 2xA100   | 55.8 × 10<sup>18</sup>     | 94.6     | 6.67                  |



## Citation