---
license: apache-2.0
datasets:
- Universal-NER/Pile-NER-type
- numind/NuNER
language:
- en
metrics:
- f1
library_name: transformers
pipeline_tag: text2text-generation
---

<p align="center"><h2 align="center">Rethinking Negative Instances for Generative Named Entity Recognition</h2></p>

# Model Card for GNER-T5-large

<!-- Provide a quick summary of what the model is/does. -->

We introduce GNER, a **G**enerative **N**amed **E**ntity **R**ecognition framework that demonstrates enhanced zero-shot capabilities across unseen entity domains. Experiments on two representative generative models, LLaMA and Flan-T5, show that integrating negative instances into training yields substantial performance gains. The resulting models, GNER-LLaMA and GNER-T5, outperform state-of-the-art (SoTA) approaches by a large margin, achieving improvements of 8 and 11 points in $F_1$ score, respectively. Code and models are publicly available.

* 💻 Code: [https://github.com/yyDing1/GNER/](https://github.com/yyDing1/GNER/)
* 📖 Paper: [Rethinking Negative Instances for Generative Named Entity Recognition](https://arxiv.org/abs/2402.16602)
* 💾 Models in the 🤗 HuggingFace Hub: [GNER-Models](https://huggingface.co/collections/dyyyyyyyy/gner-65dda2cb96c6e35c814dea56)
* 🧪 Reproduction Materials: [Reproduction Materials](https://drive.google.com/drive/folders/1m2FqDgItEbSoeUVo-i18AwMvBcNkZD46?usp=drive_link)
* 🎨 Example Jupyter Notebooks: [GNER Notebook](https://github.com/yyDing1/GNER/blob/main/notebook.ipynb)

<p align="center">
  <img src="https://github.com/yyDing1/GNER/raw/main/assets/zero_shot_results.png">
</p>

## Pretrained Models

We release five GNER models based on LLaMA (7B) and Flan-T5 (base, large, xl, and xxl).

| Model            | # Params | Zero-shot Average $F_1$ | Supervised Average $F_1$ | 🤗 HuggingFace<br />Download Link |
| ---------------- | -------: | :---------------------: | :----------------------: | :-------------------------------: |
| GNER-LLaMA       | 7B       | 66.1 | 86.09 | [link](https://huggingface.co/dyyyyyyyy/GNER-LLaMA-7B) |
| GNER-T5-base     | 248M     | 59.5 | 83.21 | [link](https://huggingface.co/dyyyyyyyy/GNER-T5-base) |
| GNER-T5-large    | 783M     | 63.5 | 85.45 | [link](https://huggingface.co/dyyyyyyyy/GNER-T5-large) |
| GNER-T5-large-v2 | 783M     | 63.5 | 85.45 | [link](https://huggingface.co/dyyyyyyyy/GNER-T5-large-v2) |
| GNER-T5-xl       | 3B       | 66.1 | 85.94 | [link](https://huggingface.co/dyyyyyyyy/GNER-T5-xl) |
| GNER-T5-xxl      | 11B      | 69.1 | 86.15 | [link](https://huggingface.co/dyyyyyyyy/GNER-T5-xxl) |

## Demo Usage

First, install the dependencies:

```bash
pip install torch datasets deepspeed accelerate transformers protobuf
```
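
Because the model is a standard `transformers` seq2seq checkpoint, it can also be loaded through the high-level `pipeline` API. The snippet below is only a minimal sketch (it is not part of the official repository); any prompt passed to it must still follow the GNER instruction format shown in the example further down.

```python
from transformers import pipeline

# Minimal sketch: load this card's checkpoint through the text2text-generation
# pipeline. Any repo id from the table above can be substituted here.
generator = pipeline("text2text-generation", model="dyyyyyyyy/GNER-T5-large")

# The prompt must follow the GNER instruction format shown in the example below:
# result = generator(instruction, max_new_tokens=640)
# print(result[0]["generated_text"])
```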

Please check out the [example Jupyter notebook](https://github.com/yyDing1/GNER/blob/main/notebook.ipynb) for guidance on using the GNER models.

A simple inference example with `GNER-T5` is shown below:

```python
>>> import torch
>>> from transformers import AutoTokenizer, AutoModelForSeq2SeqLM
>>> tokenizer = AutoTokenizer.from_pretrained("dyyyyyyyy/GNER-T5-xxl")
>>> model = AutoModelForSeq2SeqLM.from_pretrained("dyyyyyyyy/GNER-T5-xxl", torch_dtype=torch.bfloat16).cuda()
>>> model = model.eval()
>>> instruction_template = "Please analyze the sentence provided, identifying the type of entity for each word on a token-by-token basis.\nOutput format is: word_1(label_1), word_2(label_2), ...\nWe'll use the BIO-format to label the entities, where:\n1. B- (Begin) indicates the start of a named entity.\n2. I- (Inside) is used for words within a named entity but are not the first word.\n3. O (Outside) denotes words that are not part of a named entity.\n"
>>> sentence = "did george clooney make a musical in the 1980s"
>>> entity_labels = ["genre", "rating", "review", "plot", "song", "average ratings", "director", "character", "trailer", "year", "actor", "title"]
>>> instruction = f"{instruction_template}\nUse the specific entity tags: {', '.join(entity_labels)} and O.\nSentence: {sentence}"
>>> inputs = tokenizer(instruction, return_tensors="pt").to("cuda")
>>> outputs = model.generate(**inputs, max_new_tokens=640)
>>> response = tokenizer.decode(outputs[0], skip_special_tokens=True)
>>> print(response)
"did(O) george(B-actor) clooney(I-actor) make(O) a(O) musical(B-genre) in(O) the(O) 1980s(B-year)"
```
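
The model emits entities inline as `word(label)` pairs rather than structured spans. If you need the response as a list of (entity text, entity type) tuples, a minimal parsing sketch is shown below; it is only an illustration, not the evaluation code shipped with the GNER repository.

```python
import re

def extract_entities(response: str):
    """Illustrative parser (not the official GNER evaluation code): turn the
    generated "word(label)" string into (entity text, entity type) spans by
    merging consecutive B-/I- tokens of the same type."""
    pairs = re.findall(r"(\S+?)\(([^()]+)\)", response)  # e.g. [("george", "B-actor"), ...]
    entities, words, current = [], [], None
    for word, label in pairs:
        if label.startswith("I-") and words and label[2:] == current:
            words.append(word)              # continue the open entity
            continue
        if words:                           # anything else closes the open entity
            entities.append((" ".join(words), current))
            words, current = [], None
        if label.startswith("B-"):          # a new entity starts here
            words, current = [word], label[2:]
    if words:
        entities.append((" ".join(words), current))
    return entities

response = "did(O) george(B-actor) clooney(I-actor) make(O) a(O) musical(B-genre) in(O) the(O) 1980s(B-year)"
print(extract_entities(response))
# [('george clooney', 'actor'), ('musical', 'genre'), ('1980s', 'year')]
```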

## Citation

```bibtex
@misc{ding2024rethinking,
      title={Rethinking Negative Instances for Generative Named Entity Recognition},
      author={Yuyang Ding and Juntao Li and Pinzheng Wang and Zecheng Tang and Bowen Yan and Min Zhang},
      year={2024},
      eprint={2402.16602},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```