Update README.md
README.md
CHANGED
@@ -72,6 +72,23 @@ answer = tokenizer.decode(tokenizer.convert_tokens_to_ids(answer_tokens))
# output => democratized NLP
```

## Usage with HF `pipeline`

```python
from transformers import AutoTokenizer, AutoModelForQuestionAnswering, pipeline

ckpt = "mrm8488/longformer-base-4096-finetuned-squadv2"

# Load the fine-tuned checkpoint and its tokenizer
tokenizer = AutoTokenizer.from_pretrained(ckpt)
model = AutoModelForQuestionAnswering.from_pretrained(ckpt)

# Build a question-answering pipeline from the model and tokenizer
qa = pipeline("question-answering", model=model, tokenizer=tokenizer)

text = "Huggingface has democratized NLP. Huge thanks to Huggingface for this."
question = "What has Huggingface done?"

qa({"question": question, "context": text})
```
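The call returns the standard `question-answering` pipeline dict with `score`, `start`, `end`, and `answer` keys, so the predicted span can be read out directly. A minimal sketch reusing the objects defined above, where the expected answer matches the earlier example:

```python
# Read out the predicted answer span from the pipeline result
result = qa({"question": question, "context": text})
print(result["answer"])  # expected: "democratized NLP", as in the example above
```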

If, given the same context, we ask about something that is not there, the output for **no answer** will be ```<s>```
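For instance, a sketch reusing the `qa` pipeline above (the unanswerable question below is made up for illustration):

```python
# A question the context above cannot answer
result = qa({"question": "When was Huggingface founded?", "context": text})
print(result["answer"])  # per the note above, an unanswerable question yields the special token <s>
```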

> Created by [Manuel Romero/@mrm8488](https://twitter.com/mrm8488) | [LinkedIn](https://www.linkedin.com/in/manuel-romero-cs/)