jwieczorekhabana committed
Commit 98d2604 • 1 Parent(s): d0b8563
Update README.md
README.md CHANGED
````diff
@@ -21,7 +21,7 @@ This enables to specify:
 
 The model is instantiated the same way as in the Transformers library.
 The only difference is that there are a few new training arguments specific to HPUs.\
-
+It is strongly recommended to train this model using bf16 mixed-precision training for optimal performance and accuracy.
 
 [Here](https://github.com/huggingface/optimum-habana/blob/main/examples/question-answering/run_qa.py) is a question-answering example script to fine-tune a model on SQuAD. You can run it with DistilBERT with the following command:
 ```bash
````
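For context, here is a minimal sketch of what the recommendation in the added line amounts to in code: instantiating the model as with Transformers, then enabling the HPU-specific training arguments together with bf16 mixed precision. This is not part of the commit; it assumes the `optimum-habana` `GaudiTrainer` / `GaudiTrainingArguments` / `GaudiConfig` API, and the checkpoint and Gaudi-config names are illustrative assumptions.

```python
# Sketch only (not from the commit): model is created exactly as with Transformers,
# then HPU-specific arguments plus bf16 mixed precision are passed to the trainer.
from transformers import AutoModelForQuestionAnswering, AutoTokenizer
from optimum.habana import GaudiConfig, GaudiTrainer, GaudiTrainingArguments

model_name = "distilbert-base-uncased"  # placeholder checkpoint
model = AutoModelForQuestionAnswering.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)

# Gaudi configuration published on the Hub (assumed config name).
gaudi_config = GaudiConfig.from_pretrained("Habana/distilbert-base-uncased")

training_args = GaudiTrainingArguments(
    output_dir="./qa_output",
    use_habana=True,     # run training on HPU
    use_lazy_mode=True,  # Habana lazy execution mode
    bf16=True,           # bf16 mixed precision, as recommended in the README change
    per_device_train_batch_size=8,
    num_train_epochs=1,
)

trainer = GaudiTrainer(
    model=model,
    gaudi_config=gaudi_config,
    args=training_args,
    tokenizer=tokenizer,
    # train_dataset / eval_dataset: a tokenized SQuAD dataset, as prepared by run_qa.py
)
# trainer.train()
```

In the run_qa.py script referenced above, these same training arguments would typically be exposed as command-line flags; the exact command is the one truncated at the end of the diff.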