---
license: mit
language:
- en
---

## Model Summary

The language model Phi-1.5 is a Transformer with **1.3 billion** parameters. It was trained using the same data sources as [phi-1](https://huggingface.co/microsoft/phi-1), augmented with a new data source consisting of various synthetic NLP texts. When assessed against benchmarks testing common sense, language understanding, and logical reasoning, Phi-1.5 demonstrates nearly state-of-the-art performance among models with fewer than 10 billion parameters.

We've trained Microsoft Research's Phi-1.5, a 1.3B-parameter model, on chat datasets.
## How to Use

Phi-1.5 has been integrated into `transformers` as of version 4.37.0. If you are using a lower version, make sure to do the following:

* When loading the model, ensure that `trust_remote_code=True` is passed as an argument to the `from_pretrained()` function, as shown in the sketch below.
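
A minimal loading and generation sketch, assuming `transformers` and `torch` are installed. The repo id `microsoft/phi-1_5` is a placeholder for the base model; substitute this model's own repository id:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repo id (assumption) -- substitute this model's repository.
model_id = "microsoft/phi-1_5"

# On transformers < 4.37.0 the Phi architecture is not built in,
# so trust_remote_code=True is required when loading.
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float32,  # float32 keeps the sketch CPU-friendly
    trust_remote_code=True,
)
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)

# A quick generation call to confirm the model loads and runs.
inputs = tokenizer("Write a short story about a lighthouse keeper.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

On a CUDA machine, passing `torch_dtype="auto"` and moving the model with `.to("cuda")` is a common alternative.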
The current `transformers` version can be verified with: `pip list | grep transformers`.
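
Equivalently, a small sketch for checking the version from Python itself:

```python
# Print the installed transformers version without leaving Python.
import transformers

print(transformers.__version__)
```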