TKDKid1000 committed on
Commit
3037c36
1 Parent(s): c2fc08c

Update README.md

Files changed (1)
  1. README.md +17 -0
README.md CHANGED
@@ -10,6 +10,23 @@ tags:
  - nlp
  - code
  ---
+ # Phi-1.5 - GGUF
+ - Model creator: [Microsoft](https://huggingface.co/microsoft)
+ - Original model: [Phi 1.5](https://huggingface.co/microsoft/phi-1_5)
+
+ ## Description
+
+ This repo contains GGUF format model files for [Microsoft's Phi 1.5](https://huggingface.co/microsoft/phi-1_5).
+
+ ## Prompt template: Phi
+
+ ```
+ Instruct: {prompt}
+ Output:
+ ```
+
+ # Original model card: Microsoft's Phi 1.5
+
  ## Model Summary

  The language model Phi-1.5 is a Transformer with **1.3 billion** parameters. It was trained using the same data sources as [phi-1](https://huggingface.co/microsoft/phi-1), augmented with a new data source that consists of various NLP synthetic texts. When assessed against benchmarks testing common sense, language understanding, and logical reasoning, Phi-1.5 demonstrates a nearly state-of-the-art performance among models with less than 10 billion parameters.
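
For reference, a minimal sketch of loading one of the GGUF files from this repo with llama-cpp-python and prompting it using the Phi template added in this commit. The file name `phi-1_5.Q4_K_M.gguf` and the sampling settings are placeholders, not part of the commit; substitute whichever quantized file you actually download.

```python
from llama_cpp import Llama

# Load a quantized Phi-1.5 GGUF file (placeholder file name; use the
# quantization you downloaded from this repo).
llm = Llama(model_path="phi-1_5.Q4_K_M.gguf", n_ctx=2048)

# Build the prompt with the Phi template from the README.
prompt = "Instruct: Write a haiku about gradient descent.\nOutput:"

# Generate a completion; the stop sequence is a precaution against the
# model starting a new Instruct/Output turn on its own.
result = llm(prompt, max_tokens=128, stop=["Instruct:"])
print(result["choices"][0]["text"].strip())
```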