---
license: other
inference: false
language:
- en
pipeline_tag: text-generation
tags:
- transformers
- microsoft
- gguf
- imatrix
- phi-2
---
Quantizations of https://huggingface.co/microsoft/phi-2
# From the original readme
## How to Use
Phi-2 was integrated into `transformers` in version 4.37. If you are using an earlier version, you need to pass `trust_remote_code=True` to the `from_pretrained()` function.
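A minimal loading sketch using the `transformers` API (the dtype and `accelerate`-backed `device_map` are illustrative choices, not requirements from this README; these instructions target the original checkpoint rather than the GGUF files in this repository):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# trust_remote_code=True is only needed on transformers < 4.37,
# where the Phi-2 architecture is not yet bundled with the library.
model = AutoModelForCausalLM.from_pretrained(
    "microsoft/phi-2",
    torch_dtype=torch.float16,   # FP32 also works if you have the memory
    device_map="auto",           # requires the accelerate package
    trust_remote_code=True,
)
tokenizer = AutoTokenizer.from_pretrained("microsoft/phi-2", trust_remote_code=True)
```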
Phi-2 is known to have an attention overflow issue when run in FP16. If you run into it, enable/disable autocast on the [PhiAttention.forward()](https://huggingface.co/microsoft/phi-2/blob/main/modeling_phi.py#L306) function.
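One hedged workaround, shown here as a sketch rather than the exact patch to `PhiAttention.forward()`, is to disable autocast around generation (reusing `model` and `tokenizer` from the snippet above; assumes a CUDA device):

```python
import torch

# `model` and `tokenizer` come from the loading sketch above.
prompt = "Write a detailed analogy between mathematics and a lighthouse."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Running the forward pass with autocast disabled sidesteps the FP16
# overflow on some setups; the upstream suggestion is to toggle autocast
# inside PhiAttention.forward() itself.
with torch.autocast(device_type="cuda", enabled=False):
    output_ids = model.generate(**inputs, max_new_tokens=200)

print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```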
## Intended Uses
Given the nature of the training data, the Phi-2 model is best suited for prompts using the QA format, the chat format, and the code format.
### QA Format:
You can provide the prompt as a standalone question as follows:
```markdown
Write a detailed analogy between mathematics and a lighthouse.
```
where the model generates the text after ".".
To encourage the model to write more concise answers, you can also try the following QA format using "Instruct: \<prompt\>\nOutput:"
```markdown
Instruct: Write a detailed analogy between mathematics and a lighthouse.
Output: Mathematics is like a lighthouse. Just as a lighthouse guides ships safely to shore, mathematics provides a guiding light in the world of numbers and logic. It helps us navigate through complex problems and find solutions. Just as a lighthouse emits a steady beam of light, mathematics provides a consistent framework for reasoning and problem-solving. It illuminates the path to understanding and helps us make sense of the world around us.
```
where the model generates the text after "Output:".
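A short sketch applying this template programmatically (`model` and `tokenizer` are assumed to be loaded as in the snippet further up; `max_new_tokens` is an arbitrary choice):

```python
prompt = "Instruct: Write a detailed analogy between mathematics and a lighthouse.\nOutput:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=200)

# Decode only the newly generated tokens, i.e. the text after "Output:".
new_tokens = output_ids[0][inputs["input_ids"].shape[1]:]
print(tokenizer.decode(new_tokens, skip_special_tokens=True).strip())
```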
### Chat Format:
```markdown
Alice: I don't know why, I'm struggling to maintain focus while studying. Any suggestions?
Bob: Well, have you tried creating a study schedule and sticking to it?
Alice: Yes, I have, but it doesn't seem to help much.
Bob: Hmm, maybe you should try studying in a quiet environment, like the library.
Alice: ...
```
where the model generates the text after the first "Bob:".
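A sketch of the chat format in code (again assuming `model` and `tokenizer` from above); because the base model tends to keep writing further turns, the reply is cut at the next "Alice:":

```python
prompt = (
    "Alice: I don't know why, I'm struggling to maintain focus while studying. "
    "Any suggestions?\n"
    "Bob:"
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=100)

# Keep only Bob's first reply; the model may continue the whole dialogue.
reply = tokenizer.decode(output_ids[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True)
print(reply.split("Alice:")[0].strip())
```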
### Code Format:
```python
import math

def print_prime(n):
    """
    Print all primes between 1 and n
    """
    primes = []
    for num in range(2, n + 1):
        is_prime = True
        for i in range(2, int(math.sqrt(num)) + 1):
            if num % i == 0:
                is_prime = False
                break
        if is_prime:
            primes.append(num)
    print(primes)
```
where the model generates the text after the comments.
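An illustrative way to use this format (a sketch, with `model` and `tokenizer` as above) is to prompt with just the signature and docstring and let the model produce the body:

```python
prompt = (
    "def print_prime(n):\n"
    '    """\n'
    "    Print all primes between 1 and n\n"
    '    """\n'
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=150)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```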