---
base_model: []
library_name: transformers
tags:
  - mergekit
  - merge
---
# MidnightMiqu

## Midnight-Miqu-103B-v1.5-exl2-4.0bpw-rpcal

This is a 4.0bpw EXL2 quant of [FluffyKaeloky/Midnight-Miqu-103B-v1.5](https://huggingface.co/FluffyKaeloky/Midnight-Miqu-103B-v1.5).
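
The quant loads like any other EXL2 model with the exllamav2 Python library. A minimal sketch follows, assuming a local download path and arbitrary example sampler settings (neither comes from the model author):

```python
# Minimal exllamav2 loading sketch; the model path and sampler values are placeholders.
from exllamav2 import ExLlamaV2, ExLlamaV2Cache, ExLlamaV2Config, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

config = ExLlamaV2Config()
config.model_dir = "/models/Midnight-Miqu-103B-v1.5-exl2-4.0bpw-rpcal"  # local download path
config.prepare()

model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)  # lazy cache so load_autosplit can place layers across GPUs
model.load_autosplit(cache)

tokenizer = ExLlamaV2Tokenizer(config)
generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)

settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.8  # example values only, not a recommendation
settings.top_p = 0.9

print(generator.generate_simple("Once upon a time,", settings, num_tokens=128))
```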

The PIPPA dataset file used for calibration is optimised for roleplay. The measurement file is included in this repository's files if you want to make your own quants.
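
If you want a different bitrate, reusing the measurement file lets you skip the slow measurement pass of exllamav2's convert.py. A rough sketch under that assumption, with placeholder paths, run from an exllamav2 checkout:

```python
# Sketch: re-quantise the fp16 model at a different bitrate, reusing measurement.json.
# All paths are placeholders; -m points convert.py at an existing measurement file.
import subprocess

subprocess.run([
    "python", "convert.py",
    "-i", "/models/Midnight-Miqu-103B-v1.5",               # fp16 source model
    "-o", "/tmp/exl2-work",                                 # scratch / working directory
    "-cf", "/models/Midnight-Miqu-103B-v1.5-exl2-5.0bpw",   # compiled output directory
    "-b", "5.0",                                            # target bits per weight
    "-m", "measurement.json",                               # measurement file from this repo
], check=True)
```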

Details about the model and the merge can be found on the fp16 model page linked above.