
Midnight-Miqu-103B-v1.5-exl2-4.0bpw-rpcal

This is a 4.0bpw EXL2 quant of FluffyKaeloky/Midnight-Miqu-103B-v1.5.

The PIPPA dataset file used for calibration is optimised for roleplay. The measurement file is included in this repository's files in case you want to produce your own quants.
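For reference, quants like this are typically produced with the convert.py script from the exllamav2 repository, reusing the provided measurement file to skip the measurement pass. The sketch below is an assumption, not the exact command used here: the paths and file names (the fp16 model directory, measurement.json, the PIPPA parquet file) are placeholders, and flag spellings should be checked against your local exllamav2 version.

```shell
# Hypothetical EXL2 quantization sketch (verify flags against your exllamav2 checkout):
#   -i  : fp16 input model directory
#   -o  : working directory for intermediate files
#   -cf : output directory for the compiled quantized model
#   -b  : target bits per weight
#   -m  : reuse an existing measurement file instead of re-measuring
#   -c  : calibration dataset (here, the roleplay-optimised PIPPA parquet)
python convert.py \
    -i /models/Midnight-Miqu-103B-v1.5 \
    -o /tmp/exl2-work \
    -cf /models/Midnight-Miqu-103B-v1.5-exl2-4.0bpw \
    -b 4.0 \
    -m measurement.json \
    -c pippa.parquet
```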

Details about the model and its merge recipe can be found at the fp16 model link above.
