
DaringLotus-10.7B 5bpw EXL2

Description

EXL2 quant of BlueNipples/DaringLotus-10.7B

  • 6bpw should be comfortable on 12 GB of VRAM with 8k context
  • 4bpw might just fit on 8 GB of VRAM at 4k context
  • If you have more VRAM, get the 8bpw quant (see the loading sketch below)

Other quants:

EXL2: 8bpw, 6bpw, 5bpw, 4bpw
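
A minimal loading sketch, assuming the exllamav2 Python library; it is not part of the original card. The model directory path is a placeholder for wherever you download this quant, and max_seq_len should be lowered (e.g. to 4096) if you run short on VRAM.

# Minimal sketch: load an EXL2 quant with exllamav2 and generate a short completion.
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

config = ExLlamaV2Config()
config.model_dir = "DaringLotus-10.7B-5bpw-exl2"  # placeholder: local path to this quant
config.prepare()
config.max_seq_len = 8192  # reduce if you run out of VRAM

model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)
model.load_autosplit(cache)  # splits layers across available GPU memory while loading

tokenizer = ExLlamaV2Tokenizer(config)
generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)

settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.8
settings.top_p = 0.9

print(generator.generate_simple("Hello, my name is", settings, 64))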

Prompt Format

Alpaca:

I am not entirely certain of this, but I think Alpaca is the correct prompt format for this model.

Below is an instruction that describes a task. Write a response that appropriately completes the request.

### Instruction:
{prompt}

### Input:
{input}

### Response:
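
If you assemble prompts in code, a minimal sketch of filling the template above could look like the following; the helper name and example strings are illustrative and not part of the original card, and the Input block is simply omitted when there is no input.

# Minimal sketch: build an Alpaca-style prompt string for this model.
def build_alpaca_prompt(instruction: str, input_text: str = "") -> str:
    prompt = (
        "Below is an instruction that describes a task. "
        "Write a response that appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n"
    )
    if input_text:
        prompt += f"### Input:\n{input_text}\n\n"
    return prompt + "### Response:\n"

print(build_alpaca_prompt("Summarize the following passage.", "The quick brown fox..."))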

Contact

Kooten on Discord
