---
license: cc-by-nc-2.0
pipeline_tag: text-generation
inference: false
library_name: transformers
base_model: cognitivecomputations/laserxtral
tags:
- text-generation
---
These are SOTA 2- and 3-bit quants for laserxtral. Not much more to it. Meow.
The importance matrix, [generated from `group_10_merged.txt`](https://github.com/ggerganov/llama.cpp/discussions/5263#discussioncomment-8353685), is included in this repo as `imatrix_laserxtral.dat`.
***UPDATE 2/11/2024***: The models have been reuploaded with a new importance matrix (generated from `group_10_merged.txt` rather than `20k_random_data.txt`), which should in theory give better quality at these quant levels. I'm not an expert, so don't quote me on that.
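If you want to requantize from the full-precision model yourself, here is a hedged sketch of how an importance matrix is generated and applied with llama.cpp's `imatrix` and `quantize` tools. These are not the exact commands used for this repo, and all model file names below are assumptions; substitute your own paths.

```python
# Sketch: regenerate the importance matrix and requantize with llama.cpp's
# `imatrix` and `quantize` tools (early-2024 interface). File names are
# assumptions, not the exact artifacts in this repo.
import subprocess

# 1) Compute the importance matrix from the calibration text.
subprocess.run(
    [
        "./imatrix",
        "-m", "laserxtral.f16.gguf",   # full-precision source model (hypothetical name)
        "-f", "group_10_merged.txt",   # calibration data from the linked discussion
        "-o", "imatrix_laserxtral.dat",
    ],
    check=True,
)

# 2) Quantize, using the imatrix to weight the rounding error.
subprocess.run(
    [
        "./quantize",
        "--imatrix", "imatrix_laserxtral.dat",
        "laserxtral.f16.gguf",
        "laserxtral.IQ2_XS.gguf",
        "IQ2_XS",                      # one of the SOTA 2-bit quant types
    ],
    check=True,
)
```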
## System Prompt
Alpaca format:
```
### Instruction:
...
### Input:
...
### Response:
```
If you use LM Studio, this repo includes a `model_config.json` you can import with this prompt format pre-configured.
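Outside LM Studio, a minimal usage sketch with the `llama-cpp-python` bindings looks like this, assuming you've downloaded one of the quants (the GGUF file name below is hypothetical):

```python
# Minimal sketch: load a quant and prompt it in Alpaca format.
# Requires `pip install llama-cpp-python`; the file name is an assumption.
from llama_cpp import Llama

llm = Llama(model_path="laserxtral.IQ3_XXS.gguf", n_ctx=4096)

prompt = (
    "### Instruction:\n"
    "Explain what an importance matrix does during quantization.\n\n"
    "### Response:\n"
)

out = llm(prompt, max_tokens=256, stop=["### Instruction:"])
print(out["choices"][0]["text"])
```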