---
language:
- en
library_name: transformers
pipeline_tag: text-generation
tags:
- text-generation-inference
- merge
license: apache-2.0
---
Quant of [brucethemoose's](https://huggingface.co/brucethemoose) [Yi-34B-200K-DARE-merge-v5](https://huggingface.co/brucethemoose/Yi-34B-200K-DARE-merge-v5).

Fits into 24 GB of VRAM with 16k context on Windows.
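
Assuming this is an EXL2 quant (the pippa_cleaned calibration suggests an ExLlamaV2 conversion), a minimal loading sketch with the exllamav2 Python API might look like the following. The model path and sampler settings are placeholders, not part of this repo:

```python
# Minimal ExLlamaV2 loading sketch -- assumes an EXL2 quant and a local copy
# of this repo; the directory path and sampling settings are placeholders.
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

config = ExLlamaV2Config()
config.model_dir = "/path/to/Yi-34B-200K-DARE-merge-v5-quant"  # placeholder path
config.prepare()
config.max_seq_len = 16384  # 16k context, per the note above

model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)  # cache sized to max_seq_len
model.load_autosplit(cache)               # spread weights across available VRAM

tokenizer = ExLlamaV2Tokenizer(config)
generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)

settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.8
settings.top_p = 0.9

print(generator.generate_simple("Once upon a time,", settings, 200))
```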

pippa_cleaned was used for calibration, with an 8192 token length.
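
For reference, a calibration run like the one described above roughly corresponds to an ExLlamaV2 `convert.py` invocation along these lines. This is a hypothetical reconstruction: the paths and bitrate are placeholders, and only the calibration dataset and the 8192 token length come from this card.

```python
# Hypothetical reconstruction of the quantization step via exllamav2's convert.py.
# Only the calibration dataset and -l 8192 come from this card; everything else
# (paths, -b bitrate) is a placeholder.
import subprocess

subprocess.run(
    [
        "python", "convert.py",
        "-i", "/models/Yi-34B-200K-DARE-merge-v5",        # unquantized source model
        "-o", "/tmp/exl2-work",                           # scratch directory
        "-cf", "/models/Yi-34B-200K-DARE-merge-v5-exl2",  # finished quant output
        "-c", "pippa_cleaned.parquet",                    # calibration data (per this card)
        "-l", "8192",                                     # calibration row token length (per this card)
        "-b", "4.0",                                      # bits per weight -- placeholder
    ],
    check=True,
)
```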