---
license: cc-by-nc-4.0
datasets:
  - meta-math/MetaMathQA
language:
  - en
pipeline_tag: text-generation
tags:
  - Math
  - exl2
---

# Merged-AGI-7B

## EXL2 Quants

### Zipped Quantization (if you want to download a single file)

## Calibration Dataset

`wikitext-103-v1`
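
For reference, the calibration corpus can be pulled with the Hugging Face `datasets` library. This is only an illustration of fetching the same data, not the exact calibration script used to produce these quants:

```python
from datasets import load_dataset

# Load the corpus used for EXL2 calibration (wikitext-103-v1).
calib = load_dataset("wikitext", "wikitext-103-v1", split="test")
print(calib[0]["text"][:200])
```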

## Memory Usage

Measured with the `ExLlamav2_HF` loader at `max_seq_len = 4096` in Oobabooga's Text Generation WebUI.

| Branch | BPW | VRAM Usage | Description |
| --- | --- | --- | --- |
| 3.0bpw | 3.0 | 3.7 GB | For >=6 GB VRAM cards |
| 4.0bpw (main) | 4.0 | 4.4 GB | For >=6 GB VRAM cards |
| 6.0bpw | 6.0 | 6.1 GB | For >=8 GB VRAM cards |
| 8.0bpw | 8.0 | 7.7 GB | For >=10 GB VRAM cards |
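
Each quant lives on its own branch, so a specific BPW variant can be fetched by branch name. A minimal sketch using `huggingface_hub` (the repo id below is an assumption; substitute this repository's actual namespace/name):

```python
from huggingface_hub import snapshot_download

# NOTE: assumed repo id; adjust to the actual namespace of this repo.
repo_id = "hgloow/Merged-AGI-7B-EXL2"

# Each quant is stored on a separate branch, selected via `revision`.
path = snapshot_download(repo_id, revision="6.0bpw")
print(f"6.0bpw weights downloaded to: {path}")
```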

## Prompt template: ChatML

```
<|im_start|>system
{system_message}<|im_end|>
<|im_start|>user
{prompt}<|im_end|>
<|im_start|>assistant
```
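
If you are assembling prompts by hand rather than through a chat template, a small helper like the hypothetical `build_chatml_prompt` below fills in the placeholders:

```python
# Hypothetical helper (not part of this repo) that fills the
# ChatML template above with a system message and user prompt.
def build_chatml_prompt(system_message: str, prompt: str) -> str:
    return (
        f"<|im_start|>system\n{system_message}<|im_end|>\n"
        f"<|im_start|>user\n{prompt}<|im_end|>\n"
        "<|im_start|>assistant\n"
    )

print(build_chatml_prompt("You are a helpful assistant.", "Solve 12 * 13."))
```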

## Original Info

A slerp merge of Q-bert/MetaMath-Cybertron-Starling and fblgit/juanako-7b-UNA.
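
For illustration only, here is a minimal per-tensor slerp sketch in PyTorch. This is an assumption about the general mechanics of spherical linear interpolation, not the exact recipe or settings used for this merge:

```python
import torch

def slerp(t: float, a: torch.Tensor, b: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two weight tensors."""
    a_flat, b_flat = a.flatten().float(), b.flatten().float()
    # Angle between the two parameter vectors, treated as unit directions.
    cos_omega = torch.clamp(
        (a_flat / (a_flat.norm() + eps)) @ (b_flat / (b_flat.norm() + eps)),
        -1.0, 1.0,
    )
    omega = torch.acos(cos_omega)
    so = torch.sin(omega)
    if so.abs() < eps:
        # Nearly parallel vectors: fall back to plain linear interpolation.
        return ((1.0 - t) * a.float() + t * b.float()).to(a.dtype)
    out = (torch.sin((1.0 - t) * omega) / so) * a_flat \
        + (torch.sin(t * omega) / so) * b_flat
    return out.reshape(a.shape).to(a.dtype)

# Example: blend two weight tensors halfway (t = 0.5).
w = slerp(0.5, torch.randn(64, 64), torch.randn(64, 64))
```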

Use the ChatML prompt format shown above.

### Open LLM Leaderboard Evaluation Results

Detailed results: coming soon.

| Metric | Value |
| --- | --- |
| Avg. | Coming soon |
| ARC (25-shot) | Coming soon |
| HellaSwag (10-shot) | Coming soon |
| MMLU (5-shot) | Coming soon |
| TruthfulQA (0-shot) | Coming soon |
| Winogrande (5-shot) | Coming soon |
| GSM8K (5-shot) | Coming soon |