---
base_model: mlabonne/Beagle14-7B
inference: false
language:
  - en
license: apache-2.0
model_creator: mlabonne
model_name: Beagle14-7B
model_type: mistral
pipeline_tag: text-generation
prompt_template: |
  <|system|>
  </s>
  <|user|>
  {prompt}</s>
  <|assistant|>
tags:
  - merge
  - mergekit
  - lazymergekit
  - fblgit/UNA-TheBeagle-7b-v1
  - argilla/distilabeled-Marcoro14-7B-slerp
quantized_by: brittlewis12
---

# Beagle14-7B GGUF

**Original model**: [Beagle14-7B](https://huggingface.co/mlabonne/Beagle14-7B)

**Model creator**: [Maxime Labonne](https://huggingface.co/mlabonne)

This repo contains GGUF format model files for Maxime Labonne’s Beagle14-7B.

Beagle14-7B is a merge of the following models using LazyMergekit:

* [fblgit/UNA-TheBeagle-7b-v1](https://huggingface.co/fblgit/UNA-TheBeagle-7b-v1)
* [argilla/distilabeled-Marcoro14-7B-slerp](https://huggingface.co/argilla/distilabeled-Marcoro14-7B-slerp)

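To fetch a single quantized file from this repo programmatically, here is a minimal sketch using `huggingface_hub` (the exact filename is an assumption; check the repo's file list for the quant you want):

```python
# Minimal sketch: download one GGUF file from this repo.
from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="brittlewis12/Beagle14-7B-GGUF",
    filename="beagle14-7b.Q4_K_M.gguf",  # assumed filename; pick from the repo's Files tab
)
print(path)  # local cache path to the downloaded model
```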
## What is GGUF?

GGUF is a file format for representing AI models. It is the third version of the format, introduced by the llama.cpp team on August 21st, 2023. It is a replacement for GGML, which is no longer supported by llama.cpp.

Converted using llama.cpp build 1879 (revision 3e5ca79).
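Once downloaded, a GGUF file can be loaded with the `llama-cpp-python` bindings. A minimal sketch, where the filename and context size are assumptions:

```python
# Minimal sketch: load a GGUF file and run a short completion.
from llama_cpp import Llama

llm = Llama(model_path="beagle14-7b.Q4_K_M.gguf", n_ctx=2048)  # assumed filename
out = llm("The three laws of robotics are", max_tokens=32)
print(out["choices"][0]["text"])
```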

## Prompt template: Zephyr

Zephyr-style appears to work well!

```
<|system|>
{{system_message}}</s>
<|user|>
{{prompt}}</s>
<|assistant|>
```
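In code, the template is just a plain string. Here is a minimal sketch with `llama-cpp-python`; the helper name, filename, and sampling settings are assumptions:

```python
from llama_cpp import Llama

def zephyr_prompt(system_message: str, prompt: str) -> str:
    # Hypothetical helper: fills in the Zephyr template shown above.
    return (
        f"<|system|>\n{system_message}</s>\n"
        f"<|user|>\n{prompt}</s>\n"
        "<|assistant|>\n"
    )

llm = Llama(model_path="beagle14-7b.Q4_K_M.gguf", n_ctx=2048)  # assumed filename
out = llm(
    zephyr_prompt("You are a helpful assistant.", "What is a model merge?"),
    max_tokens=256,
    stop=["</s>"],  # stop at the template's end-of-turn token
)
print(out["choices"][0]["text"])
```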

## Download & run with cnvrs on iPhone, iPad, and Mac!

cnvrs is the best app for private, local AI on your device:

* create & save Characters with custom system prompts & temperature settings
* download and experiment with any GGUF model you can find on HuggingFace!
* make it your own with custom Theme colors
* powered by Metal ⚡️ & Llama.cpp, with haptics during response streaming!
* try it out yourself today, on TestFlight!

## Original Model Evaluations

The evaluation was performed by the model’s creator using LLM AutoEval on the Nous suite, as reported on mlabonne’s alternative leaderboard, YALL: Yet Another LLM Leaderboard.

| Model | AGIEval | GPT4All | TruthfulQA | Bigbench | Average |
|---|---|---|---|---|---|
| **Beagle14-7B** | 44.38 | 76.53 | 69.44 | 47.25 | 59.4 |
| OpenHermes-2.5-Mistral-7B | 42.75 | 72.99 | 52.99 | 40.94 | 52.42 |
| NeuralHermes-2.5-Mistral-7B | 43.67 | 73.24 | 55.37 | 41.76 | 53.51 |
| Nous-Hermes-2-SOLAR-10.7B | 47.79 | 74.69 | 55.92 | 44.84 | 55.81 |
| Marcoro14-7B-slerp | 44.66 | 76.24 | 64.15 | 45.64 | 57.67 |
| CatMarcoro14-7B-slerp | 45.21 | 75.91 | 63.81 | 47.31 | 58.06 |