---
license: agpl-3.0
datasets:
  - lemonilia/LimaRP
  - grimulkan/theory-of-mind
  - Epiculous/Gnosis
  - ChaoticNeutrals/Synthetic-RP
  - ChaoticNeutrals/Synthetic-Dark-RP
---

Mika (named after what my Claude-3 Opus chat called itself) is a model trained in a similar manner to Fett-uccine, with synthetic RP data created by Claude also included.

## Format

I've had the best results with the ChatML context template and the Mistral Instruct template; however, YMMV.
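
As a rough illustration, here is one way to load the GPTQ weights with `transformers` and build a ChatML-style prompt via the tokenizer's chat template. The repo id, system prompt, and generation settings below are assumptions for the sketch, not canonical values; adjust them to your setup and verify the tokenizer's built-in template matches the format you intend to use.

```python
# Minimal sketch, assuming the GPTQ weights load through transformers (optimum/auto-gptq installed).
# The repo id, system prompt, and sampling settings are illustrative assumptions.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Epiculous/Mika-7B-GPTQ"  # assumed repo id; substitute your own path if needed

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# ChatML-style conversation; apply_chat_template uses the template shipped with the
# tokenizer, so check that it produces the format you want before relying on it.
messages = [
    {"role": "system", "content": "You are Mika, a helpful roleplay partner."},
    {"role": "user", "content": "Hello! Who are you?"},
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(inputs, max_new_tokens=256, do_sample=True, temperature=0.8)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```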