---
license: cc-by-nc-nd-4.0
base_model:
- mistralai/Mixtral-8x7B-Instruct-v0.1
tags:
- dpo
- merge
- mergekit
---
|
# rizla been cooking while singing
|
This is an experimental model that I made by merging two 2-expert Mixtral variants. mergekit is the tool that lets me mix and match different models into one big model, keeping the smarts and skills of the originals. The base, mistralai/Mixtral-8x7B-Instruct-v0.1, is a large mixture-of-experts language model that can generate text for all kinds of tasks.
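
The exact merge recipe is not published, so the snippet below is only a minimal sketch of how a mergekit merge like this is typically run, assuming a plain linear merge; the second source model name is a placeholder, not the real one.

```python
# Sketch: write a mergekit recipe and run the mergekit-yaml CLI on it.
# Assumes `pip install mergekit`; the second model below is a placeholder.
import subprocess

CONFIG = """\
merge_method: linear
models:
  - model: mistralai/Mixtral-8x7B-Instruct-v0.1
    parameters:
      weight: 0.5
  - model: someuser/second-2-expert-mixtral   # placeholder for the other source model
    parameters:
      weight: 0.5
dtype: bfloat16
"""

with open("merge-config.yml", "w") as f:
    f.write(CONFIG)

# mergekit's CLI reads the YAML recipe and writes the merged model to the output directory.
subprocess.run(["mergekit-yaml", "merge-config.yml", "./merged-model"], check=True)
```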
|
|
|
The merged model has about 17 billion parameters and is meant to run on a minimum of 8 GB of RAM using the Q3_K_L GGUF quantization.
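
For reference, here is a minimal sketch of loading the quantized GGUF on CPU with llama-cpp-python; the file name is a placeholder for whichever Q3_K_L file you actually download from this repo.

```python
# Sketch: run the Q3_K_L GGUF with llama-cpp-python (`pip install llama-cpp-python`).
from llama_cpp import Llama

llm = Llama(
    model_path="rizla-merge.Q3_K_L.gguf",  # placeholder file name
    n_ctx=2048,   # modest context window to stay within ~8 GB of RAM
    n_threads=4,  # adjust to your CPU
)

out = llm("Write a short song about merging models.", max_tokens=128)
print(out["choices"][0]["text"])
```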
|
|
|
## Merge me baby one more time
|
### Sending this contraption out straight to mergeland, wwhheeeeeeeeeeeee LFG