ThenMagician committed on
Commit
047f96f
1 Parent(s): 76cb978

Upload 2 files

Files changed (2)
  1. README.md +17 -0
  2. model-00003-of-00003.safetensors +3 -0
README.md ADDED
@@ -0,0 +1,17 @@
+ # Arconte-13B
+
+ Arconte is a Llama-2 merge. Arconte has gone through many iterations, trying different recipes, models, and merge methods; in particular, iteration I and iteration Z both showed promise. This version of Arconte is iteration I redone with a more experimental approach to the merge recipe, and it shows great results.
+
+ Originally, the idea was to do one of those fancy recipes: DARE TIES model A, DARE TIES model B, then SLERP model A and model B together. I already did that SLERP (model C), but it's flawed because iterations I and Z were themselves flawed. I still plan to do model C, so now I am remaking iteration Z.
+
+ Models used:
+
+ [NeverSleep/X-NoroChronos-13B](https://huggingface.co/NeverSleep/X-NoroChronos-13B)
+
+ [Undi95/Emerhyst-13B](https://huggingface.co/Undi95/Emerhyst-13B)
+
+ [Henk717/echidna-tiefigther-25](https://huggingface.co/Henk717/echidna-tiefigther-25)
+
+ After completing model C, the current roadmap is to either move on to Mistral merges or try my hand at making LoRAs/QLoRAs. No Mixtral, nor anything above 13B parameters, in the future due to hardware limitations.
+
+ All testing was done with the Q5_K_M GGUF quant. I'll upload the full GGUF range along with an imatrix version soon.
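The DARE TIES / SLERP pipeline described in the README above could be expressed as a mergekit configuration. The sketch below is a minimal, hypothetical example rather than the actual Arconte recipe: the base model, densities, weights, and the pairing of the listed models are all assumptions.

```yaml
# Hypothetical DARE TIES step (not the actual Arconte recipe):
# merge two of the listed models against a common Llama-2-13B base.
models:
  - model: NeverSleep/X-NoroChronos-13B
    parameters:
      density: 0.5   # fraction of delta weights kept -- assumed value
      weight: 0.5    # relative contribution -- assumed value
  - model: Undi95/Emerhyst-13B
    parameters:
      density: 0.5
      weight: 0.5
merge_method: dare_ties
base_model: TheBloke/Llama-2-13B-fp16   # assumed base model
dtype: float16
```

With mergekit installed, a config like this would typically be run as `mergekit-yaml config.yml ./merged-model`; a second config using `merge_method: slerp` over the two intermediate DARE TIES merges would then give the "model C" described above.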
model-00003-of-00003.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:aa8bc69d4e13ebe4bf2b591bcfdcab34f73ef8f62642c666bacf22363ed82f4c
+ size 6163240512