---
library_name: transformers
license: apache-2.0
datasets:
- netcat420/MFANN
---

I am now basing all future releases of the MFANN experiment on Llama-3 as the base model; I may continue fine-tuning Mistral-7B every other release.

This model uses Meta's Llama-3 as its base, and benchmarks are pending.
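
Below is a minimal usage sketch with the transformers library, assuming the Hugging Face repo id is `netcat420/MFANNv0.6` (adjust to the actual repo id if it differs):

```python
# Minimal inference sketch; the repo id below is an assumption based on the model name.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "netcat420/MFANNv0.6"  # hypothetical repo id, adjust as needed
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",   # place weights on available GPU(s)/CPU
    torch_dtype="auto",  # use the dtype stored in the checkpoint
)

prompt = "Explain the difference between supervised and unsupervised learning."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```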


![image/png](https://cdn-uploads.huggingface.co/production/uploads/6435f27b2d0ed796668ffd8b/VlqyDezfgqoujwIdiNfYB.png)


Changed the model name to MFANNV0.6 due to a failed benchmark and the need to resubmit.

Edit: due to continued benchmark failures I am renaming the model back to MFANNv0.6. The 3B model is also failing benchmarks for some reason, even though both models run fine on my machine :(