---
tags:
- autotrain
- text-generation
base_model: ahxt/llama2_xs_460M_experimental
datasets:
- KnutJaegersberg/WizardLM_evol_instruct_V2_196k_instruct_format
widget:
- text: "Once upon a time,"
---

# ahxt's llama2_xs_460M_experimental trained on WizardLM's Evol Instruct dataset using AutoTrain

- Base model: [ahxt/llama2_xs_460M_experimental](https://huggingface.co/ahxt/llama2_xs_460M_experimental)
- Dataset: [KnutJaegersberg/WizardLM_evol_instruct_V2_196k_instruct_format](https://huggingface.co/datasets/KnutJaegersberg/WizardLM_evol_instruct_V2_196k_instruct_format)
- Training parameters: [training_params.json](https://huggingface.co/Felladrin/llama2_xs_460M_experimental_evol_instruct/blob/cc151c5669ea37c3ef972e375c74f2d9bfd92b49/training_params.json)

Note: Only the LoRA adapter is available for now. The merged model is on the way.
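
Since this repository currently ships only the LoRA adapter, one way to try it is to load the base model and attach the adapter with `peft`. A minimal sketch, assuming the adapter weights are hosted in this repository and that `transformers` and `peft` are installed:

```python
# Sketch: load the base model, attach this repo's LoRA adapter, and generate
# from the widget prompt. Repo IDs below are taken from this model card.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_model_id = "ahxt/llama2_xs_460M_experimental"
adapter_id = "Felladrin/llama2_xs_460M_experimental_evol_instruct"

tokenizer = AutoTokenizer.from_pretrained(base_model_id)
model = AutoModelForCausalLM.from_pretrained(base_model_id)
model = PeftModel.from_pretrained(model, adapter_id)  # attach the LoRA adapter

inputs = tokenizer("Once upon a time,", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```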