CyberNative committed
Commit de666a5 · 1 Parent(s): c4083fa

Update README.md

Files changed (1): README.md (+2 -2)
README.md CHANGED

```diff
@@ -5,9 +5,9 @@ license: llama2
 
 ## THIS IS A PLACEHOLDER, MODEL COMING SOON
 
-# CyberBase 8k - (llama-2-13b - lmsys/vicuna-13b-v1.5-16k)
+CyberBase 13b 8k *base model* - (llama-2-13b - lmsys/vicuna-13b-v1.5-16k)
 
-Base cybersecurity model for future fine-tuning; it is not recommended for use on its own.
+# Base cybersecurity model for future fine-tuning; it is not recommended for use on its own.
 - **CyberBase** is [lmsys/vicuna-13b-v1.5-16k](https://huggingface.co/lmsys/vicuna-13b-v1.5-16k) fine-tuned with QLoRA on [CyberNative/github_cybersecurity_READMEs](https://huggingface.co/datasets/CyberNative/github_cybersecurity_READMEs)
 - It may, therefore, inherit the [prompt template of FastChat](https://github.com/lm-sys/FastChat/blob/main/docs/vicuna_weights_version.md#prompt-template)
 - **sequence_len:** 8192
```
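Since the card notes the model may inherit FastChat's Vicuna prompt template, a minimal sketch of building such a prompt may help. This assumes the Vicuna v1.1/v1.5 conventions described in the linked FastChat docs (a one-sentence system message, `USER:`/`ASSISTANT:` role tags, and `</s>` closing each assistant turn); the model card itself does not confirm these details, and the helper name is hypothetical.

```python
# Hedged sketch of a Vicuna-v1.5-style prompt builder (assumed template;
# verify against the FastChat docs linked in the model card).

# Default Vicuna system message (an assumption based on FastChat's docs).
SYSTEM = (
    "A chat between a curious user and an artificial intelligence assistant. "
    "The assistant gives helpful, detailed, and polite answers to the "
    "user's questions."
)

def build_vicuna_prompt(turns):
    """Build a prompt from (user_msg, assistant_msg_or_None) pairs.

    A trailing None leaves the prompt open for the model to complete.
    """
    prompt = SYSTEM + " "
    for user_msg, assistant_msg in turns:
        # Each user turn is followed immediately by the ASSISTANT: tag.
        prompt += f"USER: {user_msg} ASSISTANT:"
        if assistant_msg is not None:
            # Completed assistant turns end with the </s> stop token.
            prompt += f" {assistant_msg}</s>"
    return prompt

if __name__ == "__main__":
    print(build_vicuna_prompt([("What is a CVE identifier?", None)]))
```

A single-turn call like the one above yields a prompt ending in `ASSISTANT:`, which is where generation would begin.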