---
library_name: peft
---
|
# MaralGPT Mistral 7B version 0.1 |
|
|
|
_MaralGPT_ is a _Large Language Model_ project launched in mid-2023 by Iranian entrepreneur and engineer [Muhammadreza Haghiri](https://haghiri75.com/en) to improve AI support for the Persian language; it is also part of the [Mann-E](https://manne.ir) project.
|
|
|
## Training procedure |
|
|
|
|
|
The following GPTQ quantization config was used during training:
|
- quant_method: gptq |
|
- bits: 4 |
|
- tokenizer: None |
|
- dataset: None |
|
- group_size: 128 |
|
- damp_percent: 0.1 |
|
- desc_act: True |
|
- sym: True |
|
- true_sequential: True |
|
- use_cuda_fp16: False |
|
- model_seqlen: None |
|
- block_name_to_quantize: None |
|
- module_name_preceding_first_block: None |
|
- batch_size: 1 |
|
- pad_token_id: None |
|
- disable_exllama: False |
|
- max_input_length: None |
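For reference, the non-default values above can be collected into a plain dict whose keys mirror the fields of `transformers.GPTQConfig`. This is a sketch for readers who want to reproduce the quantization setup; the commented-out `GPTQConfig` call is an assumption about how these values would be consumed, not something taken from this card:

```python
# GPTQ settings from the list above, with None-valued and
# default fields omitted. Key names mirror transformers.GPTQConfig.
gptq_settings = {
    "bits": 4,
    "group_size": 128,
    "damp_percent": 0.1,
    "desc_act": True,
    "sym": True,
    "true_sequential": True,
    "use_cuda_fp16": False,
    "batch_size": 1,
    "disable_exllama": False,
}

# With transformers installed, the same settings could be passed along
# when loading a base model (a sketch, not verified against this model):
#   from transformers import GPTQConfig
#   quantization_config = GPTQConfig(**gptq_settings)
```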
|
### Framework versions |
|
|
|
|
|
- PEFT 0.5.0 |
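A matching environment can be set up by pinning the PEFT version above; the additional packages (and their unpinned versions) are assumptions about what loading a GPTQ-quantized adapter typically requires, not versions stated by this card:

```shell
# Pin PEFT to the version used for training (from this card);
# transformers and auto-gptq versions are not specified here.
pip install peft==0.5.0
pip install transformers auto-gptq
```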
|
|