---
language:
- bn
- en
license: llama2
---

# Bangla LLaMA 7B Instruct v0.1 - GGUF Format (For use in LM Studio)

Welcome to the inaugural release of the Bangla LLaMA 7B instruct model, an important step in advancing LLMs for the Bangla language. This model is ready for immediate inference and is also primed for further fine-tuning to suit your specific NLP tasks.

## Model description

The Bangla LLaMA models build on the foundation of the original LLaMA-2, extended with an extensive Bangla vocabulary of 16,000 tokens.

- **Model type:** A 7B parameter GPT-like model fine-tuned on [Bangla-Alpaca-Orca](https://huggingface.co/datasets/BanglaLLM/Bangla-alpaca-orca) - a mix of the Bangla-translated [Stanford-Alpaca](https://huggingface.co/datasets/tatsu-lab/alpaca) dataset and a subset of the [OpenOrca](https://huggingface.co/datasets/Open-Orca/OpenOrca) dataset.
- **Language(s):** Bangla and English
- **License:** GNU General Public License v3.0
- **Finetuned from model:** [BanglaLLM/Bangla-llama-7b-base-v0.1](https://huggingface.co/BanglaLLM/Bangla-llama-7b-base-v0.1)
- **Training Precision:** `float16`
- **Code:** [GitHub](https://github.com/BanglaLLM/Bangla-llama)

## Prompting Format

**Prompt Template Without Input**

```
{system_prompt}

### Instruction:
{instruction or query}

### Response:
{response}
```

**Prompt Template With Input**

```
{system_prompt}

### Instruction:
{instruction or query}

### Input:
{input}

### Response:
{response}
```
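The two templates above can be assembled programmatically before being sent to the model. Below is a minimal sketch; the function name and the example system prompt are illustrative choices, not part of this release:

```python
from typing import Optional


def build_prompt(system_prompt: str, instruction: str,
                 input_text: Optional[str] = None) -> str:
    """Assemble a generation-time prompt following the templates above.

    The "### Input:" block is included only when input_text is given,
    which selects between the with-input and without-input variants.
    The prompt ends after "### Response:" so the model fills in the answer.
    """
    parts = [system_prompt, "", "### Instruction:", instruction, ""]
    if input_text:
        parts += ["### Input:", input_text, ""]
    parts += ["### Response:", ""]
    return "\n".join(parts)


# Without-input variant (system prompt and query are examples only)
prompt = build_prompt(
    "You are a helpful assistant that answers in Bangla.",
    "Translate 'Good morning' into Bangla.",
)
print(prompt)
```

The resulting string can be pasted into LM Studio's prompt template settings or passed to any llama.cpp-compatible runtime that loads the GGUF file.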

## Usage Note

These models have not undergone detoxification. While they possess impressive linguistic capabilities, they may generate content that could be deemed harmful or offensive. We urge users to exercise discretion and to supervise the model's outputs closely, especially in public or sensitive applications.

## Meet the Developers

Get to know the creators behind this model and follow their contributions to the field:

- [Abdullah Khan Zehady](https://www.linkedin.com/in/abdullah-khan-zehady-915ba024/)

## Citation

We hope this model serves as a valuable tool in your NLP toolkit and look forward to the advancements it will enable in the understanding and generation of Bangla.