---
license: gemma
language:
- or
- en
---

# odia-gemma-2b-base (Pre-trained)

Odia-Gemma-2B-Base is a pre-trained Odia large language model with 2 billion parameters, based on Google's Gemma-2B. The model is pre-trained on the CulturaX-Odia dataset, a filtered version of the original CulturaX dataset containing only Odia text. The training dataset contains 49 million tokens. The CulturaX-Odia dataset is sourced from mC4 and four distinct OSCAR corpora.

For more details about the model, data, training procedure, and evaluations, see the blog [post]().

## Model Description

* Model type: A 2B pre-trained decoder-only model
* Primary Language(s): Odia and English
* License: Gemma Terms of Use

**NOTE**

This is not an instruction-tuned model, so it may not follow human instructions without one-/few-shot prompting or instruction fine-tuning. The model has no moderation mechanisms and may generate harmful or inappropriate responses. It is recommended to first fine-tune it on the task(s) you are interested in.
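
### Usage

As a minimal sketch, the model can be loaded for text continuation with the Hugging Face `transformers` library. The repository id `OdiaGenAI/odia-gemma-2b-base` below is an assumption inferred from the model name, not confirmed by this card; substitute the actual id of this repository.

```python
# Minimal sketch: load the base model and continue an Odia prompt.
# NOTE: the repository id is assumed from the model name, not confirmed.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "OdiaGenAI/odia-gemma-2b-base"  # assumed repository id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # fall back to torch.float32 on hardware without bf16
    device_map="auto",
)

# As a base (non-instruction-tuned) model, it continues text
# rather than following instructions.
prompt = "ଓଡ଼ିଶାର ରାଜଧାନୀ"  # "The capital of Odisha" in Odia
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```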

### Citation Information

If you find this model useful, please consider giving 👏 and citing:

```bibtex
@misc{odia-gemma-2b-base,
  author = {Sambit Sekhar and Shantipriya Parida and Debasish Dhal},
  title = {OdiaGenAI Introduces Gemma 2B Pre-Trained LLM Catered to Odia Speakers},
  year = {2024},
  publisher = {Hugging Face},
  journal = {Hugging Face repository},
  howpublished = {\url{https://huggingface.co/OdiaGenAI}},
}
```

### Contributions

- Sambit Sekhar
- Shantipriya Parida
- Debasish Dhal