avemio-digital committed
Commit f7d4f88 (verified)
1 parent: 5a894c0

Update README.md

Files changed (1): README.md (+6 −21)
README.md CHANGED
@@ -27,22 +27,21 @@ tags:
 <img src="https://www.grag.ai/wp-content/uploads/2024/12/GRAG-ICON-TO-WORDLOGO-Animation_Loop-small-ezgif.com-video-to-gif-converter.gif" alt="GRAG Logo" width="400" style="margin-left:'auto' margin-right:'auto' display:'block'"/>


-# GRAG-PHI-3.5-MINI-4B-SFT-HESSIAN-AI
+# GRAG-PHI-3.5-MINI-4B-MERGED-HESSIAN-AI

 <!-- Provide a quick summary of what the model is/does. -->

 **GRAG** (**G**erman **R**etrieval **A**ugmented **G**eneration) models are designed for the German-speaking market, enabling innovation and AI solutions to drive German research collaboration in business-focused Generative AI by 2025

-Our GRAG-PHI-SFT model are trained on this **[GRAG-SFT](https://huggingface.co/datasets/avemio/GRAG-SFT-ShareGPT-HESSIAN-AI) dataset.**

 ## Model Details

 The core models released in this batch are the following:
 | Size | Training Tokens |
 |------|--------|
-| [GRAG-Phi-CPT](https://huggingface.co/avemio/GRAG-PHI-3.5-MINI-4B-CPT-HESSIAN-AI) | 507.47 million |
-| [GRAG-Phi-SFT](https://huggingface.co/avemio/GRAG-PHI-3.5-MINI-4B-SFT-HESSIAN-AI) | 2.03 billion |
-| [GRAG-Phi-ORPO](https://huggingface.co/avemio/GRAG-PHI-3.5-MINI-4B-ORPO-HESSIAN-AI) | 2.0577 billion |
+| [GRAG-PHI-CPT](https://huggingface.co/avemio/GRAG-PHI-3.5-MINI-4B-CPT-HESSIAN-AI) | 507.47 million |
+| [GRAG-PHI-SFT](https://huggingface.co/avemio/GRAG-PHI-3.5-MINI-4B-SFT-HESSIAN-AI) | 2.03 billion |
+| [GRAG-PHI-ORPO](https://huggingface.co/avemio/GRAG-PHI-3.5-MINI-4B-ORPO-HESSIAN-AI) | 2.0577 billion |

 ### Model Description

@@ -52,23 +51,9 @@ The core models released in this batch are the following:
 - **Supported by:** Hessian AI
 - **Model type:** a Transformer style autoregressive language model.
 - **Language(s) (NLP):** German, English
-- **License:** The code and model are released under Apache 2.0.
+- **License:** The code and model are released under MIT.
 - **Contact:** [grag@avemio.digital](mailto:grag@avemio.digital)

-
-### Model Sources
-
-<!-- Provide the basic links for the model. -->
-
-- **Project Page:**
-- **Repositories:**
-  - Training:
-  - Evaluation code:
-- **Technical blog post:**
-<!-- - **Press release:** TODO -->
-
-
-
 ## Merge Details
 ### Merge Method

@@ -113,7 +98,7 @@ Now, proceed as usual with HuggingFace:
 ```python
 from transformers import AutoModelForCausalLM, AutoTokenizer

-model_name = "avemio/GRAG-PHI-3.5-MINI-4B-SFT-HESSIAN-AI"
+model_name = "avemio/GRAG-PHI-3.5-MINI-4B-MERGED-HESSIAN-AI"

 model = AutoModelForCausalLM.from_pretrained(
     model_name,
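For context, the final hunk only swaps the repository id inside the README's existing `transformers` snippet, which the diff truncates mid-call. Read back in full, the updated loading code might look like the sketch below; the lazy import and the `torch_dtype`/`device_map` arguments are illustrative assumptions, not part of the commit:

```python
def load_grag(model_name: str = "avemio/GRAG-PHI-3.5-MINI-4B-MERGED-HESSIAN-AI"):
    """Load the merged GRAG Phi model and its tokenizer.

    Requires the `transformers` package; the first call downloads the
    checkpoint from the Hugging Face Hub, so network access is needed.
    """
    # Imported lazily so the module can be inspected without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model = AutoModelForCausalLM.from_pretrained(
        model_name,
        torch_dtype="auto",  # assumption: use the checkpoint's native precision
        device_map="auto",   # assumption: spread layers over available devices
    )
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    return model, tokenizer
```

The default argument mirrors the renamed repository id introduced by this commit, so existing call sites pick up the new model without further changes.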