danielpark committed · Commit 92bac44 · 1 parent: d4e0fa0

Update README.md
README.md
CHANGED
@@ -7,7 +7,7 @@ library_name: bitsandbytes, transformers, peft, accelerate, bitsandbytes, datase
 pipeline_tag: text-generation
 ---
 
-# The deployment of the multimodal LLM project for commercial purposes is prioritized, so the results and weights of GORANI project will no longer be updated.
+# The deployment of the multimodal LLM project for commercial purposes is prioritized, so the results and weights of GORANI project will no longer be updated.
 
 
 # GORANI 100k
@@ -22,7 +22,7 @@ KORANI is derived from GORANI, a project within llama2 that experiments with the
 - We are currently conducting experiments using various techniques such as max sequence length, rope scaling, attention sinks, and flash attention 2.
 - Please do not use the current model weights as they are not useful.
 The most stringent non-commercial use license (CC-BY-NC-4.0) among the licenses of the datasets used for training is also applied to the model weights.
-- On 2023-11-12, it was decided that all projects would be kept private.
+- On 2023-11-12, it was decided that all projects would be kept private. (It may be released in a non-public model format on cloud platforms by 2024.)
 
 <br>
 