7 |
- gemma
|
8 |
---
|
9 |
|
10 |
+
# Chat2Eco-180B-base
|
11 |
|
12 |
## Description
|
13 |
+
The Chat2Eco-180B-base model is a large-scale artificial intelligence model developed to assist in various natural language processing tasks. Trained on a diverse range of data sources, this model is designed to generate text, facilitate language understanding, and support various downstream tasks.
|
14 |
|
15 |
## Model Information
|
16 |
- **Model size**: 180 billion parameters
|
|
|
18 |
- **Training objective**: Language modeling with an emphasis on understanding and generating human-like text.
|
19 |
- **Tokenizer**: Gemma tokenizer
|
20 |
## Intended Use
|
21 |
+
The Chat2Eco-180B-base model is intended for researchers, developers, and practitioners in the field of natural language processing (NLP). It can be used for a variety of tasks, including but not limited to:
|
22 |
- Text generation
|
23 |
- Language understanding
|
24 |
- Text summarization
|
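For the text-generation use case above, a minimal loading sketch with the standard `transformers` causal-LM API might look like the following. The repo id and prompt are assumptions (the card does not state the Hub repo id), and a 180B-parameter model additionally needs multiple high-memory GPUs or quantization to actually load:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "Chat2Eco-180B-base"  # assumed Hub repo id; replace with the real one


def generate(prompt: str, max_new_tokens: int = 50) -> str:
    """Tokenize a prompt, sample a continuation, and decode it back to text."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    # device_map="auto" shards the weights across available devices (needs `accelerate`)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)
```

Because the base model is a plain language model (not instruction-tuned), prompts should be phrased as text to be continued rather than as chat-style instructions.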