---
tags:
- llm-rs
- ggml
pipeline_tag: text-generation
license: apache-2.0
language:
- en
datasets:
- togethercomputer/RedPajama-Data-1T
---

# GGML converted versions of [Together](https://huggingface.co/togethercomputer)'s RedPajama models

# RedPajama-INCITE-7B-Base

RedPajama-INCITE-7B-Base was developed by Together and leaders from the open-source AI community, including Ontocord.ai, ETH DS3Lab, AAI CERC, Université de Montréal, MILA - Québec AI Institute, Stanford Center for Research on Foundation Models (CRFM), Stanford Hazy Research research group and LAION.

The training was done on 3,072 V100 GPUs provided as part of the INCITE 2023 project on Scalable Foundation Models for Transferrable Generalist AI, awarded to MILA, LAION, and EleutherAI in fall 2022, with support from the Oak Ridge Leadership Computing Facility (OLCF) and the INCITE program.

- Base Model: [RedPajama-INCITE-7B-Base](https://huggingface.co/togethercomputer/RedPajama-INCITE-7B-Base)
- Instruction-tuned Version: [RedPajama-INCITE-7B-Instruct](https://huggingface.co/togethercomputer/RedPajama-INCITE-7B-Instruct)
- Chat Version: [RedPajama-INCITE-7B-Chat](https://huggingface.co/togethercomputer/RedPajama-INCITE-7B-Chat)

## Model Details
- **Developed by**: Together Computer.
- **Model type**: Language Model
- **Language(s)**: English
- **License**: Apache 2.0
- **Model Description**: A 6.9B parameter pretrained language model.

## Converted Models:

| Name | Based on | Type | Container | GGML Version |
|:-----|:---------|:-----|:----------|:-------------|
| [RedPajama-INCITE-7B-Base-f16.bin](https://huggingface.co/rustformers/redpajama-7b-ggml/blob/main/RedPajama-INCITE-7B-Base-f16.bin) | [togethercomputer/RedPajama-INCITE-7B-Base](https://huggingface.co/togethercomputer/RedPajama-INCITE-7B-Base) | F16 | GGML | V3 |
| [RedPajama-INCITE-7B-Base-q4_0.bin](https://huggingface.co/rustformers/redpajama-7b-ggml/blob/main/RedPajama-INCITE-7B-Base-q4_0.bin) | [togethercomputer/RedPajama-INCITE-7B-Base](https://huggingface.co/togethercomputer/RedPajama-INCITE-7B-Base) | Q4_0 | GGML | V3 |
| [RedPajama-INCITE-7B-Base-q4_0-ggjt.bin](https://huggingface.co/rustformers/redpajama-7b-ggml/blob/main/RedPajama-INCITE-7B-Base-q4_0-ggjt.bin) | [togethercomputer/RedPajama-INCITE-7B-Base](https://huggingface.co/togethercomputer/RedPajama-INCITE-7B-Base) | Q4_0 | GGJT | V3 |
| [RedPajama-INCITE-7B-Base-q5_1-ggjt.bin](https://huggingface.co/rustformers/redpajama-7b-ggml/blob/main/RedPajama-INCITE-7B-Base-q5_1-ggjt.bin) | [togethercomputer/RedPajama-INCITE-7B-Base](https://huggingface.co/togethercomputer/RedPajama-INCITE-7B-Base) | Q5_1 | GGJT | V3 |
| [RedPajama-INCITE-7B-Chat-f16.bin](https://huggingface.co/rustformers/redpajama-7b-ggml/blob/main/RedPajama-INCITE-7B-Chat-f16.bin) | [togethercomputer/RedPajama-INCITE-7B-Chat](https://huggingface.co/togethercomputer/RedPajama-INCITE-7B-Chat) | F16 | GGML | V3 |
| [RedPajama-INCITE-7B-Chat-q4_0.bin](https://huggingface.co/rustformers/redpajama-7b-ggml/blob/main/RedPajama-INCITE-7B-Chat-q4_0.bin) | [togethercomputer/RedPajama-INCITE-7B-Chat](https://huggingface.co/togethercomputer/RedPajama-INCITE-7B-Chat) | Q4_0 | GGML | V3 |
| [RedPajama-INCITE-7B-Chat-q4_0-ggjt.bin](https://huggingface.co/rustformers/redpajama-7b-ggml/blob/main/RedPajama-INCITE-7B-Chat-q4_0-ggjt.bin) | [togethercomputer/RedPajama-INCITE-7B-Chat](https://huggingface.co/togethercomputer/RedPajama-INCITE-7B-Chat) | Q4_0 | GGJT | V3 |
| [RedPajama-INCITE-7B-Chat-q5_1-ggjt.bin](https://huggingface.co/rustformers/redpajama-7b-ggml/blob/main/RedPajama-INCITE-7B-Chat-q5_1-ggjt.bin) | [togethercomputer/RedPajama-INCITE-7B-Chat](https://huggingface.co/togethercomputer/RedPajama-INCITE-7B-Chat) | Q5_1 | GGJT | V3 |
| [RedPajama-INCITE-7B-Instruct-f16.bin](https://huggingface.co/rustformers/redpajama-7b-ggml/blob/main/RedPajama-INCITE-7B-Instruct-f16.bin) | [togethercomputer/RedPajama-INCITE-7B-Instruct](https://huggingface.co/togethercomputer/RedPajama-INCITE-7B-Instruct) | F16 | GGML | V3 |
| [RedPajama-INCITE-7B-Instruct-q4_0.bin](https://huggingface.co/rustformers/redpajama-7b-ggml/blob/main/RedPajama-INCITE-7B-Instruct-q4_0.bin) | [togethercomputer/RedPajama-INCITE-7B-Instruct](https://huggingface.co/togethercomputer/RedPajama-INCITE-7B-Instruct) | Q4_0 | GGML | V3 |
| [RedPajama-INCITE-7B-Instruct-q4_0-ggjt.bin](https://huggingface.co/rustformers/redpajama-7b-ggml/blob/main/RedPajama-INCITE-7B-Instruct-q4_0-ggjt.bin) | [togethercomputer/RedPajama-INCITE-7B-Instruct](https://huggingface.co/togethercomputer/RedPajama-INCITE-7B-Instruct) | Q4_0 | GGJT | V3 |
| [RedPajama-INCITE-7B-Instruct-q5_1-ggjt.bin](https://huggingface.co/rustformers/redpajama-7b-ggml/blob/main/RedPajama-INCITE-7B-Instruct-q5_1-ggjt.bin) | [togethercomputer/RedPajama-INCITE-7B-Instruct](https://huggingface.co/togethercomputer/RedPajama-INCITE-7B-Instruct) | Q5_1 | GGJT | V3 |
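
If you prefer to pick a file programmatically rather than reading the table, the Hugging Face Hub client can list this repository's contents. This is a minimal sketch (not part of the original card) using the standard `huggingface_hub` package:

```python
# Sketch (assumes `pip install huggingface_hub`): list the converted files in this repo.
from huggingface_hub import list_repo_files

for name in list_repo_files("rustformers/redpajama-7b-ggml"):
    if name.endswith(".bin"):
        print(name)
```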

## Usage

### Python via [llm-rs](https://github.com/LLukas22/llm-rs-python):

#### Installation
Via pip: `pip install llm-rs`

#### Run inference
```python
from llm_rs import AutoModel

# Load the model; any file from the table above can be passed as `model_file`
model = AutoModel.from_pretrained("rustformers/redpajama-7b-ggml", model_file="RedPajama-INCITE-7B-Base-q4_0-ggjt.bin")

# Generate text
print(model.generate("The meaning of life is"))
```
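
Generation can also be tuned. The snippet below is only a sketch: it assumes the installed llm-rs version exposes a `GenerationConfig` with common sampling fields such as `temperature`, `top_p`, and `max_new_tokens`; check the llm-rs documentation for the exact names in your version.

```python
from llm_rs import AutoModel, GenerationConfig  # assumption: GenerationConfig is exported by your llm-rs version

# Assumed field names; adjust to whatever your llm-rs version actually accepts.
generation_config = GenerationConfig(temperature=0.8, top_p=0.9, max_new_tokens=128)

model = AutoModel.from_pretrained("rustformers/redpajama-7b-ggml", model_file="RedPajama-INCITE-7B-Base-q4_0-ggjt.bin")
print(model.generate("The meaning of life is", generation_config=generation_config))
```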

### Using [local.ai](https://github.com/louisgv/local.ai) GUI

#### Installation
Download the installer at [www.localai.app](https://www.localai.app/).

#### Running Inference
Download your preferred model and place it in the "models" directory. You can then start a chat session with your model directly from the interface.
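
If you would rather script the download than use a browser, the `huggingface_hub` package can fetch a single converted file. This is a sketch (the filename is just one entry from the table above); the returned cache path can then be copied into local.ai's "models" directory or passed to the Rust CLI below.

```python
# Sketch: download one converted file with huggingface_hub (pip install huggingface_hub).
from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="rustformers/redpajama-7b-ggml",
    filename="RedPajama-INCITE-7B-Chat-q4_0-ggjt.bin",  # pick any filename from the table above
)
print(path)  # local path to the downloaded .bin file
```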

### Rust via [Rustformers/llm](https://github.com/rustformers/llm):

#### Installation
```
git clone --recurse-submodules https://github.com/rustformers/llm.git
cd llm
cargo build --release
```

#### Run inference
```
cargo run --release -- gptneox infer -m path/to/model.bin -p "Tell me how cool the Rust programming language is:"
```
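
The `gptneox` subcommand is used here because the RedPajama-INCITE models follow the GPT-NeoX architecture; point `-m` at whichever converted `.bin` file you downloaded (for example, the path printed by the download snippet above).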
|