Update README.md
README.md CHANGED

@@ -129,7 +129,7 @@ Following Mitchell et al. (2018), we provide a model card for GPT-SW3.
 - Model type: GPT-SW3 is a large decoder-only transformer language model.
 - Information about training algorithms, parameters, fairness constraints or other applied approaches, and features: GPT-SW3 was trained with the NeMo Megatron GPT implementation.
 - Paper or other resource for more information: N/A.
-- License: [LICENSE](https://huggingface.co/AI-Sweden-Models/gpt-sw3-6.7b-v2-instruct/
+- License: [LICENSE](https://huggingface.co/AI-Sweden-Models/gpt-sw3-6.7b-v2-instruct/blob/main/LICENSE).
 - Where to send questions or comments about the model: nlu@ai.se

 # Intended Use
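The change above completes the License link, which previously ended at the repo root. A minimal sketch of the corrected link target (variable names are hypothetical, for illustration only):

```python
# Sketch of the link fix in this commit: the old License entry stopped at
# the repository root URL; the fix appends the path to the LICENSE file.
repo_root = "https://huggingface.co/AI-Sweden-Models/gpt-sw3-6.7b-v2-instruct/"
license_url = repo_root + "blob/main/LICENSE"
print(license_url)  # the corrected target of the [LICENSE](...) link
```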