Tags: Text Generation · Transformers · PyTorch · Safetensors · gpt2 · conversational · text-generation-inference · Inference Endpoints
Ekgren committed
Commit: 4530a58
Parent: 3f722f5

Update README.md

Files changed (1):
  1. README.md (+1 -1)
README.md CHANGED
@@ -129,7 +129,7 @@ Following Mitchell et al. (2018), we provide a model card for GPT-SW3.
  - Model type: GPT-SW3 is a large decoder-only transformer language model.
  - Information about training algorithms, parameters, fairness constraints or other applied approaches, and features: GPT-SW3 was trained with the NeMo Megatron GPT implementation.
  - Paper or other resource for more information: N/A.
- - License: [LICENSE](https://huggingface.co/AI-Sweden-Models/gpt-sw3-6.7b-v2-instruct/edit/main/LICENSE).
+ - License: [LICENSE](https://huggingface.co/AI-Sweden-Models/gpt-sw3-6.7b-v2-instruct/blob/main/LICENSE).
  - Where to send questions or comments about the model: nlu@ai.se
 
  # Intended Use
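
For context, the repository this commit touches is a standard Hugging Face text-generation model (decoder-only transformer, per the model card lines above), so it can be loaded with the transformers library. A minimal sketch, not part of the commit itself: the model id is taken from the LICENSE URL in the diff, while the prompt and generation settings are illustrative assumptions.

```python
# Minimal usage sketch (assumptions: prompt and sampling settings are illustrative).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "AI-Sweden-Models/gpt-sw3-6.7b-v2-instruct"  # from the LICENSE URL in the diff
device = "cuda" if torch.cuda.is_available() else "cpu"

# Load tokenizer and model weights from the Hugging Face Hub.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.float16).to(device)

# Tokenize a prompt and generate a continuation.
prompt = "Once upon a time"
inputs = tokenizer(prompt, return_tensors="pt").to(device)
output_ids = model.generate(**inputs, max_new_tokens=50, do_sample=True, temperature=0.7)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```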