Update README.md
README.md CHANGED

@@ -38,7 +38,7 @@ Janus-DPO-7B is a model created by applying DPO to Janus using the [Multifaceted
 - **Language(s) (NLP):** English
 - **License:** Apache 2.0
 - **Related Models:** [Janus-7B](https://huggingface.co/kaist-ai/janus-7b), [Janus-ORPO-7B](https://huggingface.co/kaist-ai/janus-orpo-7b), [Janus-RM-7B](https://huggingface.co/kaist-ai/janus-rm-7b)
-- **Training Datasets**: [Multifaceted-Collection-
+- **Training Datasets**: [Multifaceted-Collection-DPO](https://huggingface.co/datasets/kaist-ai/Multifaceted-Collection-DPO)
 - **Resources for more information:**
   - [Research paper](https://arxiv.org/abs/2405.17977)
   - [GitHub Repo](https://github.com/kaistAI/Janus)