bleysg committed
Commit 66fc9b6
1 Parent(s): bf74c17

Update README.md

Files changed (1):
  1. README.md +10 -0
README.md CHANGED
@@ -29,6 +29,8 @@ It is the same subset of our data as was used in our [OpenOrcaxOpenChat-Preview2
  This release provides a first: a fully open model with class-breaking performance, capable of running fully accelerated on even moderate consumer GPUs.
  Our thanks to the Mistral team for leading the way here.
 
+ We affectionately codename this model: "*MistralOrca*"
+
  Want to visualize our full (pre-filtering) dataset? Check out our [Nomic Atlas Map](https://atlas.nomic.ai/map/c1b88b47-2d9b-47e0-9002-b80766792582/2560fd25-52fe-42f1-a58f-ff5eccc890d2).
 
  [<img src="https://huggingface.co/Open-Orca/OpenOrca-Preview1-13B/resolve/main/OpenOrca%20Nomic%20Atlas.png" alt="Atlas Nomic Dataset Map" width="400" height="400" />](https://atlas.nomic.ai/map/c1b88b47-2d9b-47e0-9002-b80766792582/2560fd25-52fe-42f1-a58f-ff5eccc890d2)
@@ -117,6 +119,14 @@ Commodity cost was ~$400.
  # Citation
 
  ```bibtex
+ @software{lian2023mistralorca1,
+ title = {MistralOrca: Mistral-7B Model Instruct-tuned on Filtered OpenOrcaV1 GPT-4 Dataset},
+ author = {Wing Lian and Bleys Goodson and Guan Wang and Eugene Pentland and Austin Cook and Chanvichet Vong and "Teknium"},
+ year = {2023},
+ publisher = {HuggingFace},
+ journal = {HuggingFace repository},
+ howpublished = {\url{https://huggingface.co/Open-Orca/Mistral-7B-OpenOrca}},
+ }
  @misc{mukherjee2023orca,
  title={Orca: Progressive Learning from Complex Explanation Traces of GPT-4},
  author={Subhabrata Mukherjee and Arindam Mitra and Ganesh Jawahar and Sahaj Agarwal and Hamid Palangi and Ahmed Awadallah},