---
# For reference on model card metadata, see the spec: https://github.com/huggingface/hub-docs/blob/main/modelcard.md?plain=1
# Doc / guide: https://huggingface.co/docs/hub/model-cards
{}
---
# Model Card for Extended-Mind-MPT-7b-Chat
<!-- Provide a quick summary of what the model is/does. -->
Extended Mind MPT-7b-chat, as described in [Supersizing Transformers](https://blog.normalcomputing.ai/posts/2023-09-12-supersizing-transformers/supersizing-transformers.html).
### Model Description
<!-- Provide a longer summary of what this model is. -->
This model implements active externalism for MPT's 7b chat model. The model weights have not been edited; the original architecture and code are by Mosaic ML.
For more details on active externalism, check out our [blog](https://blog.normalcomputing.ai/posts/2023-09-12-supersizing-transformers/supersizing-transformers.html)!
- **Developed by:** [Normal Computing](https://huggingface.co/normalcomputing), adapted from [Mosaic ML](https://huggingface.co/mosaicml)
- **License:** Apache 2.0
## Limitations
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
This model is part of ongoing research at Normal Computing.