---
{}
---

# Model Card for Extended-Mind-MPT-7b-Chat

Extended Mind MPT-7b-chat, as described in [Supersizing Transformers](https://blog.normalcomputing.ai/posts/2023-09-12-supersizing-transformers/supersizing-transformers.html).

### Model Description

This model implements active externalism for MPT's 7b chat model. The model weights have not been edited; the original architecture and code are by Mosaic ML. For more details on active externalism, check out our [blog](https://blog.normalcomputing.ai/posts/2023-09-12-supersizing-transformers/supersizing-transformers.html)!

- **Developed by:** [Normal Computing](https://huggingface.co/normalcomputing), adapted from [Mosaic ML](https://huggingface.co/mosaicml)
- **License:** Apache 2.0

## Limitations

This model is part of ongoing research at Normal Computing.
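For intuition, the core retrieval idea behind active externalism, selecting which external memories the model attends to based on similarity to the current query, can be sketched in plain Python. This is a toy illustration under assumed simplifications (flat vectors, cosine similarity, string-valued memories), not the model's actual implementation; the real model retrieves cached key-value pairs inside its attention layers.

```python
import math

def retrieve_memories(query, memory_keys, memory_values, top_k=2):
    """Toy sketch: return the top_k external memories whose keys are
    most similar (by cosine similarity) to the query vector."""

    def cosine(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        norm_a = math.sqrt(sum(x * x for x in a))
        norm_b = math.sqrt(sum(x * x for x in b))
        return dot / (norm_a * norm_b)

    # Rank memory indices by similarity to the query, highest first.
    ranked = sorted(
        range(len(memory_keys)),
        key=lambda i: cosine(query, memory_keys[i]),
        reverse=True,
    )
    top = ranked[:top_k]
    return [memory_keys[i] for i in top], [memory_values[i] for i in top]
```

For example, with keys `[[1, 0], [0, 1], [1, 1]]` and values `["a", "b", "c"]`, a query of `[1, 0.1]` retrieves `["a", "c"]`: the memories whose keys point most nearly in the query's direction.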