mosaicml/mpt-7b-8k · Text Generation
The MPT collection is a series of decoder-style transformer models trained from scratch by MosaicML. Details: https://www.mosaicml.com/mpt
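
As a minimal sketch, one of these checkpoints can be loaded with the Hugging Face `transformers` library roughly as follows (assuming `transformers` and `torch` are installed; because MPT ships its model code in the repository, `trust_remote_code=True` is needed):

```python
# Minimal sketch: loading mosaicml/mpt-7b-8k with Hugging Face transformers.
# Assumes `transformers` and `torch` are installed; MPT defines its model class
# in the repo itself, so trust_remote_code=True is required.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

name = "mosaicml/mpt-7b-8k"

tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForCausalLM.from_pretrained(
    name,
    torch_dtype=torch.bfloat16,   # half precision to keep the 7B weights manageable
    trust_remote_code=True,       # load the custom MPT model code from the repo
)

inputs = tokenizer("MosaicML's MPT models are", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```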