
hongyin/chat-mistral-7b-80k

I am pleased to introduce an English-Chinese conversation assistant designed to reduce the cost of inference. It is trained on top of Mistral-7B-Instruct, with a unique vocabulary and 7 billion parameters. A minimal loading sketch is shown below.
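
A minimal sketch of loading the model with the Hugging Face transformers library. The prompt format and dtype/device settings here are assumptions, not documented behavior of this repository; check the repository files for the actual chat template.

# Minimal sketch: load hongyin/chat-mistral-7b-80k with transformers.
# Assumptions: the repo ships a standard tokenizer and causal LM config;
# the "Human:/Assistant:" prompt format below is hypothetical.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "hongyin/chat-mistral-7b-80k"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

prompt = "Human: How do I brew green tea?\nAssistant:"  # hypothetical format
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))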

Losing fat is the only way to solve all problems.


BibTeX entry and citation info

Please cite this work if you find it helpful.

@article{zhu2023metaaid,
  title={MetaAID 2.0: An Extensible Framework for Developing Metaverse Applications via Human-controllable Pre-trained Models},
  author={Zhu, Hongyin},
  journal={arXiv preprint arXiv:2302.13173},
  year={2023}
}

license: other
