😈 Imp

Introduction

To fit the MLC framework for mobile devices, we further apply 4-bit quantization to Imp-v1.5-3B-196, obtaining Imp-v1.5-3B-196-q4f16_1-MLC.

To use this model on mobile devices, please refer to the mlc-imp project.

License

This project is licensed under the Apache License 2.0 - see the LICENSE file for details.

Citation

If you use our model or refer to our work in your studies, please cite:

@article{imp2024,
  title={Imp: Highly Capable Large Multimodal Models for Mobile Devices},
  author={Shao, Zhenwei and Yu, Zhou and Yu, Jun and Ouyang, Xuecheng and Zheng, Lihao and Gai, Zhenbiao and Wang, Mingyang and Ding, Jiajun},
  journal={arXiv preprint arXiv:2405.12107},
  year={2024}
}