MoCLE Model Card

MoCLE is a Multimodal Large Language Model (MLLM) with a Mixture-of-Experts (MoE) architecture for instruction customization and generalization, built on InstructBLIP. This repo contains the MoCLE checkpoint (`KaiChen1998/mocle-c64-t01`) with 64 instruction clusters and a routing temperature of 0.1. See detailed usage instructions in our GitHub repo and on our website.
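
Since the checkpoint is hosted on the Hugging Face Hub, one way to fetch the weights is with `huggingface_hub`. This is a minimal sketch only; the actual model-loading code follows the MoCLE GitHub repo, and the variable names below are illustrative:

```python
# Minimal sketch (not the authors' official usage): download the checkpoint
# files for the 64-cluster, temperature-0.1 variant from the Hub. How the
# weights are then loaded into a model is defined by the MoCLE codebase.
from huggingface_hub import snapshot_download

ckpt_dir = snapshot_download(repo_id="KaiChen1998/mocle-c64-t01")
print(f"Checkpoint files downloaded to: {ckpt_dir}")
```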
