
Well, nothing much. Just a test model, maybe a good one. You can use the GGUF quants from my friend, he makes many quants :)

https://huggingface.co/mradermacher/Inixion-2x8B-GGUF

Thanks for the AWQ version, I didn't know that my model had an AWQ version too:

https://huggingface.co/solidrust/Inixion-2x8B-AWQ

Model size: 13.7B params (Safetensors, BF16)