After the initial experiment with chronoboros-33B, it was evident that the merge was too unpredictable to be useful. Testing the individual models made it clear that the blend should be weighted towards Chronos. This new release of the merge is 75% Chronos 33B and 25% airoboros-1.4 33B.
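
For reference, a minimal sketch of such a weighted merge is shown below. It assumes a plain linear interpolation of the two base models' parameter tensors; the base-model repository ids and the tooling are assumptions for illustration, not a record of how this release was actually produced.

```python
import torch
from transformers import AutoModelForCausalLM

# Assumed repository ids for the two base models (not confirmed by the model card).
CHRONOS = "elinas/chronos-33b"
AIROBOROS = "jondurbin/airoboros-33b-gpt4-1.4"

CHRONOS_WEIGHT = 0.75
AIROBOROS_WEIGHT = 0.25

chronos = AutoModelForCausalLM.from_pretrained(CHRONOS, torch_dtype=torch.float16)
airoboros = AutoModelForCausalLM.from_pretrained(AIROBOROS, torch_dtype=torch.float16)

airo_state = airoboros.state_dict()
merged_state = {}
for name, tensor in chronos.state_dict().items():
    # Weighted average of each parameter tensor: 75% Chronos, 25% Airoboros.
    merged_state[name] = (
        CHRONOS_WEIGHT * tensor.float() + AIROBOROS_WEIGHT * airo_state[name].float()
    ).half()

chronos.load_state_dict(merged_state)
chronos.save_pretrained("airochronos-33B")
```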

The model has been tested with the Alpaca prompting format combined with KoboldAI Lite's instruct and chat modes, as well as regular story writing. It has also been tested on basic reasoning tasks, but has not seen much testing for factual information.
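
Loading the model and prompting it in the Alpaca format might look like the sketch below; the instruction text and sampling parameters are illustrative assumptions, not settings tested by the author.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Henk717/airochronos-33B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto", torch_dtype="auto")

# Standard Alpaca instruction template.
prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\nWrite a short scene set in a rainy harbor town.\n\n"
    "### Response:\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=300, do_sample=True, temperature=0.8)

# Print only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```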
