
This model was the result of a 50/50 average weight merge between Airoboros-33B-1.4 and Chronos-33B.
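
For reference, this kind of 50/50 average weight merge can be reproduced with a few lines of PyTorch. The sketch below is illustrative and assumes both checkpoints share the same LLaMA-33B architecture and parameter names; the repository paths are placeholders, not the exact sources used for this model.

```python
import torch
from transformers import AutoModelForCausalLM

# Placeholder paths for the two source checkpoints (not the exact repo IDs).
AIROBOROS = "path/to/Airoboros-33B-1.4"
CHRONOS = "path/to/Chronos-33B"

model_a = AutoModelForCausalLM.from_pretrained(AIROBOROS, torch_dtype=torch.float16)
model_b = AutoModelForCausalLM.from_pretrained(CHRONOS, torch_dtype=torch.float16)

# Average every parameter tensor 50/50.
state_b = model_b.state_dict()
merged = {name: (tensor + state_b[name]) / 2
          for name, tensor in model_a.state_dict().items()}

model_a.load_state_dict(merged)
model_a.save_pretrained("chronoboros-33B")
```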

After prolonged testing we concluded that, while this merge is highly flexible and capable of many different tasks, it has too much variation in how it answers to be reliable. Because of this, the model relies on some luck to produce good results, and it is therefore not recommended for people seeking a consistent experience, or for people sensitive to anticipation-based addictions.

If you would like an improved, more stable version of this model, check out my Airochronos-33B merge.
