# Composite Qwen2.5-0.5B Model
This is a composite model created by combining layers from different Qwen2.5-0.5B variants.
## Usage
```python
from transformers import AutoConfig, AutoModelForCausalLM, AutoTokenizer

# Load the composite checkpoint, its configuration, and tokenizer
config = AutoConfig.from_pretrained("ant031525-01")
model = AutoModelForCausalLM.from_pretrained("ant031525-01")
tokenizer = AutoTokenizer.from_pretrained("ant031525-01")
```
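Once loaded, the model can be used like any other causal LM from `transformers`. The prompt and generation settings below are only illustrative:

```python
inputs = tokenizer("The capital of France is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```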
## Base Models
This model combines layers from the following models (a sketch of how such layer mixing could work follows the list):
- Qwen/Qwen2.5-0.5B
- Qwen/Qwen2.5-0.5B-Instruct
- unsloth/Qwen2.5-0.5B
- cognitivecomputations/Dolphin3.0-Qwen2.5-0.5B
- artificialguybr/Qwen2.5-0.5B-OpenHermes2.5
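The card does not state the exact merging recipe. As a rough illustration only, the sketch below shows one way decoder layers from two Qwen2.5-0.5B variants could be mixed by copying whole decoder blocks between checkpoints; the layer split and model pair are assumptions, not the actual procedure used here:

```python
# Hypothetical sketch: combine decoder layers from two Qwen2.5-0.5B variants.
from transformers import AutoModelForCausalLM

base = AutoModelForCausalLM.from_pretrained("Qwen/Qwen2.5-0.5B")
donor = AutoModelForCausalLM.from_pretrained("Qwen/Qwen2.5-0.5B-Instruct")

# Qwen2.5-0.5B has 24 decoder layers; replace the second half with the
# instruct variant's layers (illustrative split, not this card's recipe).
for i in range(12, 24):
    base.model.layers[i].load_state_dict(donor.model.layers[i].state_dict())

base.save_pretrained("composite-qwen2.5-0.5b")
```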