---
license: apache-2.0
tags:
- moe
- merge
- mergekit
- lazymergekit
- phi3_mergekit
- microsoft/Phi-3-medium-128k-instruct
base_model:
- microsoft/Phi-3-medium-128k-instruct
- microsoft/Phi-3-medium-128k-instruct
---
# Phi3Mix

Phi3Mix is a Mixture of Experts (MoE) made with the following models using Phi3_LazyMergekit:
* [microsoft/Phi-3-medium-128k-instruct](https://huggingface.co/microsoft/Phi-3-medium-128k-instruct)
* [microsoft/Phi-3-medium-128k-instruct](https://huggingface.co/microsoft/Phi-3-medium-128k-instruct)
## 🧩 Configuration

```yaml
base_model: microsoft/Phi-3-medium-128k-instruct
gate_mode: cheap_embed
experts_per_token: 1
dtype: float16
experts:
  - source_model: microsoft/Phi-3-medium-128k-instruct
    positive_prompts: ["research, logic, math, science"]
  - source_model: microsoft/Phi-3-medium-128k-instruct
    positive_prompts: ["creative, art"]
```
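In this config, `gate_mode: cheap_embed` initializes the router from token embeddings of each expert's `positive_prompts`, and `experts_per_token: 1` routes every token to a single expert. Below is a minimal, hypothetical sketch of that top-1 routing step in PyTorch; the function name and tensor shapes are illustrative and not mergekit's actual implementation.

```python
import torch

def route_top1(hidden_states: torch.Tensor, gate_weight: torch.Tensor):
    """Toy top-1 MoE routing (experts_per_token: 1).

    hidden_states: (seq_len, hidden_dim) token representations
    gate_weight:   (num_experts, hidden_dim) router matrix, e.g. initialized
                   from embeddings of the positive_prompts (cheap_embed)
    Returns the chosen expert index and its gate probability per token.
    """
    logits = hidden_states @ gate_weight.T        # (seq_len, num_experts)
    probs = torch.softmax(logits, dim=-1)
    score, expert_idx = probs.max(dim=-1)         # pick one expert per token
    return expert_idx, score

# Illustrative shapes only: 2 experts, hidden size 8, 5 tokens
hidden = torch.randn(5, 8)
gate = torch.randn(2, 8)
idx, score = route_top1(hidden, gate)
print(idx, score)
```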
## 💻 Usage

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "DarqueDante/Phi3Mix"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # matches the merge dtype
    trust_remote_code=True,
)

prompt = "How many continents are there?"
# Phi-3 chat format: each turn is closed with <|end|>
text = f"<|system|>You are a helpful AI assistant.<|end|><|user|>{prompt}<|end|><|assistant|>"

input_ids = tokenizer.encode(text, return_tensors="pt")
outputs = model.generate(
    input_ids,
    max_new_tokens=128,
    do_sample=True,
    temperature=0.7,
    top_k=50,
    top_p=0.95,
)
print(tokenizer.decode(outputs[0]))
```
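If the bundled tokenizer ships a chat template, `tokenizer.apply_chat_template` builds the same prompt without hand-writing the special tokens. A short sketch, assuming the template is present and reusing the `tokenizer` and `model` from above:

```python
messages = [
    {"role": "system", "content": "You are a helpful AI assistant."},
    {"role": "user", "content": "How many continents are there?"},
]
# add_generation_prompt=True appends the assistant tag so the model answers
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)
outputs = model.generate(input_ids, max_new_tokens=128, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```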