---
license: cc-by-nc-sa-4.0
---

# NeuralCorso-7B


This model is a merge of `macadeliccc/MBX-7B-v3-DPO` and `mlabonne/OmniNeuralBeagle-7B`.
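The card does not state how the merge was performed. Merges of this kind are commonly produced with mergekit; below is a hypothetical SLERP configuration as an illustration only. The merge method, `layer_range`, interpolation weights, and base model are all assumptions, not the actual recipe used for this model.

```yaml
# Hypothetical mergekit config (SLERP); not the actual recipe for NeuralCorso-7B
slices:
  - sources:
      - model: macadeliccc/MBX-7B-v3-DPO
        layer_range: [0, 32]
      - model: mlabonne/OmniNeuralBeagle-7B
        layer_range: [0, 32]
merge_method: slerp
base_model: macadeliccc/MBX-7B-v3-DPO
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]  # assumed per-layer blend for attention
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]  # assumed per-layer blend for MLP
    - value: 0.5                    # default blend for everything else
dtype: bfloat16
```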

## Code Example

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("macadeliccc/NeuralCorso-7B")
model = AutoModelForCausalLM.from_pretrained("macadeliccc/NeuralCorso-7B")

messages = [
    {"role": "system", "content": "Respond to the user's request like a pirate"},
    {"role": "user", "content": "Can you write me a quicksort algorithm?"}
]

# Build the prompt with the model's chat template, then generate a reply
gen_input = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)
output = model.generate(gen_input, max_new_tokens=256)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```