---
base_model:
- deepseek-ai/DeepSeek-R1-Distill-Llama-8B
---

```python
import transformers
import torch

model_id = "miike-ai/r1-12b"

# Build a text-generation pipeline in bfloat16 and let Accelerate place the
# model across the available devices.
pipeline = transformers.pipeline(
    "text-generation",
    model=model_id,
    model_kwargs={"torch_dtype": torch.bfloat16},
    device_map="auto",
)

messages = [
    {"role": "system", "content": "You are a pirate chatbot who always responds in pirate speak!"},
    {"role": "user", "content": "Who are you?"},
]

outputs = pipeline(
    messages,
    max_new_tokens=8192,
)

# The returned conversation includes the prompt; the last message is the
# assistant's reply.
print(outputs[0]["generated_text"][-1])
```
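
For finer control over tokenization and generation, the same chat can be run without the pipeline helper. The snippet below is a minimal sketch using the standard `AutoTokenizer`/`AutoModelForCausalLM` APIs; it assumes the checkpoint ships a chat template in its tokenizer config, and the `max_new_tokens` value is only illustrative.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "miike-ai/r1-12b"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

messages = [
    {"role": "system", "content": "You are a pirate chatbot who always responds in pirate speak!"},
    {"role": "user", "content": "Who are you?"},
]

# Render the messages with the model's chat template and tokenize them.
input_ids = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,
    return_tensors="pt",
).to(model.device)

# Generate, then decode only the newly produced tokens.
output_ids = model.generate(input_ids, max_new_tokens=512)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```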