---
library_name: transformers
tags:
- pytorch
- llama3
license: llama3
datasets:
- ruslanmv/ai-medical-chatbot
language:
- en
metrics:
- accuracy
---
# Model Card for Model ID

## Model Details

### Model Description
This is the model card of a 🤗 Transformers model that has been pushed to the Hub. This model card has been automatically generated.

- Developed by: ManhTien (Money)
- Funded by: [More Information Needed]
- Shared by: [More Information Needed]
- Model type: Causal language model (Llama 3)
- Language(s) (NLP): English
- License: llama3
- Finetuned from model: meta-llama/Meta-Llama-3.1-8B-Instruct
## Usage

Here is an example of how to use the `tien007/llama-3-8b-sft` model with Hugging Face's Transformers library:
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the model and tokenizer
model_id = "tien007/llama-3-8b-sft"
model = AutoModelForCausalLM.from_pretrained(model_id, ignore_mismatched_sizes=True, device_map="auto")
tokenizer = AutoTokenizer.from_pretrained(model_id)

# Define input messages
messages = [{"role": "user", "content": "How to effectively treat cancer?"}]

# Prepare the prompt with the model's chat template
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)

# Tokenize the input and move it to the model's device
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Generate a response; max_new_tokens bounds the length of the reply
# itself, whereas max_length would also count the prompt tokens
outputs = model.generate(**inputs, max_new_tokens=150)

# Decode only the newly generated tokens, skipping the prompt
text = tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True)

# Print the response
print(text)
```
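If you decode the entire output sequence (prompt included), the assistant reply can be separated from the prompt with a small string helper. The sketch below assumes the chat template renders an `assistant` role marker immediately before the reply; `extract_reply` is a hypothetical helper, not part of the Transformers API:

```python
# Hypothetical helper (not part of Transformers): separates the assistant's
# reply from a decoded sequence that still contains the prompt, assuming the
# chat template places an "assistant" role marker before the reply.
def extract_reply(decoded: str, marker: str = "assistant") -> str:
    """Return the text after the last occurrence of the role marker."""
    _, found, reply = decoded.rpartition(marker)
    return reply.strip() if found else decoded.strip()

# Example with a template-like decoded string
decoded = "user\nHow to effectively treat cancer?\nassistant\nTreatment depends on the cancer type."
print(extract_reply(decoded))  # → Treatment depends on the cancer type.
```

Using `rpartition` rather than `split` keeps the helper robust when the word "assistant" also appears inside the user's question.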
## Recommendations

## Results

### Summary

## Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).
- Hardware Type: [More Information Needed]
- Hours used: [More Information Needed]
- Cloud Provider: [More Information Needed]
- Compute Region: [More Information Needed]
- Carbon Emitted: [More Information Needed]