
Training details

  • Dataset used: explanation-style datasets from psmathur/WizardLM_Orca and Dahoas/cot_gsm8k
  • Techniques: fp16 precision training + LoRA + DeepSpeed (see the sketch below)
  • Machine: 2 x V100 (16 GB)
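
To make the setup concrete, here is a minimal sketch of how a LoRA adapter is typically attached to a causal LM with peft before fp16/DeepSpeed training. The base model name and the LoRA hyperparameters (r, alpha, dropout) are illustrative assumptions, not the exact values used for this checkpoint.

from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM

# Base model name is an assumption for illustration; the actual base is recorded
# in this repo's adapter_config.json under base_model_name_or_path.
base = AutoModelForCausalLM.from_pretrained("togethercomputer/RedPajama-INCITE-Base-3B-v1")

# Illustrative LoRA hyperparameters (not taken from this model's training run)
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)
model = get_peft_model(base, lora_config)
model.print_trainable_parameters()  # only the adapter weights are trainable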

Inference


from peft import PeftModel
from huggingface_hub import hf_hub_download
from transformers import GPTNeoForCausalLM, AutoTokenizer
import json

model_name = "shahules786/Redpajama-3B-orcastyle"
config = hf_hub_download(repo_id=model_name, filename="adapter_config.json", local_dir=".")
config =  json.load(open("adapter_config.json"))
base_model = config["base_model_name_or_path"]
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = GPTNeoForCausalLM.from_pretrained(base_model)
model.resize_token_embeddings(len(self.tokenizer))
model = PeftModel.from_pretrained(model, model_name).eval()
tokenizer.padding_side = "left"

inputs = tokenizer("This is a sample run", return_tensors="pt")
outputs = model.generate(**inputs)
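
The ids returned by generate can be decoded back to text; a minimal, illustrative follow-up:

print(tokenizer.decode(outputs[0], skip_special_tokens=True))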

Check out the training and inference code here
