
Model Info

GPT-2 trained from scratch on the wikimedia/wikipedia dataset (20231101.en).

No evaluation results are reported for this model.

Training hyperparameters:

context_length = 128
per_device_train_batch_size = 32
logging_steps = 5_000
gradient_accumulation_steps = 8
num_train_epochs = 1
weight_decay = 0.1
warmup_steps = 1_000
lr_scheduler_type = "cosine"
learning_rate = 5e-4
save_steps = 5_000
fp16 = True
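
These settings correspond to a Hugging Face Trainer run. The sketch below shows one way such a run could be wired up; it is an assumption, not the exact training script. In particular the base gpt2 tokenizer, the output path, and the dataset preprocessing are guesses.

from datasets import load_dataset
from transformers import (
    AutoTokenizer, GPT2Config, GPT2LMHeadModel,
    Trainer, TrainingArguments, DataCollatorForLanguageModeling,
)

tokenizer = AutoTokenizer.from_pretrained("gpt2")  # assumed: standard GPT-2 tokenizer
tokenizer.pad_token = tokenizer.eos_token

# Fresh GPT-2 base (~124M params); n_positions matches the 128-token context
model = GPT2LMHeadModel(GPT2Config(n_positions=128))

# Truncate Wikipedia articles to 128-token training examples (assumed preprocessing)
raw = load_dataset("wikimedia/wikipedia", "20231101.en", split="train")

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

train_dataset = raw.map(tokenize, batched=True, remove_columns=raw.column_names)

args = TrainingArguments(
    output_dir="gpt2_wikipedia_en",  # assumed output path
    per_device_train_batch_size=32,
    logging_steps=5_000,
    gradient_accumulation_steps=8,
    num_train_epochs=1,
    weight_decay=0.1,
    warmup_steps=1_000,
    lr_scheduler_type="cosine",
    learning_rate=5e-4,
    save_steps=5_000,
    fp16=True,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()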

How to use

from transformers import GPT2LMHeadModel, AutoTokenizer

model = GPT2LMHeadModel.from_pretrained("J4bb4wukis/gpt2_wikipedia_en")
tokenizer = AutoTokenizer.from_pretrained("J4bb4wukis/gpt2_wikipedia_en")

# Tokenize a prompt and sample a 100-token continuation
prompt = "In 1995 a police officer "
inputs = tokenizer(prompt, return_tensors="pt").input_ids
outputs = model.generate(
    inputs,
    max_new_tokens=100,
    do_sample=True,
    top_k=10,
    top_p=0.95,
    pad_token_id=tokenizer.eos_token_id,  # avoids the missing-pad-token warning
)
print(tokenizer.batch_decode(outputs, skip_special_tokens=True))
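
Note that training used a 128-token context, so generation quality may degrade as sequences approach that length. top_k=10 and top_p=0.95 restrict sampling to the highest-probability tokens; raise top_k for more varied output.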
Model size: 124M params
Tensor type: F32
Format: Safetensors

Dataset used to train J4bb4wukis/gpt2_wikipedia_en

wikimedia/wikipedia (20231101.en)