# KoGPT2-Transformers

KoGPT2 on Hugging Face Transformers
This repository makes KoGPT2 (ver 1.0), released by SKT-AI, available through Hugging Face Transformers.

Note: SKT-AI has since released KoGPT2 2.0: https://huggingface.co/skt/kogpt2-base-v2/
## Demo

- Casual conversation chatbot: http://demo.tmkor.com:36200/dialo
- Cosmetics review generation: http://demo.tmkor.com:36200/ctrl
## Example

```python
from transformers import GPT2LMHeadModel, PreTrainedTokenizerFast

# Load the model and its tokenizer from the Hugging Face Hub
model = GPT2LMHeadModel.from_pretrained("taeminlee/kogpt2")
tokenizer = PreTrainedTokenizerFast.from_pretrained("taeminlee/kogpt2")

# Encode a Korean prompt ("안녕" = "hello")
input_ids = tokenizer.encode("안녕", add_special_tokens=False, return_tensors="pt")

# Sample three continuations of up to 100 tokens each
output_sequences = model.generate(input_ids=input_ids, do_sample=True, max_length=100, num_return_sequences=3)

for generated_sequence in output_sequences:
    generated_sequence = generated_sequence.tolist()
    print("GENERATED SEQUENCE : {0}".format(tokenizer.decode(generated_sequence, clean_up_tokenization_spaces=True)))
```
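Passing `do_sample=True` makes `generate()` draw each next token from the model's probability distribution rather than taking the greedy argmax. A minimal, self-contained sketch of the kind of temperature and top-k filtering this involves (the dummy logits and the helper name `sample_next_token` are illustrative, not part of the Transformers API):

```python
import math
import random

def sample_next_token(logits, temperature=1.0, top_k=5):
    """Toy illustration of temperature + top-k filtered sampling."""
    scaled = [l / temperature for l in logits]
    # Keep only the top_k highest-scoring token ids
    top = sorted(range(len(scaled)), key=lambda i: scaled[i], reverse=True)[:top_k]
    # Softmax over the surviving logits (shifted by the max for stability)
    m = max(scaled[i] for i in top)
    weights = [math.exp(scaled[i] - m) for i in top]
    # Draw one token id in proportion to its probability
    return random.choices(top, weights=weights, k=1)[0]

# Dummy logits for a 10-token vocabulary
logits = [2.0, 1.5, 0.3, -1.0, 4.0, 0.0, 3.2, -0.5, 1.1, 0.7]
print(sample_next_token(logits, temperature=0.8, top_k=3))  # one of the token ids 4, 6, 0
```

Lower `temperature` sharpens the distribution toward the top-scoring tokens, while smaller `top_k` restricts how many candidates can be sampled at all; both map directly onto the `temperature` and `top_k` arguments of `model.generate()`.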