Usage

1. Base model

from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# Replace <personal read-only token> with your Hugging Face access token (read permission).
# Newer versions of transformers also accept token= in place of use_auth_token=.
tokenizer = AutoTokenizer.from_pretrained(
    "wisenut-nlp-team/KoT5",
    use_auth_token="<personal read-only token>",
)
model = AutoModelForSeq2SeqLM.from_pretrained(
    "wisenut-nlp-team/KoT5",
    use_auth_token="<personal read-only token>",
)
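Once loaded, the tokenizer and model can be exercised with a simple encode-generate-decode round trip. The sketch below is illustrative only; the input sentence and the generation settings (max_length, num_beams) are assumptions, not values from this model card.

# Minimal round-trip sketch with the base model loaded above (illustrative only).
text = "한국어 입력 문장 예시입니다."  # any Korean input sentence
inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(**inputs, max_length=64, num_beams=4)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))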

2. Fine-tuned models

Summarization

from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained(
    "wisenut-nlp-team/KoT5",
    revision="summarization",
    use_auth_token="<personal read-only token>",
)
model = AutoModelForSeq2SeqLM.from_pretrained(
    "wisenut-nlp-team/KoT5",
    revision="summarization",
    use_auth_token="<personal read-only token>",
)
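Assuming the tokenizer and model from the summarization revision above are loaded, a summary can be generated as sketched below. The input text, truncation length, and beam settings are assumptions; whether the checkpoint expects an instruction prefix depends on how the revision was fine-tuned and is not documented here.

# Hedged summarization sketch; all generation settings are illustrative assumptions.
article = "요약할 한국어 기사 본문을 여기에 넣습니다."  # Korean article text to summarize
inputs = tokenizer(article, return_tensors="pt", truncation=True, max_length=512)
summary_ids = model.generate(**inputs, max_length=128, num_beams=4, early_stopping=True)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))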

Paraphrase generation

from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained(
    "wisenut-nlp-team/KoT5",
    revision="paraphrase",
    use_auth_token="<personal read-only token>",
)
model = AutoModelForSeq2SeqLM.from_pretrained(
    "wisenut-nlp-team/KoT5",
    revision="paraphrase",
    use_auth_token="<personal read-only token>",
)
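Likewise, with the paraphrase revision loaded, a paraphrase can be sampled as in the sketch below; the example sentence and the sampling parameters (do_sample, top_p, temperature) are illustrative assumptions, not recommended values from the model card.

# Hedged paraphrase-generation sketch; sampling parameters are illustrative assumptions.
sentence = "바꿔 쓸 한국어 문장을 여기에 넣습니다."  # Korean sentence to paraphrase
inputs = tokenizer(sentence, return_tensors="pt")
paraphrase_ids = model.generate(**inputs, max_length=64, do_sample=True, top_p=0.95, temperature=0.8)
print(tokenizer.decode(paraphrase_ids[0], skip_special_tokens=True))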