## Model Description
Erya4FT is based on Erya and further fine-tuned on our dataset, enhancing its ability to translate Ancient Chinese into Modern Chinese.
## Example
```python
>>> from transformers import BertTokenizer, CPTForConditionalGeneration

>>> tokenizer = BertTokenizer.from_pretrained("RUCAIBox/Erya4FT")
>>> model = CPTForConditionalGeneration.from_pretrained("RUCAIBox/Erya4FT")

>>> inputs = tokenizer("竖子不足与谋。", return_tensors="pt")
>>> inputs.pop("token_type_ids")  # CPT does not accept token type ids
>>> pred_ids = model.generate(max_new_tokens=256, **inputs)
>>> print(tokenizer.batch_decode(pred_ids, skip_special_tokens=True))
['这 小 子 不 值 得 与 他 商 量 。']
```
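
To translate several sentences at once, the same tokenizer and model can be wrapped in a small helper. This is a minimal sketch rather than part of the official example; the `translate` helper name and the second sample sentence are illustrative.

```python
def translate(sentences, max_new_tokens=256):
    # Tokenize a batch of Ancient Chinese sentences with padding.
    inputs = tokenizer(sentences, return_tensors="pt", padding=True)
    # CPT does not use token type ids, so drop them if present.
    inputs.pop("token_type_ids", None)
    pred_ids = model.generate(max_new_tokens=max_new_tokens, **inputs)
    return tokenizer.batch_decode(pred_ids, skip_special_tokens=True)

# Example call (the second sentence is an illustrative addition):
print(translate(["竖子不足与谋。", "学而时习之，不亦说乎。"]))
```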