
zenz-v1 Checkpoints

zenz-v1 is a GPT-2–based language model specialized for kana-kanji conversion. It is intended for use in the neural kana-kanji conversion system "Zenzai."

This repository publishes the checkpoints for zenz-v1.

  • 90M parameters
  • Character-level + byte-level BPE tokenizer
  • High performance in kana-kanji conversion tasks using greedy decoding
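Greedy decoding simply takes the single highest-scoring token at each step, with no sampling or beam search, which keeps conversion fast and deterministic. A minimal sketch of the idea, using a hypothetical bigram score table as a stand-in for the model's logits (zenz-v1 itself would supply real per-token scores):

```python
def greedy_decode(scores_fn, start, max_len, eos):
    """Repeatedly append the highest-scoring next token until EOS or max_len."""
    seq = list(start)
    for _ in range(max_len):
        scores = scores_fn(seq)            # maps candidate token -> score
        best = max(scores, key=scores.get)
        if best == eos:
            break
        seq.append(best)
    return seq

# Hypothetical bigram scores standing in for model logits (illustration only).
TABLE = {
    "<s>": {"今": 1.0, "日": 0.2},
    "今": {"日": 0.9, "は": 0.1},
    "日": {"は": 0.8, "</s>": 0.2},
    "は": {"</s>": 0.7, "今": 0.1},
}

def toy_scores(seq):
    # Score the next token from the last token of the current prefix.
    return TABLE.get(seq[-1], {"</s>": 1.0})

print(greedy_decode(toy_scores, ["<s>"], 10, "</s>"))
# → ['<s>', '今', '日', 'は']
```

With a real checkpoint the same loop would rank the tokenizer's vocabulary by the model's logits at each step; the toy table only illustrates the control flow.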

Model Details

Model Description

The base model used is ku-nlp/gpt2-small-japanese-char provided under CC-BY-SA 4.0.

This model is provided under CC-BY-SA 4.0.

Model Sources

This model is intended for use with Zenzai (AzooKeyKanaKanjiConverter).

Acknowledgements

The following libraries, tools, and language resources were used in constructing this model.
