zenz-v1 Checkpoints

zenz-v1 is a language model based on the GPT-2 architecture and specialized for kana-kanji conversion. It is intended for use in the neural kana-kanji conversion system "Zenzai."

This repository publishes the checkpoints for zenz-v1.

  • 90M parameters
  • Character-level + byte-level BPE tokenizer
  • High performance in kana-kanji conversion tasks using greedy decoding
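
Greedy decoding means picking the single highest-scoring next token at every step, with no beam search or sampling; the point above is that zenz-v1 performs well even under this cheapest strategy. The following toy sketch illustrates the mechanism only: the scorer and its tiny vocabulary are hypothetical stand-ins, not the actual model or tokenizer.

```python
def greedy_decode(score_next, prompt, eos, max_len=10):
    """Repeatedly append the argmax token until EOS or max_len."""
    tokens = list(prompt)
    while len(tokens) < max_len:
        scores = score_next(tokens)          # mapping: token -> score
        best = max(scores, key=scores.get)   # greedy: take the argmax
        if best == eos:
            break
        tokens.append(best)
    return tokens

# A tiny deterministic fake scorer (illustrative only): given the reading
# "かんじ", it strongly prefers the kanji candidate 漢字 over 感じ.
def toy_scorer(tokens):
    table = {
        ("かんじ",): {"漢字": 0.9, "感じ": 0.1, "<eos>": 0.0},
        ("かんじ", "漢字"): {"<eos>": 1.0, "漢字": 0.0, "感じ": 0.0},
    }
    return table[tuple(tokens)]

print(greedy_decode(toy_scorer, ["かんじ"], "<eos>"))  # → ['かんじ', '漢字']
```

With a real causal language model, the same effect is what you get from plain argmax generation (in Hugging Face `transformers`, `generate(do_sample=False, num_beams=1)`).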

Model Details

Model Description

The base model is ku-nlp/gpt2-small-japanese-char, which is provided under CC-BY-SA 4.0.

This model is likewise provided under CC-BY-SA 4.0.

Model Sources

This model is intended for use with Zenzai (AzooKeyKanaKanjiConverter).

Acknowledgements

The following libraries, tools, and language resources were used in constructing this model.
