
Spaces Demo

Trained with Matcha-TTS (not my work, I just converted it to ONNX) - Github | Paper

How to Infer

See the Github page.
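
Below is a minimal onnxruntime sketch. The input names (x, x_lengths, scales, spks), the output name (wav), and the scale values are assumptions based on the Matcha-TTS ONNX export; verify them against the Github page and the session's get_inputs().

# Minimal sketch, assuming the Matcha-TTS ONNX export interface
# (inputs "x", "x_lengths", "scales", "spks"; output "wav").
# Real token IDs must come from Matcha-TTS text processing; see the Github page.
import numpy as np
import onnxruntime

session = onnxruntime.InferenceSession("vctk_t2_simplify.onnx")
print([i.name for i in session.get_inputs()])  # verify the actual input names

tokens = np.array([[1, 2, 3, 4]], dtype=np.int64)         # placeholder token IDs
inputs = {
    "x": tokens,
    "x_lengths": np.array([tokens.shape[1]], dtype=np.int64),
    "scales": np.array([0.667, 1.0], dtype=np.float32),   # temperature, length scale (assumed order)
    "spks": np.array([0], dtype=np.int64),                 # VCTK speaker ID
}
wav = session.run(None, inputs)[0]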

License

You have to follow the CC-BY-4.0 VCTK license.

Datasets License

  • The VCTK dataset license is CC-BY-4.0

Tools License

These tools do not affect the output license.

  • Matcha-TTS - MIT
  • ONNX Simplifier - Apache2.0
  • onnxruntime - MIT

Converted Model Owner (me)

I release my output under the MIT License. If you want a different license, convert the model yourself.

ONNX File Types

All models are simplified (if you need the original, export it yourself).

Vocoder: hifigan_univ_v1 (helps some English speakers avoid sounding robotic)

  • vctk_univ_simplify.onnx
  • vctk_univ_simplify_q8.onnx - quantized; small enough to host on a Github page, but 3-5 times slower

Vocoder: hifigan_T2_v1 (good for English)

  • vctk_t2_simplify.onnx
  • vctk_t2_simplify_q8.onnx - quantized; small enough to host on a Github page, but 3-5 times slower

How to Convert

Export Model

See Matcha-TTS ONNX export:

python -m matcha.onnx.export matcha_vctk.ckpt vctk_t2.onnx --vocoder-name "hifigan_T2_v1" --vocoder-checkpoint "generator_v1"

Simplify Model

from onnxsim import simplify
import onnx
import argparse

# command-line arguments: input onnx path (required) and optional output path
parser = argparse.ArgumentParser(description="create simplified onnx")
parser.add_argument("--input", "-i", type=str, required=True)
parser.add_argument("--output", "-o", type=str)
args = parser.parse_args()

src_model_path = args.input
if args.output is None:
    # default: append _simplify to the input file name
    dst_model_path = src_model_path.replace(".onnx", "_simplify.onnx")
else:
    dst_model_path = args.output

# load the model, simplify it, and save the result
model = onnx.load(src_model_path)
model_simp, check = simplify(model)
assert check, "simplified ONNX model could not be validated"

onnx.save(model_simp, dst_model_path)
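
Assuming the script above is saved as simplify_onnx.py (the file name is arbitrary), it can be run like this:

python simplify_onnx.py -i vctk_t2.onnx -o vctk_t2_simplify.onnx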

Quantize Model

from onnxruntime.quantization import quantize_dynamic, QuantType
import argparse

# command-line arguments: input onnx path (required) and optional output path
parser = argparse.ArgumentParser(description="create quantized onnx")
parser.add_argument("--input", "-i", type=str, required=True)
parser.add_argument("--output", "-o", type=str)
args = parser.parse_args()

src_model_path = args.input
if args.output is None:
    # default: append _q8 to the input file name
    dst_model_path = src_model_path.replace(".onnx", "_q8.onnx")
else:
    dst_model_path = args.output

# dynamic (weight-only) 8-bit quantization; only QUInt8 works well here
quantize_dynamic(src_model_path, dst_model_path, weight_type=QuantType.QUInt8)
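
Assuming the script above is saved as quantize_onnx.py (the file name is arbitrary), it can be run like this:

python quantize_onnx.py -i vctk_t2_simplify.onnx -o vctk_t2_simplify_q8.onnx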