
# Inverse Text Normalization (ITN)

**Note**: The ModelScope pipeline supports inference with all of the models in the model zoo. Here we take the Japanese ITN model as an example to demonstrate the usage.

## Inference

### Quick start

#### Japanese ITN model

```python
from modelscope.pipelines import pipeline
from modelscope.utils.constant import Tasks

itn_inference_pipeline = pipeline(
    task=Tasks.inverse_text_processing,
    model='damo/speech_inverse_text_processing_fun-text-processing-itn-ja',
    model_revision=None)

itn_result = itn_inference_pipeline(text_in='百二十三')
print(itn_result)
```
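For the input above, the normalized result is `123` (百二十三). Note that ModelScope pipelines return the result as a dictionary containing the normalized text, so expect a dict rather than a bare string.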
- Read text data directly:

```python
# plain text input
rec_result = itn_inference_pipeline(text_in='一九九九年に誕生した同商品にちなみ、約三十年前、二十四歳の頃の幸四郎の写真を公開。')
# or a URL pointing to a text file, one sentence per line
rec_result = itn_inference_pipeline(text_in='https://isv-data.oss-cn-hangzhou.aliyuncs.com/ics/MaaS/ASR/test_text/ja_itn_example.txt')
```
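A local text file can be processed the same way from Python. A minimal sketch, assuming `ja_itn_example.txt` is a local copy of the example file above and that the pipeline was created as in the quick start:

```python
# Sketch: apply the ITN pipeline to each non-empty line of a local text file.
with open('ja_itn_example.txt', encoding='utf-8') as f:
    for line in f:
        line = line.strip()
        if line:
            print(itn_inference_pipeline(text_in=line))
```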

For the full code of the demo, please refer to the demo.

## Modify Your Own ITN Model

The rule-based ITN code is open-sourced in FunTextProcessing, so users can modify the grammar rules on their own, as sketched below. After modifying the rules, users can export their own ITN models to a local directory.
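FunTextProcessing grammars are weighted finite-state transducer (WFST) rewrite rules written with pynini, following the NeMo text-processing framework it derives from. The toy rule below is purely illustrative (it is not taken from FunTextProcessing); it only shows the flavor of such a rewrite grammar:

```python
import pynini
from pynini.lib import rewrite

# Toy grammar: map a few Japanese digit characters to ASCII digits.
digit = pynini.union(
    pynini.cross('一', '1'),
    pynini.cross('二', '2'),
    pynini.cross('三', '3'),
)
# Accept one or more digits in a row.
rule = pynini.closure(digit, 1).optimize()

print(rewrite.one_top_rewrite('一二三', rule))  # -> 123
```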

## Export ITN Model

Use the code in FunASR to export the ITN model. An example of exporting an ITN model to a local folder is shown below.

```shell
cd fun_text_processing/inverse_text_normalization/
python export_models.py --language ja --export_dir ./itn_models/
```
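If you prefer calling the exported model from Python instead of the CLI, FunTextProcessing is derived from NeMo's text-processing toolkit and appears to keep a NeMo-style `InverseNormalizer` class. The import path and constructor arguments below are assumptions based on that lineage; verify them against the FunTextProcessing source:

```python
# Assumed NeMo-style API; check the FunTextProcessing source if this differs.
from fun_text_processing.inverse_text_normalization.inverse_normalize import InverseNormalizer

# Load the grammars exported to ./itn_models/ above.
normalizer = InverseNormalizer(lang='ja', cache_dir='./itn_models/')
print(normalizer.inverse_normalize('百二十三', verbose=False))  # expected: 123
```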

## Evaluate ITN Model

Users can evaluate their own ITN model in a local directory. Here is an example:

```shell
python fun_text_processing/inverse_text_normalization/inverse_normalize.py --input_file ja_itn_example.txt --cache_dir ./itn_models/ --output_file output.txt --language=ja
```
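To score the evaluation output, one simple option is sentence-level exact match against a reference file. A minimal sketch, assuming `output.txt` comes from the command above and a hypothetical reference file `ja_itn_reference.txt` holds one normalized sentence per line:

```python
# Compare ITN hypotheses with references line by line; file names are illustrative.
with open('output.txt', encoding='utf-8') as hyp_f, \
        open('ja_itn_reference.txt', encoding='utf-8') as ref_f:
    hyps = [line.strip() for line in hyp_f]
    refs = [line.strip() for line in ref_f]

correct = sum(h == r for h, r in zip(hyps, refs))
print(f'sentence accuracy: {correct / len(refs):.2%}')
```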

## API-reference

### Define pipeline

- `task`: `Tasks.inverse_text_processing`
- `model`: model name in the model zoo, or model path on local disk
- `output_dir`: `None` (default), the output path of results if set
- `model_revision`: `None` (default), sets the model version
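Putting the parameters above together in one call (the `output_dir` value is illustrative):

```python
from modelscope.pipelines import pipeline
from modelscope.utils.constant import Tasks

itn_inference_pipeline = pipeline(
    task=Tasks.inverse_text_processing,
    model='damo/speech_inverse_text_processing_fun-text-processing-itn-ja',
    output_dir='./results/',   # results are written here when set
    model_revision=None)       # pin a specific model version if needed
```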

### Infer pipeline

- `text_in`: the input to decode; either raw text or a URL pointing to a text file, as shown in the examples above