diff --git a/docs/benchmark/benchmark_pipeline_cer.md b/docs/benchmark/benchmark_pipeline_cer.md
new file mode 100644
index 000000000..9f42c9533
--- /dev/null
+++ b/docs/benchmark/benchmark_pipeline_cer.md
@@ -0,0 +1,203 @@

# Benchmark (ModelScope Pipeline)

## Configuration

### Datasets

[Aishell1](https://www.openslr.org/33/): dev, test

[Aishell2](https://www.aishelltech.com/aishell_2): dev_ios, test_ios, test_android, test_mic

[WenetSpeech](https://github.com/wenet-e2e/WenetSpeech): dev, test_meeting, test_net

### Tools

#### [Install Requirements](https://alibaba-damo-academy.github.io/FunASR/en/installation/installation.html#installation)

Install ModelScope and FunASR via pip:
```shell
pip install -U modelscope funasr
# For users in China, you can install from the mirror:
# pip install -U funasr -i https://mirror.sjtu.edu.cn/pypi/web/simple
```

Or install FunASR from source:
```shell
git clone https://github.com/alibaba/FunASR.git && cd FunASR
pip install -e ./
# For users in China, you can install from the mirror:
# pip install -e ./ -i https://mirror.sjtu.edu.cn/pypi/web/simple
```

#### Recipe

##### [Test CER](https://alibaba-damo-academy.github.io/FunASR/en/modelscope_pipeline/asr_pipeline.html#inference-with-multi-thread-cpus-or-multi-gpus)

Set the `model`, `data_dir`, and `output_dir` in `infer.sh` (see the sketch below), then run:
```shell
cd egs_modelscope/asr/TEMPLATE
bash infer.sh
```
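
A minimal sketch of how those variables might look inside `infer.sh`; the model ID and paths are illustrative assumptions, so substitute the ModelScope model you want to benchmark and your own dataset and result locations:
```shell
# Hypothetical settings inside infer.sh (illustrative only).
# model: ModelScope model ID of the model to benchmark, e.g. Paraformer-large.
model="damo/speech_paraformer-large_asr_nat-zh-cn-16k-common-vocab8404-pytorch"
# data_dir: directory holding the test-set audio list and reference transcripts.
data_dir="./data/aishell1/test"
# output_dir: where recognition results and the CER report are written.
output_dir="./results/aishell1_test"
```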
## Benchmark CER

### Chinese Dataset

CER (%) on the Chinese test sets:

| Model | Offline/Online | Aishell1 dev | Aishell1 test | Aishell2 dev_ios | Aishell2 test_ios | Aishell2 test_android | Aishell2 test_mic | WenetSpeech dev | WenetSpeech test_meeting | WenetSpeech test_net |
|---|---|---|---|---|---|---|---|---|---|---|
| Paraformer-large | Offline | 1.76 | 1.94 | 2.79 | 2.84 | 3.08 | 3.03 | 3.43 | 7.01 | 6.66 |
| Paraformer-large-long | Offline | 1.80 | 2.10 | 2.78 | 2.87 | 3.12 | 3.11 | 3.44 | 13.28 | 7.08 |
| Paraformer-large-contextual | Offline | 1.76 | 2.02 | 2.73 | 2.85 | 2.98 | 2.95 | 3.42 | 7.16 | 6.72 |
| Paraformer | Offline | 3.24 | 3.69 | 4.58 | 4.63 | 4.83 | 4.71 | 4.19 | 8.32 | 9.19 |
| UniASR | Online | 3.34 | 3.99 | 4.62 | 4.52 | 4.77 | 4.73 | 4.51 | 10.63 | 9.70 |
| UniASR-large | Offline | 2.93 | 3.48 | 3.95 | 3.87 | 4.11 | 4.11 | 4.16 | 10.09 | 8.69 |
| Paraformer-aishell | Offline | 4.88 | 5.43 | - | - | - | - | - | - | - |
| ParaformerBert-aishell | Offline | 6.14 | 7.01 | - | - | - | - | - | - | - |
| Paraformer-aishell2 | Offline | - | - | 5.82 | 6.30 | 6.60 | 5.83 | - | - | - |
| ParaformerBert-aishell2 | Offline | - | - | 4.95 | 5.45 | 5.59 | 5.83 | - | - | - |

### English Dataset