FunASR/runtime/tools/fst/train_lms.sh
Yabin Li 702ec03ad8
2023-11-07 18:34:29 +08:00

#!/bin/bash
## Make sure that SRILM is installed and ngram-count is on your PATH.
dir=lm
mkdir -p "$dir"
[ -f path.sh ] && . ./path.sh
# Prepare the data. Each line of the text must start with an utterance ID,
# followed by space-separated tokens, e.g.:
# BAC009S0002W0122 而 对 楼市 成交 抑制 作用 最 大 的 限 购
# BAC009S0002W0123 也 成为 地方 政府 的 眼中 钉
corpus=$dir/text
[ -f "$corpus" ] || { echo "$0: missing corpus file $corpus" >&2; exit 1; }
# Generate the LM vocabulary: drop the utterance ID (field 1), lowercase the
# tokens, append the special symbols, and keep one unique token per line.
cat "$corpus" | awk '{for(n=2;n<=NF;n++) print tolower($n); }' | \
  cat - <(echo "<unk>"; echo "<s>"; echo "</s>") | \
  sort | uniq -c | sort -nr | awk '{print $2}' > "$dir/corpus.dict" || exit 1;
# Train a 4-gram LM with Kneser-Ney smoothing, restricted to the vocabulary
# above; first strip the utterance IDs to build the training text.
cat "$corpus" | awk '{for(n=2;n<=NF;n++){ printf tolower($n); if(n<NF) printf " "; else print ""; }}' > "$dir/train"
ngram-count -text "$dir/train" -order 4 -limit-vocab -vocab "$dir/corpus.dict" -unk \
  -kndiscount -interpolate -gt1min 1 -gt2min 1 -gt3min 2 -gt4min 2 -lm "$dir/lm.arpa" || exit 1;
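# Optional sanity check, kept commented out: compute the perplexity of the
# trained model on held-out text with SRILM's `ngram` tool. The file name
# $dir/heldout is hypothetical; it should use the same one-utterance-per-line,
# IDs-stripped format as $dir/train.
# ngram -lm $dir/lm.arpa -order 4 -unk -ppl $dir/heldout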