Streaming automatic speech recognition (ASR) aims to emit each hypothesized word as quickly and accurately as possible. However, emitting fast without degrading quality, as measured by word error rate (WER), is highly challenging. Existing approaches, including Early and Late Penalties and Constrained Alignments, penalize emission delay by manipulating per-token or per-frame probability predictions in sequence transducer models. While successful in reducing delay, these approaches suffer from significant accuracy regression and also require additional word alignment information from an existing model. In this work, we propose a sequence-level emission regularization method, named FastEmit, that applies latency regularization directly to the per-sequence probability when training transducer models and does not require any alignment. We demonstrate that FastEmit is better suited to the sequence-level optimization of transducer models for streaming ASR by applying it to various end-to-end streaming ASR networks, including RNN-Transducer, Transformer-Transducer, ConvNet-Transducer and Conformer-Transducer. We achieve a 150-300 ms latency reduction with significantly better accuracy over previous techniques on a Voice Search test set. FastEmit also improves streaming ASR accuracy from 4.4%/8.9% to 3.1%/7.5% WER, while reducing 90th-percentile latency from 210 ms to only 30 ms on LibriSpeech.
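To make the idea concrete, below is a minimal NumPy sketch of the kind of sequence-level regularization the abstract describes: on a single transducer lattice, the gradient term that rewards emitting the next label is up-weighted by a factor (1 + lambda), while the blank term is left unchanged. The array names (`blank_lp`, `label_lp`), the weight `lam`, and the choice to differentiate with respect to the lattice log-probabilities are assumptions made for illustration; this is not the paper's implementation.

```python
import numpy as np

def fastemit_transducer_grads(blank_lp, label_lp, lam=0.01):
    """Toy sketch of FastEmit-style regularization on one RNN-T lattice.

    blank_lp: (T, U+1) array, log-prob of emitting blank at node (t, u).
    label_lp: (T, U)   array, log-prob of emitting the next reference label
                       y_{u+1} at node (t, u).
    lam:      regularization weight (lambda); lam = 0 recovers the
              standard transducer gradients.
    """
    blank_lp = np.asarray(blank_lp, dtype=float)
    label_lp = np.asarray(label_lp, dtype=float)
    T, U1 = blank_lp.shape
    U = U1 - 1
    NEG_INF = -np.inf

    # Forward variables alpha(t, u): log-prob of all partial alignments
    # that have consumed t frames and emitted the first u labels.
    alpha = np.full((T, U + 1), NEG_INF)
    alpha[0, 0] = 0.0
    for t in range(T):
        for u in range(U + 1):
            if t == 0 and u == 0:
                continue
            from_blank = alpha[t - 1, u] + blank_lp[t - 1, u] if t > 0 else NEG_INF
            from_label = alpha[t, u - 1] + label_lp[t, u - 1] if u > 0 else NEG_INF
            alpha[t, u] = np.logaddexp(from_blank, from_label)

    # Backward variables beta(t, u): log-prob of completing the output
    # from node (t, u), ending with the final blank at (T-1, U).
    beta = np.full((T, U + 1), NEG_INF)
    beta[T - 1, U] = blank_lp[T - 1, U]
    for t in range(T - 1, -1, -1):
        for u in range(U, -1, -1):
            if t == T - 1 and u == U:
                continue
            via_blank = beta[t + 1, u] + blank_lp[t, u] if t < T - 1 else NEG_INF
            via_label = beta[t, u + 1] + label_lp[t, u] if u < U else NEG_INF
            beta[t, u] = np.logaddexp(via_blank, via_label)

    log_p = beta[0, 0]   # total log P(y | x) over all alignments
    nll = -log_p         # standard transducer loss

    # Occupancy-style gradients of the loss w.r.t. the lattice
    # log-probabilities, treated here as free parameters.
    grad_blank = np.zeros_like(blank_lp)
    grad_label = np.zeros_like(label_lp)
    for t in range(T):
        for u in range(U + 1):
            if t < T - 1:
                grad_blank[t, u] = -np.exp(
                    alpha[t, u] + blank_lp[t, u] + beta[t + 1, u] - log_p)
            elif u == U:
                grad_blank[t, u] = -np.exp(
                    alpha[t, u] + blank_lp[t, u] - log_p)
            if u < U:
                # FastEmit-style term: scale the gradient that rewards
                # emitting the next label now (rather than another blank)
                # by (1 + lambda) to encourage earlier emission.
                grad_label[t, u] = -(1.0 + lam) * np.exp(
                    alpha[t, u] + label_lp[t, u] + beta[t, u + 1] - log_p)

    return nll, grad_blank, grad_label
```

With `lam = 0` the sketch reduces to ordinary transducer gradients; a positive `lam` biases training toward emitting the next label earlier instead of consuming another frame with blank, which is the latency/accuracy trade-off the abstract quantifies.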

