An Efficient Temporal Model for Small-Footprint Keyword Spotting

Shuo Zhang, Tianhao Zhang, Songlu Chen, Feng Chen, Xu-Cheng Yin
{"title":"An Efficient Temporal Model for Small-Footprint Keyword Spotting","authors":"Shuo Zhang, Tianhao Zhang, Songlu Chen, Feng Chen, Xu-Cheng Yin","doi":"10.1109/IC-NIDC54101.2021.9660544","DOIUrl":null,"url":null,"abstract":"Keyword spotting (KWS), as an essential part of human-computer interaction, is widely used in mobile device terminals. However, the hardware resources of these devices are usually limited, so running on these devices requires a small memory footprint. However, previous works still need massive parameters to achieve high performance. In this work, we propose a context-dependent and compact network for small-footprint KWS. Firstly, to reduce the running time, we apply a sub-sampling technique in which hidden activation values are calculated in a few time steps based on time delay neural network (TDNN). Secondly, to take full advantage of the global context information of the feature maps, we utilize a squeeze-and-excitation block to emphasize the most discriminating area and distinguish the speech and non-speech regions. Finally, we conduct extensive experiments with the publicly available Google Speech Commands dataset and the private Biaobei Chinese Speech Commands dataset. The experimental results on the public dataset verify that the classification error rate of our method reaches 3.56% with only 11K parameters and 322K multiplications, which achieves state-of-the-art performance with the fewest parameters and multiplications.","PeriodicalId":264468,"journal":{"name":"2021 7th IEEE International Conference on Network Intelligence and Digital Content (IC-NIDC)","volume":"2016 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-11-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 7th IEEE International Conference on Network Intelligence and Digital Content (IC-NIDC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IC-NIDC54101.2021.9660544","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 1

Abstract

Keyword spotting (KWS), an essential part of human-computer interaction, is widely used on mobile devices. However, the hardware resources of these devices are usually limited, so models running on them must have a small memory footprint; yet previous works still require a massive number of parameters to achieve high performance. In this work, we propose a context-dependent and compact network for small-footprint KWS. First, to reduce running time, we apply a sub-sampling technique in which hidden activations are computed at only a few time steps, based on a time delay neural network (TDNN). Second, to take full advantage of the global context information in the feature maps, we use a squeeze-and-excitation block to emphasize the most discriminative regions and to distinguish speech from non-speech regions. Finally, we conduct extensive experiments on the publicly available Google Speech Commands dataset and the private Biaobei Chinese Speech Commands dataset. The experimental results on the public dataset show that our method reaches a classification error rate of 3.56% with only 11K parameters and 322K multiplications, achieving state-of-the-art performance with the fewest parameters and multiplications.
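The two ideas in the abstract can be illustrated with a minimal PyTorch sketch: a TDNN layer realized as a strided dilated 1-D convolution, so hidden activations are computed only at a subset of time steps, followed by a squeeze-and-excitation block that re-weights channels using global temporal context. This is not the authors' implementation; layer sizes, kernel widths, dilation, and the sub-sampling stride below are illustrative assumptions rather than values from the paper.

```python
# Minimal sketch (assumed PyTorch) of sub-sampled TDNN + squeeze-and-excitation.
import torch
import torch.nn as nn


class SubsampledTDNNLayer(nn.Module):
    """TDNN layer as a dilated 1-D convolution with stride > 1, so hidden
    activations are evaluated only at a few time steps (sub-sampling)."""

    def __init__(self, in_dim, out_dim, context=3, dilation=1, stride=2):
        super().__init__()
        self.conv = nn.Conv1d(in_dim, out_dim, kernel_size=context,
                              dilation=dilation, stride=stride)
        self.act = nn.ReLU()

    def forward(self, x):  # x: (batch, feat_dim, time)
        return self.act(self.conv(x))


class SqueezeExcite(nn.Module):
    """Squeeze-and-excitation block: average-pool over time ("squeeze"),
    pass through a small bottleneck MLP ("excitation"), then re-scale each
    channel of the feature map with the resulting weights."""

    def __init__(self, channels, reduction=4):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):  # x: (batch, channels, time)
        w = self.fc(x.mean(dim=2))      # squeeze over the time axis
        return x * w.unsqueeze(-1)      # channel-wise re-weighting


# Toy usage: 40-dim filterbank features over 100 frames (hypothetical input).
feats = torch.randn(8, 40, 100)
tdnn = SubsampledTDNNLayer(40, 64, context=3, dilation=2, stride=2)
se = SqueezeExcite(64)
out = se(tdnn(feats))
print(out.shape)  # torch.Size([8, 64, 48]) with these illustrative settings
```

With stride 2, the convolution only evaluates every other output frame, which roughly halves the per-layer multiplications; the SE block adds only two small linear layers, so the global-context re-weighting stays cheap, consistent with the small-footprint goal described above.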