{"title":"基于线性预测编码的野生动物探测系统声学分类","authors":"L. Grama, Elena Roxana Buhus, C. Rusu","doi":"10.1109/ISSCS.2017.8034944","DOIUrl":null,"url":null,"abstract":"In this work we compare different classification algorithms applied on different number of features (linear predictive coding coefficients) in order to detect audio signals from wildlife areas. The final goal is to find the appropriate number of linear predictive coding coefficients to provide the desired accuracy for a certain framework. The experimental results prove that the best classifier is Logistic Model Trees regardless the number of features, having a constant classification accuracy greater than 95%. In the case of a reduced number of features, both Random Forest and Lazy IBk have good results; the classification accuracy is greater than 98%.","PeriodicalId":338255,"journal":{"name":"2017 International Symposium on Signals, Circuits and Systems (ISSCS)","volume":"28 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2017-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"10","resultStr":"{\"title\":\"Acoustic classification using linear predictive coding for wildlife detection systems\",\"authors\":\"L. Grama, Elena Roxana Buhus, C. Rusu\",\"doi\":\"10.1109/ISSCS.2017.8034944\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In this work we compare different classification algorithms applied on different number of features (linear predictive coding coefficients) in order to detect audio signals from wildlife areas. The final goal is to find the appropriate number of linear predictive coding coefficients to provide the desired accuracy for a certain framework. The experimental results prove that the best classifier is Logistic Model Trees regardless the number of features, having a constant classification accuracy greater than 95%. In the case of a reduced number of features, both Random Forest and Lazy IBk have good results; the classification accuracy is greater than 98%.\",\"PeriodicalId\":338255,\"journal\":{\"name\":\"2017 International Symposium on Signals, Circuits and Systems (ISSCS)\",\"volume\":\"28 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2017-07-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"10\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2017 International Symposium on Signals, Circuits and Systems (ISSCS)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ISSCS.2017.8034944\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2017 International Symposium on Signals, Circuits and Systems (ISSCS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ISSCS.2017.8034944","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 10
Abstract
In this work we compare different classification algorithms applied to different numbers of features (linear predictive coding coefficients) in order to detect audio signals from wildlife areas. The final goal is to find the appropriate number of linear predictive coding coefficients that provides the desired accuracy for a given framework. The experimental results show that the best classifier is Logistic Model Trees regardless of the number of features, with a consistent classification accuracy greater than 95%. With a reduced number of features, both Random Forest and Lazy IBk also perform well; their classification accuracy is greater than 98%.
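For readers who want to reproduce a comparable pipeline, the following Python sketch illustrates the workflow the abstract describes: a small number of LPC coefficients extracted per recording, then standard classifiers compared on those features. The library choices (librosa, scikit-learn), the LPC order, the directory layout, and the use of 1-nearest-neighbour as a stand-in for Weka's Lazy IBk are assumptions for illustration, not the authors' implementation; the paper's best classifier, Logistic Model Trees, has no direct scikit-learn equivalent.

```python
# A minimal sketch, not the authors' code: extract LPC coefficients per clip
# and compare off-the-shelf classifiers on them. File layout, LPC order and
# classifier choices are illustrative assumptions.
import glob
import os

import numpy as np
import librosa
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score


def lpc_features(path, order=12, sr=16000):
    """Load one clip and return its LPC coefficients (dropping the leading 1)."""
    y, _ = librosa.load(path, sr=sr, mono=True)
    a = librosa.lpc(y, order=order)  # length order + 1, with a[0] == 1
    return a[1:]


# Hypothetical dataset layout: labelled clips named <class>_<id>.wav in clips/.
audio_files = sorted(glob.glob("clips/*.wav"))
labels = [os.path.basename(p).split("_")[0] for p in audio_files]

X = np.vstack([lpc_features(p, order=12) for p in audio_files])
y = np.array(labels)

classifiers = [
    ("Random Forest", RandomForestClassifier(n_estimators=100, random_state=0)),
    ("1-NN (IBk-like)", KNeighborsClassifier(n_neighbors=1)),
]
for name, clf in classifiers:
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"{name}: mean cross-validated accuracy = {scores.mean():.3f}")
```

Rerunning the sketch with different values of `order` mirrors the paper's central question, namely how few LPC coefficients still give the desired accuracy for a given classifier.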