Kahyun Choi, Jin Ha Lee, J. S. Downie
Proceedings of the ACM/IEEE Joint Conference on Digital Libraries (JCDL 2014), September 8, 2014, pp. 453-454
DOI: 10.1109/JCDL.2014.6970221
What is this song about anyway?: Automatic classification of subject using user interpretations and lyrics
Metadata research for music digital libraries has traditionally focused on genre. Despite its potential to improve users' ability to search and browse music collections, music subject metadata remains an unexplored area. The objective of this study is to expand the scope of music metadata research, in particular by exploring music subject classification based on user interpretations of music. Furthermore, we compare this previously unexplored form of user data to lyrics on subject prediction tasks. In our experiment, we use datasets consisting of 900 songs annotated with user interpretations. To determine the significance of the performance differences between the two sources, we applied Friedman's ANOVA test to the classification accuracies. The results show that user-generated interpretations are significantly more useful than lyrics as classification features (p < 0.05). The findings support the possibility of exploiting various existing sources for subject metadata enrichment in music digital libraries.
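The significance-testing step described in the abstract can be sketched as follows. This is a minimal, self-contained illustration, not the paper's actual analysis: the fold-level accuracy numbers and the three feature conditions (lyrics, interpretations, and a hypothetical combined condition) are invented for the example, and the Friedman chi-square statistic is computed by hand rather than with a statistics library. Note the Friedman test ranks k related samples per fold, so at least three conditions are assumed here.

```python
# Sketch of a Friedman test over per-fold classification accuracies.
# All accuracy values below are made up for illustration only.

def friedman_statistic(*samples):
    """Friedman chi-square over k related samples, each of length n."""
    k = len(samples)
    n = len(samples[0])
    rank_sums = [0.0] * k
    for fold in zip(*samples):
        # Rank the k scores within this fold (1 = lowest), averaging ties.
        order = sorted(range(k), key=lambda idx: fold[idx])
        ranks = [0.0] * k
        i = 0
        while i < k:
            j = i
            while j + 1 < k and fold[order[j + 1]] == fold[order[i]]:
                j += 1
            avg = (i + j + 2) / 2  # mean of the 1-based ranks i+1 .. j+1
            for m in range(i, j + 1):
                ranks[order[m]] = avg
            i = j + 1
        for c, r in enumerate(ranks):
            rank_sums[c] += r
    # Standard Friedman formula: 12/(n k (k+1)) * sum(R_j^2) - 3 n (k+1)
    return (12.0 / (n * k * (k + 1))) * sum(r * r for r in rank_sums) \
        - 3 * n * (k + 1)

# Hypothetical 10-fold accuracies for three feature conditions.
lyrics = [0.42, 0.45, 0.40, 0.44, 0.43, 0.41, 0.46, 0.44, 0.42, 0.45]
interp = [0.55, 0.58, 0.54, 0.57, 0.56, 0.55, 0.59, 0.57, 0.56, 0.58]
both   = [0.56, 0.57, 0.55, 0.58, 0.57, 0.56, 0.58, 0.58, 0.57, 0.59]

stat = friedman_statistic(lyrics, interp, both)
print(f"Friedman chi-square = {stat:.2f}")
# With k = 3 conditions, df = k - 1 = 2; the critical value at p = 0.05
# is 5.991, so stat > 5.991 means the conditions differ significantly.
print("significant at p < 0.05" if stat > 5.991 else "not significant")
```

A statistic above the critical value only says that at least one condition differs; identifying which pairs differ would require a post-hoc test (e.g., pairwise Wilcoxon signed-rank with correction), which the sketch above omits.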