Cláudio Lemos, Diogo Cocharro, Gilberto Bernardes
Proceedings of the 16th International Audio Mostly Conference, September 2021
DOI: 10.1145/3478384.3478418
Understanding Cross-Genre Rhythmic Audio Compatibility: A Computational Approach
Rhythmic similarity, a fundamental task within Music Information Retrieval, has recently been applied in creative music contexts to retrieve musical audio or guide audio-content transformations. However, there is still very little knowledge of the typical rhythmic similarity values between overlapping musical structures per instrument, genre, and time scale, which we denote as rhythmic compatibility. This research provides the first steps towards an understanding of rhythmic compatibility through the systematic analysis of MedleyDB, a large multi-track musical database composed and performed by artists. We apply computational methods to compare database stems using representative rhythmic similarity metrics – Rhythmic Histogram (RH) and Beat Spectrum (BS) – per genre and instrumental family, and to understand whether RH and BS can discriminate genres at different time scales. Our results suggest that 1) rhythmic compatibility values lie between [.002,.354] (RH) and [.1,.881] (BS), 2) RH outperforms BS in discriminating genres, and 3) different time scales in RH and BS impose significant differences in rhythmic compatibility.
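The core idea of a Rhythmic-Histogram-style comparison can be sketched in a few lines: derive a periodicity (tempo) histogram from each stem's onset-strength envelope, normalize it, and take a distance between the two histograms, where a lower distance indicates higher rhythmic compatibility. The sketch below is a minimal, hypothetical illustration using synthetic impulse-train envelopes and numpy only; the bin count, tempo range, and distance metric are assumptions, and the paper's actual RH and BS implementations may differ.

```python
import numpy as np

def rhythm_histogram(onset_env, sr_env=100.0, bpm_range=(30, 240), n_bins=64):
    """Simplified rhythmic histogram: magnitudes of periodicities in an
    onset-strength envelope, resampled onto a fixed tempo (BPM) grid.
    Parameters here are illustrative, not taken from the paper."""
    spec = np.abs(np.fft.rfft(onset_env - onset_env.mean()))
    bpm = np.fft.rfftfreq(len(onset_env), d=1.0 / sr_env) * 60.0  # Hz -> BPM
    mask = (bpm >= bpm_range[0]) & (bpm <= bpm_range[1])
    grid = np.linspace(bpm_range[0], bpm_range[1], n_bins)
    hist = np.interp(grid, bpm[mask], spec[mask])
    total = hist.sum()
    return hist / total if total > 0 else hist

def rh_distance(env_a, env_b):
    """Euclidean distance between normalized rhythm histograms;
    lower values mean higher rhythmic compatibility."""
    return float(np.linalg.norm(rhythm_histogram(env_a) - rhythm_histogram(env_b)))

# Toy onset envelopes (10 s at 100 frames/s): impulse trains standing in
# for real onset-strength curves extracted from audio stems.
sr_env = 100.0
t = np.arange(int(10 * sr_env))
env_120 = ((t % 50) == 0).astype(float)    # period 0.5 s -> 120 BPM
env_120b = ((t % 50) == 25).astype(float)  # same tempo, shifted phase
env_90 = ((t % 67) == 0).astype(float)     # period ~0.67 s -> ~90 BPM

d_same = rh_distance(env_120, env_120b)  # phase-invariant: near zero
d_diff = rh_distance(env_120, env_90)    # different tempi: clearly larger
```

Because the histogram is built from FFT magnitudes, it is invariant to phase shifts of the envelope, so two stems at the same tempo but different offsets still register as compatible, which matches the intuition behind tempo-based rhythmic similarity.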