Reproducing Musicality: Detecting Musical Objects and Emulating Musicality Through Partial Evolution

Aran V. Samson, A. Coronel
DOI: 10.1109/ICAIIC.2019.8669033
Published in: 2019 International Conference on Artificial Intelligence in Information and Communication (ICAIIC)
Publication date: 2019-02-01
Citations: 1

Abstract

Musicology is a growing focus in computer science. Past research has had success in automatically generating music through learning-based agents [1] that make use of neural networks, and through model- and rule-based approaches [2]. These methods require a significant amount of information, either in the form of a large dataset for learning or a comprehensive set of rules based on musical concepts. This paper explores a model in which a minimal amount of musical information is needed to compose a desired style of music. It makes use of objectness, a concept derived from imagery and pattern recognition, to extract specific musical objects from a single musical piece. These objects then serve as the foundation for a newly generated musical piece that is similar in style to the original. The overall piece is generated through partial evolution. This method eliminates the need for a large amount of pre-provided data and composes music directly from a single source piece.
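The abstract does not detail the authors' algorithm, but the general idea of evolving new material toward the style of a single source piece can be sketched with a minimal, hypothetical evolutionary loop. The sketch below is an illustration, not the paper's method: it treats a "musical object" simply as the distribution of pitch intervals in the source, and evolves random note sequences to match that distribution. All names (`interval_histogram`, `evolve`, the fitness function) are assumptions for illustration only.

```python
import random
from collections import Counter

def interval_histogram(notes):
    """Distribution of successive pitch intervals in a note sequence."""
    return Counter(b - a for a, b in zip(notes, notes[1:]))

def similarity(candidate, source_hist):
    """Fraction of the source's interval mass matched by the candidate."""
    cand_hist = interval_histogram(candidate)
    shared = sum(min(cand_hist[k], source_hist[k]) for k in source_hist)
    return shared / max(1, sum(source_hist.values()))

def evolve(source, generations=200, pop_size=30, seed=42):
    """Evolve random note sequences toward the source's interval style."""
    rng = random.Random(seed)
    target = interval_histogram(source)
    low, high = min(source), max(source)
    # Start from a fully random population of equal-length sequences.
    population = [[rng.randint(low, high) for _ in source]
                  for _ in range(pop_size)]
    for _ in range(generations):
        # Keep the fitter half, then refill with point-mutated copies.
        population.sort(key=lambda c: similarity(c, target), reverse=True)
        survivors = population[: pop_size // 2]
        children = []
        for parent in survivors:
            child = parent[:]
            i = rng.randrange(len(child))
            child[i] = rng.randint(low, high)  # single point mutation
            children.append(child)
        population = survivors + children
    return max(population, key=lambda c: similarity(c, target))

# Source: MIDI pitches of a simple ascending C-major motif.
source = [60, 62, 64, 65, 67, 69, 71, 72]
best = evolve(source)
```

After evolution, `best` is a new sequence whose interval profile approximates the source's, without any external dataset or rule set — loosely mirroring the paper's goal of composing from a single source piece. A real system would use a richer object representation and a more careful fitness function than this interval histogram.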