Authors: G. Mazzola, F. Thalmann
DOI: 10.1609/aiide.v9i5.12659
Venue: Proceedings of the AAAI Conference on Artificial Intelligence and Interactive Digital Entertainment
Publication date: 2021-06-30
Citations: 6
Using the Creative Process for Sound Design Based on Generic Sound Form
Building on recent research in musical creativity and the composition process, this paper presents a specific practical application of our theory and software to sound design. The BigBang rubette module, which brings gestural music composition methods to the Rubato Composer software, was recently generalized to work with any kind of musical and non-musical object. Here, we focus on time-independent sound objects to illustrate several levels of metacreativity. On the one hand, we show a sample process of designing the sound objects themselves by defining appropriate datatypes, which can be done at runtime. On the other hand, we demonstrate how the creative process itself, recorded by the software once the composer starts working with these sound objects, can be used both for improvisation with, and automation of, any defined operations and transformations.
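The abstract mentions designing sound objects by defining appropriate datatypes at runtime. As a minimal sketch of that general idea (not the actual BigBang/Rubato API, which is Java-based; all names and fields here are illustrative assumptions), a time-independent sound object type, with no onset or duration fields, could be created dynamically like this:

```python
# Hypothetical illustration of runtime datatype definition for sound objects.
# Field names ("frequency", "amplitude") are assumptions, not from the paper.
from dataclasses import make_dataclass, asdict

def define_sound_object(name, fields):
    """Create a new sound-object datatype from (field_name, type) pairs at runtime."""
    return make_dataclass(name, fields)

# A time-independent sound object: only spectral parameters, no temporal ones.
Partial = define_sound_object("Partial", [("frequency", float), ("amplitude", float)])

p = Partial(frequency=440.0, amplitude=0.5)
print(asdict(p))  # {'frequency': 440.0, 'amplitude': 0.5}
```

Once such a type exists, operations and transformations defined over its fields (e.g. scaling amplitude) could be recorded and replayed, which is the kind of metacreative workflow the abstract describes.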