{"title":"现场表演创作中的紧密耦合代理","authors":"William Marley, Nicholas Ward","doi":"10.1145/2757226.2757255","DOIUrl":null,"url":null,"abstract":"We consider how the application of AI in digital musical instruments might maximally support exploration of sound in performance. Live performance applications of AI and machine learning have tended to focus on score following and the development of machine collaborators. In our work we are interested in exploring the development of systems whereby the human performer interacts with a reactive and creative agent in the creation of a single sonic output. The intention is to design systems that foster exploration and allow for greater (than with acoustic instruments) opportunities for serendipitous musical encounters. An initial approach to the integration of autonomous agency, based on gesture reshaping schemes within the Reactable performance system, is first outlined. We then describe a simple platform based on the non-player characters within Pacman, which serves as a test bed for guiding further discussion on what musical machine collaboration at this level may entail. Pilot studies for both systems are outlined.","PeriodicalId":231794,"journal":{"name":"Proceedings of the 2015 ACM SIGCHI Conference on Creativity and Cognition","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2015-06-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":"{\"title\":\"Tightly Coupled Agents in Live Performance Metacreations\",\"authors\":\"William Marley, Nicholas Ward\",\"doi\":\"10.1145/2757226.2757255\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"We consider how the application of AI in digital musical instruments might maximally support exploration of sound in performance. Live performance applications of AI and machine learning have tended to focus on score following and the development of machine collaborators. 
In our work we are interested in exploring the development of systems whereby the human performer interacts with a reactive and creative agent in the creation of a single sonic output. The intention is to design systems that foster exploration and allow for greater (than with acoustic instruments) opportunities for serendipitous musical encounters. An initial approach to the integration of autonomous agency, based on gesture reshaping schemes within the Reactable performance system, is first outlined. We then describe a simple platform based on the non-player characters within Pacman, which serves as a test bed for guiding further discussion on what musical machine collaboration at this level may entail. Pilot studies for both systems are outlined.\",\"PeriodicalId\":231794,\"journal\":{\"name\":\"Proceedings of the 2015 ACM SIGCHI Conference on Creativity and Cognition\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2015-06-22\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"2\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of the 2015 ACM SIGCHI Conference on Creativity and Cognition\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/2757226.2757255\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 2015 ACM SIGCHI Conference on Creativity and Cognition","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/2757226.2757255","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Tightly Coupled Agents in Live Performance Metacreations
We consider how the application of AI in digital musical instruments might best support the exploration of sound in performance. Live performance applications of AI and machine learning have tended to focus on score following and the development of machine collaborators. In our work we are interested in developing systems in which the human performer interacts with a reactive and creative agent in the creation of a single sonic output. The intention is to design systems that foster exploration and offer greater opportunities for serendipitous musical encounters than acoustic instruments afford. We first outline an initial approach to the integration of autonomous agency, based on gesture-reshaping schemes within the Reactable performance system. We then describe a simple platform based on the non-player characters in Pac-Man, which serves as a test bed for guiding further discussion of what musical machine collaboration at this level may entail. Pilot studies of both systems are outlined.