Gérard Assayag, Laurent Bonnasse-Gahot, Joakim Borg
Computer Music Journal, published 2024-01-16. DOI: 10.1162/comj_a_00662
Cocreative Interaction: Somax2 and the REACH Project
Somax2 is an AI-based multiagent system for human-machine coimprovisation that generates stylistically coherent streams while continuously listening and adapting to musicians or other agents. The model on which it is based can be used with little configuration to interact with humans in full autonomy, but it also allows fine real-time control of its generative processes and interaction strategies, behaving in that case more like a “smart” digital instrument. An offspring of the Omax system, conceived at the Institut de Recherche et Coordination Acoustique/Musique, the Somax2 environment is part of the European Research Council Raising Cocreativity in Cyber-Human Musicianship (REACH) project, which studies distributed creativity as a general template for symbiotic interaction between humans and digital systems. It fosters mixed musical reality involving cocreative AI agents. The REACH project puts forward the idea that cocreativity in cyber-human systems results from the emergence of complex joint behavior, produced by interaction and featuring cross-learning mechanisms. Somax2 is a first step toward this ideal, and already demonstrates achievements at full scale. This article describes Somax2 extensively, from its theoretical model to its system architecture, through its listening and learning strategies, representation spaces, and interaction policies.
Journal Description:
Computer Music Journal is published quarterly with an annual sound and video anthology containing curated music. For four decades, it has been the leading publication about computer music, concentrating fully on digital sound technology and all musical applications of computers. This makes it an essential resource for musicians, composers, scientists, engineers, computer enthusiasts, and anyone exploring the wonders of computer-generated sound.
Edited by experts in the field and featuring an international advisory board of eminent computer musicians, issues typically include:
In-depth articles on cutting-edge research and developments in technology, methods, and aesthetics of computer music
Reports on products of interest, such as new audio and MIDI software and hardware
Interviews with leading composers of computer music
Announcements of and reports on conferences and courses in the United States and abroad
Publication, event, and recording reviews
Tutorials, letters, and editorials
Numerous graphics, photographs, scores, algorithms, and other illustrations.