{"title":"普及超媒体环境中的主动数字伙伴","authors":"Kimberly García, S. Mayer, A. Ricci, A. Ciortea","doi":"10.1109/CIC50333.2020.00017","DOIUrl":null,"url":null,"abstract":"Artificial companions and digital assistants have been investigated for several decades, from research in the autonomous agents and social robots areas to the highly popular voice-enabled digital assistants that are already in widespread use (e.g., Siri and Alexa). Although these companions provide valuable information and services to people, they remain reactive entities that operate in isolated environments waiting to be asked for help. The Web is now emerging as a uniform hypermedia fabric that interconnects everything (e.g., devices, physical objects, abstract concepts, digital services), thereby enabling unprecedented levels of automation and comfort in our professional and private lives. However, this also results in increasingly complex environments that are becoming unintelligible to everyday users. To ameliorate this situation, we envision proactive Digital Companions that take advantage of this new generation of pervasive hypermedia environments to provide assistance and protection to people. In addition to Digital Companions perceiving a person's environment through vision and sound, pervasive hypermedia environments provide them with means to further contextualize the situation by exploiting information from available connected devices, and give them access to rich knowledge bases that allow to derive relevant actions and recommendations.","PeriodicalId":265435,"journal":{"name":"2020 IEEE 6th International Conference on Collaboration and Internet Computing (CIC)","volume":"33 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Proactive Digital Companions in Pervasive Hypermedia Environments\",\"authors\":\"Kimberly García, S. Mayer, A. Ricci, A. Ciortea\",\"doi\":\"10.1109/CIC50333.2020.00017\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Artificial companions and digital assistants have been investigated for several decades, from research in the autonomous agents and social robots areas to the highly popular voice-enabled digital assistants that are already in widespread use (e.g., Siri and Alexa). Although these companions provide valuable information and services to people, they remain reactive entities that operate in isolated environments waiting to be asked for help. The Web is now emerging as a uniform hypermedia fabric that interconnects everything (e.g., devices, physical objects, abstract concepts, digital services), thereby enabling unprecedented levels of automation and comfort in our professional and private lives. However, this also results in increasingly complex environments that are becoming unintelligible to everyday users. To ameliorate this situation, we envision proactive Digital Companions that take advantage of this new generation of pervasive hypermedia environments to provide assistance and protection to people. 
In addition to Digital Companions perceiving a person's environment through vision and sound, pervasive hypermedia environments provide them with means to further contextualize the situation by exploiting information from available connected devices, and give them access to rich knowledge bases that allow to derive relevant actions and recommendations.\",\"PeriodicalId\":265435,\"journal\":{\"name\":\"2020 IEEE 6th International Conference on Collaboration and Internet Computing (CIC)\",\"volume\":\"33 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2020-12-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2020 IEEE 6th International Conference on Collaboration and Internet Computing (CIC)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/CIC50333.2020.00017\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2020 IEEE 6th International Conference on Collaboration and Internet Computing (CIC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CIC50333.2020.00017","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Artificial companions and digital assistants have been investigated for several decades, from research on autonomous agents and social robots to the highly popular voice-enabled digital assistants that are already in widespread use (e.g., Siri and Alexa). Although these companions provide valuable information and services to people, they remain reactive entities that operate in isolated environments, waiting to be asked for help. The Web is now emerging as a uniform hypermedia fabric that interconnects everything (e.g., devices, physical objects, abstract concepts, digital services), thereby enabling unprecedented levels of automation and comfort in our professional and private lives. However, this also results in increasingly complex environments that are becoming unintelligible to everyday users. To ameliorate this situation, we envision proactive Digital Companions that take advantage of this new generation of pervasive hypermedia environments to provide assistance and protection to people. In addition to Digital Companions perceiving a person's environment through vision and sound, pervasive hypermedia environments provide them with means to further contextualize the situation by exploiting information from available connected devices, and give them access to rich knowledge bases that allow them to derive relevant actions and recommendations.
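
To make the final point more concrete, below is a minimal sketch (not taken from the paper) of how a proactive Digital Companion might contextualize a situation by combining readings from connected devices with a small knowledge base of rules to derive a recommendation. All device names, property names, and rules here are hypothetical; in an actual pervasive hypermedia environment the companion would discover devices and read their properties by following hypermedia links (e.g., W3C Web of Things Thing Descriptions) rather than using the hard-coded readings shown here.

```python
"""Illustrative sketch of a proactive Digital Companion.

The companion gathers context from connected devices and matches it against
a toy knowledge base of rules to derive proactive recommendations. Device
names, properties, and rules are invented for illustration only.
"""

from dataclasses import dataclass
from typing import Callable, Optional


@dataclass
class DeviceReading:
    """A property value obtained from a device in the environment."""
    device: str
    prop: str
    value: float


def discover_readings() -> list[DeviceReading]:
    # In a real pervasive hypermedia environment the companion would follow
    # links exposed by the environment (e.g., Thing Descriptions) and read
    # device properties over the network; static values keep this runnable.
    return [
        DeviceReading("kitchen-stove", "surface_temperature_celsius", 240.0),
        DeviceReading("living-room-presence", "person_detected", 0.0),
    ]


# A toy "knowledge base": each rule maps the observed context to either a
# recommendation string or None if the rule does not apply.
Rule = Callable[[dict[str, float]], Optional[str]]

RULES: list[Rule] = [
    lambda ctx: (
        "The stove is hot but nobody seems to be nearby: suggest turning it off."
        if ctx.get("kitchen-stove/surface_temperature_celsius", 0.0) > 150.0
        and ctx.get("living-room-presence/person_detected", 1.0) == 0.0
        else None
    ),
]


def recommend(readings: list[DeviceReading]) -> list[str]:
    """Derive proactive recommendations from the current device context."""
    context = {f"{r.device}/{r.prop}": r.value for r in readings}
    return [advice for rule in RULES if (advice := rule(context)) is not None]


if __name__ == "__main__":
    for advice in recommend(discover_readings()):
        print(advice)
```

Running the sketch prints the single recommendation produced by the hot-stove rule; the point is only to show how device-derived context and a knowledge base can together drive proactive behavior, as the abstract describes.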