{"title":"Experiment Assisting System with Local Augmented Body (EASY-LAB) for Subject Experiments under the COVID-19 Pandemic","authors":"Yukiko Iwasaki, Joi Oh, T. Handa, Ahmed A. Sereidi, Vitvasin Vimolmongkolporn, F. Kato, H. Iwata","doi":"10.1145/3450550.3465345","DOIUrl":"https://doi.org/10.1145/3450550.3465345","url":null,"abstract":"Because it is challenging to perceive space and objects through a video conferencing system, which communicates using only video and audio, running subject experiments during the COVID-19 pandemic is difficult. We propose EASY-LAB, a system that allows an experimenter to actively observe and physically interact with a subject from a remote location. The system displays, on an HMD worn by the experimenter, the image from a camera mounted on the end of a small 6-DOF robot arm, allowing observation from an easy-to-see perspective. The experimenter can also instruct the subject using a second robot arm equipped with a laser pointer. The robots' joint angles are calculated by inverse kinematics from the experimenter's head movements and then reflected in the actual robots. The Photon Unity Networking component is used to synchronize state with the remote location. The devices are affordable, effortless to set up, and can be delivered to the subject's home. The proposed system was evaluated with four subjects. As a preliminary result, the mean pointing error was 1.1 cm, and the operation time was reduced by 60% compared with a conventional video conferencing system. These results indicate EASY-LAB's capability, at least in tasks that require pointing and observation from various angles. A statistical study with more subjects will be conducted in follow-up work.","PeriodicalId":286424,"journal":{"name":"ACM SIGGRAPH 2021 Emerging Technologies","volume":"27 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-08-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130834379","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
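The EASY-LAB abstract describes computing the robot's joint angles by inverse kinematics from the experimenter's head movements. As an illustrative sketch only (the paper's arm is 6-DOF and its solver is not published), here is the standard closed-form IK for a planar two-link arm; the function name and link parameters are hypothetical:

```python
import math

def two_link_ik(x, y, l1, l2):
    """Closed-form IK for a planar 2-link arm reaching point (x, y).

    Returns (shoulder, elbow) joint angles in radians, or None if the
    target is out of reach. Illustrative only; not the EASY-LAB solver.
    """
    d2 = x * x + y * y
    # Law of cosines gives the elbow angle.
    c2 = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if not -1.0 <= c2 <= 1.0:
        return None  # target unreachable
    elbow = math.acos(c2)  # elbow-down solution
    shoulder = math.atan2(y, x) - math.atan2(l2 * math.sin(elbow),
                                             l1 + l2 * math.cos(elbow))
    return shoulder, elbow
```

A real 6-DOF arm needs a numerical or analytical solver for the full pose, but the reachability check and law-of-cosines step carry over.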
{"title":"Health Greeter Kiosk: Tech-Enabled Signage to Encourage Face Mask Use and Social Distancing","authors":"Max Hudnell, S. King","doi":"10.1145/3450550.3465339","DOIUrl":"https://doi.org/10.1145/3450550.3465339","url":null,"abstract":"COVID-19 has caused a global health crisis over the last year. High transmission rates threaten to cause waves of infections that could overwhelm hospitals, leaving infected individuals without treatment. The World Health Organization (WHO) endorses two primary preventative measures for reducing transmission: wearing face masks and maintaining social distance [World Health Organization 2021]. To increase adherence to these measures, we designed the Health Greeter Kiosk, a form of digital signage. Traditional physical signage has been used throughout the pandemic to communicate COVID-19 mandates, but it lacks engagement and can easily go unnoticed. We designed this kiosk to reinforce COVID-19 prevention mandates while addressing the need for engagement. Our kiosk encourages engagement by providing visual feedback based on analysis from its computer vision software, which performs real-time face mask and social distance detection on a low-budget computer without the need for a GPU. The kiosk also collects statistics relevant to the WHO measures, which can be used to develop well-informed reopening strategies.","PeriodicalId":286424,"journal":{"name":"ACM SIGGRAPH 2021 Emerging Technologies","volume":"62 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-08-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129926465","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
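The kiosk abstract mentions real-time social distance detection. Once person detections are projected to ground-plane coordinates, the distancing check itself reduces to a pairwise distance test. A minimal sketch, assuming metre-scale (x, y) positions and a 2 m threshold (the function name and calibration step are assumptions, not the authors' code):

```python
import itertools

def distancing_violations(positions, min_dist=2.0):
    """Count pairs of detected people closer than min_dist (metres).

    `positions` holds ground-plane (x, y) coordinates, e.g. obtained by
    projecting per-person detections through a calibrated camera
    homography. Illustrative sketch; the kiosk's pipeline is not public.
    """
    violations = 0
    for (x1, y1), (x2, y2) in itertools.combinations(positions, 2):
        if ((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5 < min_dist:
            violations += 1
    return violations
```

The O(n²) pairwise loop is fine for the handful of people visible to a kiosk camera, which is consistent with running on a low-budget computer without a GPU.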
{"title":"Gaming at Warp Speed: Improving Aiming with Late Warp","authors":"Ben Boudaoud, Pyarelal Knowles, Joohwan Kim, J. Spjut","doi":"10.1145/3450550.3465347","DOIUrl":"https://doi.org/10.1145/3450550.3465347","url":null,"abstract":"Latency can make all the difference in competitive online games. Late warp is a class of techniques used in VR that can reduce latency in FPS games as well. Prior work has demonstrated these techniques can recover most of the player performance lost to computer or network latency. Inspired by work demonstrating the usefulness of late warp as a potential solution to FPS latency, we provide an interactive demonstration, playable in a web browser, that shows how much latency limits aiming performance, and how late warp can help.","PeriodicalId":286424,"journal":{"name":"ACM SIGGRAPH 2021 Emerging Technologies","volume":"23 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-08-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125686419","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
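The late-warp abstract describes re-projecting an already-rendered frame using the freshest input. The core of a purely rotational warp can be shown in one dimension: a sample's screen position is converted to a view angle, the post-render yaw delta is subtracted, and the angle is mapped back to screen space. A hedged sketch (function name and 1-D simplification are mine, not the demo's implementation):

```python
import math

def late_warp_x(x_ndc, d_yaw, h_fov):
    """Rotationally warp a sample's horizontal NDC coordinate by the yaw
    delta (`d_yaw`, radians) that arrived after the frame was rendered.

    `h_fov` is the horizontal field of view in radians. Illustrative 1-D
    sketch of a rotational late warp under a pinhole projection.
    """
    half = math.tan(h_fov / 2.0)
    angle = math.atan(x_ndc * half)        # view-space angle of the sample
    return math.tan(angle - d_yaw) / half  # re-projected NDC coordinate
```

Because the warp uses only a rotation, it needs no scene geometry, which is what makes it cheap enough to apply just before scan-out.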
{"title":"SwarmPlay: A Swarm of Nano-Quadcopters Playing Tic-tac-toe Board Game against a Human","authors":"E. Karmanova, Valerii Serpiva, S. Perminov, R. Ibrahimov, A. Fedoseev, D. Tsetserukou","doi":"10.1145/3450550.3465346","DOIUrl":"https://doi.org/10.1145/3450550.3465346","url":null,"abstract":"We present a new paradigm of games, SwarmPlay, in which each playing piece is represented by an individual drone with its own mobility, and the swarm's collective intelligence plays to win against a human player. The motivation behind the research is to make games with machines tangible and interactive. Although some research on robotic players for board games, e.g., chess, already exists, SwarmPlay has the potential to offer much more engagement and interaction, as it proposes a multi-agent swarm instead of a single interactive robot. The proposed system consists of a robotic swarm, a workstation, a computer vision (CV) system, and game-theory-based algorithms. A novel game algorithm was developed to provide a natural game experience. A preliminary user study revealed that participants were highly engaged in the game with drones (69% gave the maximum score on a Likert scale) and found it less artificial than regular computer-based systems (77% gave the maximum score). The effect of the game's outcome on the user's perception of it was analyzed and discussed. The study suggests that SwarmPlay could be extended to a wider range of games, significantly improving human-drone interactivity.","PeriodicalId":286424,"journal":{"name":"ACM SIGGRAPH 2021 Emerging Technologies","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129004338","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
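The SwarmPlay abstract mentions game-theory-based algorithms for tic-tac-toe. The classic game-tree approach for this game is minimax; the sketch below is a generic minimax move picker, not the authors' algorithm (their system additionally coordinates drone motion):

```python
def best_move(board, player):
    """Pick an optimal tic-tac-toe move for `player` ('X' or 'O') via minimax.

    `board` is a list of 9 cells containing 'X', 'O', or None.
    Generic game-tree sketch, not the SwarmPlay implementation.
    """
    def winner(b):
        lines = [(0, 1, 2), (3, 4, 5), (6, 7, 8), (0, 3, 6),
                 (1, 4, 7), (2, 5, 8), (0, 4, 8), (2, 4, 6)]
        for i, j, k in lines:
            if b[i] and b[i] == b[j] == b[k]:
                return b[i]
        return None

    def minimax(b, turn):
        w = winner(b)
        if w:
            return (1 if w == player else -1), None
        free = [i for i, c in enumerate(b) if c is None]
        if not free:
            return 0, None  # draw
        scores = []
        for i in free:
            b[i] = turn
            s, _ = minimax(b, 'O' if turn == 'X' else 'X')
            b[i] = None
            scores.append((s, i))
        # Maximize on our own turn, minimize on the opponent's.
        return (max if turn == player else min)(scores)

    return minimax(list(board), player)[1]
```

Full minimax is tractable here because tic-tac-toe's game tree is tiny; larger games would need pruning or heuristics.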
{"title":"DronePaint: Swarm Light Painting with DNN-based Gesture Recognition","authors":"Valerii Serpiva, E. Karmanova, A. Fedoseev, S. Perminov, D. Tsetserukou","doi":"10.1145/3450550.3465349","DOIUrl":"https://doi.org/10.1145/3450550.3465349","url":null,"abstract":"We propose a novel human-swarm interaction system that allows the user to directly control a swarm of drones in a complex environment by drawing trajectories with a hand-gesture interface built on DNN-based gesture recognition. The CV-based system lets the user control swarm behavior in real time through gestures and motions, without additional devices, providing convenient tools to change the swarm's shape and formation. Two types of interaction were proposed and implemented to adjust the swarm hierarchy: trajectory drawing and free-form trajectory generation control. Experimental results revealed high gesture recognition accuracy (99.75%), allowing the user to achieve relatively high trajectory-drawing precision (mean error of 5.6 cm, compared with 3.1 cm for mouse drawing) over the three evaluated trajectory patterns. The proposed system could be applied to complex environment exploration, spray painting with drones, and interactive drone shows, allowing users to create their own art objects with drone swarms.","PeriodicalId":286424,"journal":{"name":"ACM SIGGRAPH 2021 Emerging Technologies","volume":"500 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-07-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124449926","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
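The DronePaint abstract reports centimetre-level trajectory-drawing error, which implies some pre-processing of the noisy hand-tracked path before it is handed to the drones. A common, minimal pre-processing step is moving-average smoothing of the sampled 2-D trajectory; the sketch below is a generic illustration (the function name and window size are assumptions, not the paper's pipeline):

```python
def smooth_trajectory(points, window=5):
    """Moving-average smoothing of a hand-drawn 2-D trajectory.

    `points` is a list of (x, y) samples from a gesture interface.
    Generic pre-processing sketch; DronePaint's actual pipeline
    (DNN gesture recognition plus trajectory generation) is richer.
    """
    smoothed = []
    n = len(points)
    for i in range(n):
        # Centered window, clipped at the ends of the trajectory.
        lo = max(0, i - window // 2)
        hi = min(n, i + window // 2 + 1)
        xs = [p[0] for p in points[lo:hi]]
        ys = [p[1] for p in points[lo:hi]]
        smoothed.append((sum(xs) / len(xs), sum(ys) / len(ys)))
    return smoothed
```

Smoothing trades a little spatial fidelity for paths the drones can follow without jitter, one plausible contributor to the gap between hand-drawn (5.6 cm) and mouse-drawn (3.1 cm) error.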