{"title":"Adaptation and Integration of GPU-Driven Physics for a Biology Research RIS","authors":"A. Knote, S. Mammen","doi":"10.1109/SEARIS44442.2018.9180233","DOIUrl":"https://doi.org/10.1109/SEARIS44442.2018.9180233","url":null,"abstract":"Developmental biology studies biophysical processes that lead to the development of tissues, organs, and organisms. Like other complex scientific domains, developmental biology can greatly benefit from real-time interactive systems (RIS). In addition to utilizing various innovative RIS technologies, highly efficient domain models have to be provided as well. In this paper, we present a prototypical RIS for developmental biology research that achieves this goal by adapting an existing, GPU-driven, position-based physics engine called FleX to support the required biological interaction mechanisms. The adapted cell model is further augmented by a GPU-based substance diffusion system that simulates biochemical signals that allow cells to communicate and react to their environment. We present model specifics with an emphasis on their efficient integration in an existing game engine, and we elaborate on future improvements.","PeriodicalId":430491,"journal":{"name":"2018 IEEE 11th Workshop on Software Engineering and Architectures for Real-time Interactive Systems (SEARIS)","volume":"16 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125043819","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"VD1: a technical approach to a hybrid 2D and 3D desktop environment","authors":"Matthias Bues, B. Wingert, O. Riedel","doi":"10.1109/SEARIS44442.2018.9180231","DOIUrl":"https://doi.org/10.1109/SEARIS44442.2018.9180231","url":null,"abstract":"As of today, the GUI desktop and immersive virtual reality (VR) are more or less separate worlds. In particular, most VR applications targeted towards large-scale 3D displays or projection systems are standalone solutions built for a particular use case, such as CAD review or ergonomic analysis. Where required, 2D information is integrated into these 3D environments on a casual basis only, which leads to significant constraints on the possible workflows with these applications. A tighter integration of common 2D applications with VR in a shared interaction environment could greatly widen the range of use cases for 3D/VR applications, as users would no longer be required to explicitly switch from a 2D (GUI-like) to a VR user interface and vice versa. Instead, 2D and VR applications would both reside in a single interaction space in which users work with 2D and 3D information and the associated application software in a seamless way. In this paper, we describe the Virtual Desktop One (VD1) system, our concept and implementation of a seamless 2D/3D interaction environment that integrates existing, unmodified GUI-based and 3D/VR applications into a common visualization and interaction space, which supports different interaction devices and modes. We propose a highly flexible and efficient method of VR application integration that requires only minor modifications to existing VR application software. We describe the architecture of VD1, along with relevant implementation aspects. We further describe the evaluation by means of an example VR application integration, and discuss the benchmark and usability results we have obtained so far.","PeriodicalId":430491,"journal":{"name":"2018 IEEE 11th Workshop on Software Engineering and Architectures for Real-time Interactive Systems (SEARIS)","volume":"124 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124066865","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Lightweight Visualization and User Logging for Mobile 360-degree Videos","authors":"Antti Luoto, Pietari Heino, Yu You","doi":"10.1109/SEARIS44442.2018.9180230","DOIUrl":"https://doi.org/10.1109/SEARIS44442.2018.9180230","url":null,"abstract":"360-degree videos are becoming increasingly popular in the mobile domain as well. As the number of viewers grows, it becomes beneficial to track what they are doing. We have 360-degree videos with object detection metadata that we want to visualize on the video. At the same time, we are interested in how users behave when watching videos with this added information. Logging the device orientation is one way to capture this. We present a study of a lightweight method for visualizing information on top of 360-degree videos while logging user behavior. The proposed visualization technique is generic and can be used, for example, to visualize video-content-related metadata or logging results on top of a 360-degree video. We evaluated the work with a proof-of-concept implementation and a performance analysis, which shows that the frame rate starts to decrease beyond roughly 2000 simultaneous visualization objects. A comparison with other existing visualization solutions suggests that our approach is lightweight.","PeriodicalId":430491,"journal":{"name":"2018 IEEE 11th Workshop on Software Engineering and Architectures for Real-time Interactive Systems (SEARIS)","volume":"44 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123018406","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Realtime Interactive Hybrid 2D and 3D Visual Analytics on Large High Resolution Display and Immersive Virtual Environment","authors":"S. Su, Michael An, V. Perry, Michael Chen","doi":"10.1109/SEARIS44442.2018.9180229","DOIUrl":"https://doi.org/10.1109/SEARIS44442.2018.9180229","url":null,"abstract":"We present a data-flow-oriented, scalable, and extensible visualization system for supporting hybrid 2D and 3D visual analytics. Our application allows users to visually analyze the results of a complex multivariate Monte Carlo simulation. The simulation outputs variables describing various properties of 3D objects interacting in a dynamic 3D environment. Our system uses 2D charting tools to visualize the statistical relationships between simulation variables. We developed a Unity application to animate the 3D simulation in a virtual environment, showing the time-varying results of the dynamics in a 3D environment. The Unity application runs on both a 2D high-resolution display system and a fully immersive head-mounted display device. The 2D visualization framework running on our Large High-Resolution Display system supports multiple coordinated views across all the 2D and 3D visualization components. Preliminary results show that our data-centric design provides a user-centric visualization tool that can greatly enhance the analytical process and speed up the derivation of insights from data.","PeriodicalId":430491,"journal":{"name":"2018 IEEE 11th Workshop on Software Engineering and Architectures for Real-time Interactive Systems (SEARIS)","volume":"443 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129527742","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Composite Body-Tracking: Device Abstraction Layer with Data Fusion For Virtual Reality Applications","authors":"Florian Weidner, Luis Alejandro Rojas Vargas, W. Broll","doi":"10.1109/SEARIS44442.2018.9180232","DOIUrl":"https://doi.org/10.1109/SEARIS44442.2018.9180232","url":null,"abstract":"At a time when Virtual Reality has become increasingly popular, many companies have launched new body-tracking hardware. These devices deliver various types of tracking information, such as a simple full-body skeleton (e.g. Microsoft Kinect v2) or a detailed hand skeleton (e.g. Leap Motion). Various device abstraction layers exist, making it easy to connect to such diverse input devices. However, when an application is connected to two or more individual input devices, these devices often provide complementary or redundant, but inconsistent, tracking information. To make the best use of the available data and thereby achieve the best tracking quality, proper fusion of the information is required. Hence, we present the design and architecture of the ALVR system (Abstraction Layer for Virtual Reality). On the one hand, ALVR decouples vendor-specific interfaces from the application. On the other hand, it builds and maintains a data model representing a human skeleton. ALVR may receive data from various tracking devices. It then merges redundant tracking data and integrates unique tracking data into this data model. Thereby, application developers gain access to holistic tracking information. This integrated approach - data fusion with device abstraction - allows application developers to maximize the usage of their tracking devices while optimizing the information provided by them, e.g. for gesture recognition.","PeriodicalId":430491,"journal":{"name":"2018 IEEE 11th Workshop on Software Engineering and Architectures for Real-time Interactive Systems (SEARIS)","volume":"312 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116806246","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}