{"title":"Structural Equation Modeling for Quantifying Riding Performance of Motorcycle Rider using Real-time Measurable Indexes","authors":"Saya Kishino, Joohyeong Lee, Keisuke Suzuki","doi":"10.5057/ijae.tjske-d-20-00073","DOIUrl":"https://doi.org/10.5057/ijae.tjske-d-20-00073","url":null,"abstract":"Motorcycle riders’ fatality is four times that of four-wheeled vehicle drivers. Previous studies have shown the effect of the Advanced Rider Assistance System (ARAS) is different depending on the user’s driving style. To realize personally optimized ARAS, it needs to keep track of riding performance and emotional state. Most studies define one index as driving performance to control the onset timing of ARAS. In this study, we designed a structural equation model to identify the driving behavior indexes that are directly related to the risk of traffic accidents from the emotional state and driving behavior. We investigated the driving behaviors of 23 test subjects using a riding simulator by inducing various emotional states in different conditions of driving scenery, traffic volume, and music. As a result, this model suggests that arousal level, valence level, carelessness, lateral instability, steering instability, and driving style are related to riding performance.","PeriodicalId":41579,"journal":{"name":"International Journal of Affective Engineering","volume":"1 1","pages":""},"PeriodicalIF":0.3,"publicationDate":"2021-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"70687815","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Emotional Responses Towards Unity YouTube Videos: Experts vs. Viewers Perspectives","authors":"Shamsiah Abd Kadir, A. Lokman, T. Tsuchiya","doi":"10.5057/IJAE.IJAE-D-20-00033","DOIUrl":"https://doi.org/10.5057/IJAE.IJAE-D-20-00033","url":null,"abstract":": YouTube video is one of the most popular social media channels used to attract people’s attention as it has its own “communication power” through the broadcasted videos. With the right approach to create an influential video that embeds emotional elements, YouTube video could be used as a medium to disseminate information that could influence people’s emotion. The study as reported in this paper attempted to understand people’s emotional responses towards videos posted on YouTube and how it could influence people’s unity. The study conducted an in-depth interview with 3 experts, to determine valid specimen for the investigation, and then elaborate the cause and effect of each specimen from their expert point of view. A Thematic Analysis (TA) was then performed to identify the concept of emotion and the video design elements that shaped the classification of emotion from the experts’ perspective. The study then performed a focus group study using the valid specimens with 6 viewers using the Evaluation Grid Method of Laddering (EGML) approach to discover the emotional concept and design elements in the videos from the viewers’ perspective. The results obtained from both sessions were synthesised to find agreements and finally conclude a taxonomy of Kansei Words (KWs) related to the concept of unity, and the video design elements that affect the concept. An in-depth interview with the experts has resulted in 17 valid videos reckoned to embed design elements that foster unity. The TA conducted then has identified 61 KWs and a total of 10 Items and 88 Categories of design elements. Meanwhile, EGML analysis has resulted in a total of 64-items of KWs and 14 Items and 76 Categories of design elements. The study then conducted a confirmatory analysis of both dataset and successfully synthesised 36 KWs, as well as 9 Items and 142 Categories of design elements. The final set of KWs and design elements are from the point of agreement/validation from both experts’ and viewers’ perspectives, and thus matches their implicit image of designs and the influential elements contributing to the implicit images. These become a sound clue and could be referred to as a reciprocal understanding of the concept of video design, which could lead to effective strategies to achieve people’s unity.","PeriodicalId":41579,"journal":{"name":"International Journal of Affective Engineering","volume":"1 1","pages":""},"PeriodicalIF":0.3,"publicationDate":"2021-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"70693413","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Analysis of Impression Evaluation of Fragrances Associated with ‘Omotenashi’ in Elderly People using Perfumes","authors":"Harumi Nakagawa, N. Kuwahara","doi":"10.5057/ijae.ijae-d-20-00016","DOIUrl":"https://doi.org/10.5057/ijae.ijae-d-20-00016","url":null,"abstract":": In this study, we used 6 different fragrances to better understand the tendency of scents to elicit a sense of omotenashi (hospitality) in elderly people. To first understand how elderly people perceive scents associated with feelings of omotenashi we performed an impression evaluation using SD method of 22 adjective pairs. We then analyzed the results taken from a questionnaire survey we conducted on 51 elderly Japanese men and women using a factor analysis, and extracted the factors that represent sensations of scents which produced a sense of omotenashi . Our subsequent examination of the main underlying factors indicated that scents with sensations of freshness, adulthood and pleasure hinted a sense of omotenashi in scents for elderly people. We also noted differences in the way the scent was perceived compared to young people.","PeriodicalId":41579,"journal":{"name":"International Journal of Affective Engineering","volume":"1 1","pages":""},"PeriodicalIF":0.3,"publicationDate":"2020-09-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"42887020","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Effects of Scent and Music on Moods and Stress in the Confined Resting Area","authors":"Chen Zhou, F. Shutoh, H. Komada, T. Yamanaka","doi":"10.5057/ijae.ijae-d-19-00017","DOIUrl":"https://doi.org/10.5057/ijae.ijae-d-19-00017","url":null,"abstract":": Scent and music are acknowledged to generate physiological and psychological changes on people, and the combination of scent and music is exerted in diverse situations. However, effects and utilization of scent and music in confined resting environment have not been examined exhaustively. To investigate the effects of scent and music on people’s moods and stress in confined resting areas, towards a better understanding about the olfactory and auditory environment, we tested six combinations of scent and music. The results indicated that music hardly changed people’s stress evaluation, but had significant effects on moods. On the other hand, scent dramatically affected both moods and stress assessment. Profound comprehension about scent and music will offer further inspiration for the design of olfactory and auditory environment in confined spaces, and new knowledge and perspectives gained through this exploratory study will serve to the thorough and supplementary research in the future.","PeriodicalId":41579,"journal":{"name":"International Journal of Affective Engineering","volume":" ","pages":""},"PeriodicalIF":0.3,"publicationDate":"2020-06-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"43262247","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Font Comparison System Based on Multiple Similarity Metrics","authors":"Shota Takizawa, Taisei Hoshi, Qiu Chen","doi":"10.5057/ijae.ijae-d-19-00012","DOIUrl":"https://doi.org/10.5057/ijae.ijae-d-19-00012","url":null,"abstract":": Digital fonts are widely used in various fields such as documents, webpages, movie subtitles, etc., thus, how to select the best font for the content is an important issue. For the same font, the similarity calculated using different metrics is different. In this paper, we propose a font comparison system that considers different similarities by sorting the similarity of each font. For measuring similarity, we use MSE, PSNR, SSIM, and HaarPSI for image quality assessment, as well as Euclidean distance and cosine similarity in t-SNE to reduce the number of dimensions. We evaluate how the correlation to the font order of each comparison method changes depending on the resolution of the image, the character type, and the comparison method.","PeriodicalId":41579,"journal":{"name":"International Journal of Affective Engineering","volume":" ","pages":""},"PeriodicalIF":0.3,"publicationDate":"2020-02-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"44874827","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Shape Recognition and Center of Mass","authors":"Emika Okumura, T. Yamanaka","doi":"10.5057/ijae.ijae-d-19-00023","DOIUrl":"https://doi.org/10.5057/ijae.ijae-d-19-00023","url":null,"abstract":"","PeriodicalId":41579,"journal":{"name":"International Journal of Affective Engineering","volume":"19 1","pages":"207-216"},"PeriodicalIF":0.3,"publicationDate":"2020-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"70684171","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Potential Factors and Satisfaction of Readers in Social Reading Using Other's Comments","authors":"Kazuya Matsumura, H. Nunokawa, Kiwamu Sato","doi":"10.5057/ijae.ijae-d-19-00013","DOIUrl":"https://doi.org/10.5057/ijae.ijae-d-19-00013","url":null,"abstract":"With the ever-growing “electronic book” (e-book) market, more people are reading e-books. Along with that, social reading, for its social sharing of e-books, has gained more attention; readers may read an e-book while viewing comments on the contents of the e-book. We evaluated the impression of social reading using an experimental system to clarify its effect. Furthermore, we performed factor and customer satisfaction analyses to evaluate the impression. The results indicated that reading while viewing comments is valuable as it gives a sense of excitement and new discoveries. Moreover, we realized that there is a need to improve the method of giving readers satisfaction to “comment writing and enjoyment”.","PeriodicalId":41579,"journal":{"name":"International Journal of Affective Engineering","volume":"19 1","pages":"119-125"},"PeriodicalIF":0.3,"publicationDate":"2020-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"70683917","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Surgical Phase Recognition with Wearable Video Camera for Computer-aided Orthopaedic Surgery-AI Navigation System","authors":"Shoichi Nishio, B. Hossain, M. Nii, N. Yagi, T. Hiranaka, Syoji Kobashi","doi":"10.5057/ijae.ijae-d-19-00018","DOIUrl":"https://doi.org/10.5057/ijae.ijae-d-19-00018","url":null,"abstract":"The procedure of orthopaedic surgery is quite complicated, and many kinds of equipment have been used. Operating room nurses who deliver surgical instruments to surgeon are supposed to be forced to incur a heavy burden. This study aims to offer a computer-aided orthopaedic surgery (CAOS)-AI navigation system, which assists operating room nurses by suggesting the current progress of the procedure and expected surgical instruments. This paper proposes a method for recognizing the current phase of orthopaedic procedures from surgeon-wearable video camera images. The method plays the fundamental role of CAOS-AI navigation system. The proposed method is based on a convolutional-long short-term memory (LSTM) network. We also investigate the efficient CNN model in some competitive models such as VGG16, DenseNet, and ResNet to improve the recognition accuracy. Experimental results in unicomapartmenatal knee arthroplasty (UKA) surgeries showed that the proposed method achieved a phase recognition accuracy with 48.2%, 41.2%, and 53.6% using VGG16, DenseNet, and ResNet, respectively.","PeriodicalId":41579,"journal":{"name":"International Journal of Affective Engineering","volume":"1 1","pages":""},"PeriodicalIF":0.3,"publicationDate":"2020-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"70683942","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Screening of Track Driver’s Sleep Apnea by Objective Measure and Subjective Sense of Sleep Quality","authors":"E. Yuda, Y. Yoshida, J. Hayano","doi":"10.5057/ijae.ijae-d-19-00008","DOIUrl":"https://doi.org/10.5057/ijae.ijae-d-19-00008","url":null,"abstract":"Usefulness of subjective sleep quality assessment by a questionnaire (OSA-MA sleep inventory) was examined in ten track drivers (age, 42 ± 12 y, range, 23-62 y) in reference to the objective measure of sleep apnea by cyclic variation of heart rate (CVHR) in electrocardiogram (ECG) during sleep. Total CVHR suggesting moderate-to-severe sleep apnea (average over total time in bed > 15 cycles/h) was observed only in one subject and the transient occurrence of frequent CVHR (≥ 50 cycle/h) was detected in the same subject and two other subjects. The questionnaire provided the standardized scores of five features of subjective sleep quality, including less sleepiness on rising, good initiation and maintenance of sleep, less frequency of dreaming, refreshing feeling, and subjective sleep length as factors 1-5, respectively. The subject with high average CVHR showed factor scores < -1 SD for factors 1, 2, and 3 and reported subjective sleepiness during driving. In the two subjects with transient frequent CVHR, one showed factor score < -1 SD for factors 3 and 5, while the other did not show score < -1 SD for any of the factors. Although this is preliminary study in a small sample size, it suggests the possible associations between the subjective assessment of sleep quality and the objective measure of CVHR.","PeriodicalId":41579,"journal":{"name":"International Journal of Affective Engineering","volume":"19 1","pages":"79-82"},"PeriodicalIF":0.3,"publicationDate":"2020-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"70683969","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}