{"title":"Graphic user interface and front-end operation on MS Windows","authors":"Pongkan Kansong, Darika Maneechai, Pichaya Tandayya, Chatchai Jantaraprim, Wiraman Niyompol","doi":"10.1145/1328491.1328505","DOIUrl":"https://doi.org/10.1145/1328491.1328505","url":null,"abstract":"This work is the development of a Microsoft Windows Graphic User Interface and front-end operation program. This program integrates translation engines that translate between Braille Mathematical and Scientific texts and texts with the extensible markup languages, e.g. Chemical Markup Language and Mathematical Markup Language. This program has a graphical user interface and works on the Microsoft Windows operating system.","PeriodicalId":241320,"journal":{"name":"International Convention on Rehabilitation Engineering & Assistive Technology","volume":"32 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2007-04-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132418324","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Augmented reality and applications for assistive technology","authors":"S. Gaukrodger, A. Lintott","doi":"10.1145/1328491.1328504","DOIUrl":"https://doi.org/10.1145/1328491.1328504","url":null,"abstract":"Augmented Reality (AR) adds information to the environment in order to facilitate human-computer and human-environment interactions. This paper describes AR and potential applications in Assistive Technology (AT).","PeriodicalId":241320,"journal":{"name":"International Convention on Rehabilitation Engineering & Assistive Technology","volume":"26 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2007-04-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129905376","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Lower extremity joint moments during symmetric lifting: squat vs stoop","authors":"Seonhong Hwang, Sungjae Hwang, Youngeun Kim, Youngho Kim","doi":"10.1145/1328491.1328518","DOIUrl":"https://doi.org/10.1145/1328491.1328518","url":null,"abstract":"In this study, we analyzed joint moments during the symmetrical lifting in two different postures, using the three-dimensional motion analysis. Boxes weighing 5, 10 and 15kg were lifted by both squat and stoop techniques. The ankle moment in stoop was always larger than that in squat and the support moment was the largest at the end of the lifting in both techniques. The knee flexion moment played an important role in stoop lifting to support the lower limbs. In the end stage of the lifting, the hip joint showed less contributions on the support moment in both lifting techniques. However, the maximum hip extension moment in stoop lifting was larger than that in squat. In addition, the maximum waist moment in squat was larger than in stoop. Therefore, these results could support the previous research that the squat lifting was not the best strategy with no harm to the waist. It is expected that these results could provide a basic information to analyze and propose an efficient lifting strategy.","PeriodicalId":241320,"journal":{"name":"International Convention on Rehabilitation Engineering & Assistive Technology","volume":"30 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2007-04-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123348332","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Wearable interface for the physical disabled","authors":"Y. Chi, S. Ong, M. Yuan, A. Nee","doi":"10.1145/1328491.1328500","DOIUrl":"https://doi.org/10.1145/1328491.1328500","url":null,"abstract":"A novel wearable interface is developed to help the disabled fully interact with the computer and operate home appliances. It is practical, natural, and less prone to fatigue, and it can serve as a general-purpose interface. Compared with interfaces based on image processing, infrared, and laser techniques, the current interface is relatively insensitive to the surrounding conditions and is more flexible and robust. Since it is able to perform accurate and stable inertia motion measurement, it can move with the subject and measure the motion directly. Moreover, it allows the disabled to actively interact with the computer since the patient's body motions can be analyzed and converted into computer commands. A survey is conducted to evaluate the proposed system and the results are encouraging.","PeriodicalId":241320,"journal":{"name":"International Convention on Rehabilitation Engineering & Assistive Technology","volume":"88 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2007-04-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124526689","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"The effect of pauses in dysarthric speech recognition study on Thai cerebral palsy children","authors":"Supawat Suanpirintr, Nuttakorn Thubthong","doi":"10.1145/1328491.1328530","DOIUrl":"https://doi.org/10.1145/1328491.1328530","url":null,"abstract":"Dysarthric speech recognition (DSR) is continuously developed to improve the quality of life of people with speech impairment. This study aimed to investigate the effect of pauses in DSR. Speech corpus consists of 40 words including two subsets, (i) 20 bisyllabic words with specific design in order to contain all types of final consonant-initial consonant junction in Thai language and (ii) 20 monosyllabic words, which have some phoneme similar to that of the previous subset. Four cerebral palsy children with dysarthria and two normal children were participated. DSR was trained by using Hidden Markov Models (HMMs) in 3 approaches: phoneme-based (PSR), word-based (WSR), and pause reducing word-based (PRWSR). For the third approach, the pauses in words were automatically detected and reduced. The accuracy for PRWSR was compared with that of WSR by varying the duration of remaining pauses in PRWSR. Speech samples from the normal children were also recognized for comparing the accuracy. The results showed that PSR provided the highest recognition rate. The recognition rates of WSR and PRWSR are not significantly different but PRWSR grants a bit higher recognition rate than WSR. Comparing the remaining pause duration, 100 ms remaining pause duration is better than any other duration.","PeriodicalId":241320,"journal":{"name":"International Convention on Rehabilitation Engineering & Assistive Technology","volume":"12 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2007-04-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133146483","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}