{"title":"Real-time vibrotactile pattern generation and identification using discrete event-driven feedback.","authors":"İsmail Erbaş, Burak Güçlü","doi":"10.1080/08990220.2023.2175811","DOIUrl":null,"url":null,"abstract":"<p><p>This study assesses human identification of vibrotactile patterns by using real-time discrete event-driven feedback. Previously acquired force and bend sensor data from a robotic hand were used to predict movement-type (stationary, flexion, contact, extension, release) and object-type (no object, hard object, soft object) states by using decision tree (DT) algorithms implemented in a field-programmable gate array (FPGA). Six able-bodied humans performed a 2- and 3-step sequential pattern recognition task in which state transitions were signaled as vibrotactile feedback. The stimuli were generated according to predicted classes represented by two frequencies (F1: 80 Hz, F2: 180 Hz) and two magnitudes (M1: low, M2: high) calibrated psychophysically for each participant; and they were applied by two actuators (Haptuators) placed on upper arms. A soft/hard object was mapped to F1/F2; and manipulating it with low/high force was assigned to M1/M2 in the left actuator. On the other hand, flexion/extension movement was mapped to F1/F2 in the right actuator, with movement in air as M1 and during object manipulation as M2. DT algorithm performed better for the object-type (97%) than the movement-type (88%) classification in real time. Participants could recognize feedback associated with 14 discrete-event sequences with low-to-medium accuracy. The performance was higher (76 ± 9% recall, 76 ± 17% precision, 78 ± 4% accuracy) for recognizing any one event in the sequences. The results show that FPGA implementation of classification for discrete event-driven vibrotactile feedback can be feasible in haptic devices with additional cues in the physical context.</p>","PeriodicalId":49498,"journal":{"name":"Somatosensory and Motor Research","volume":" ","pages":"77-89"},"PeriodicalIF":1.3000,"publicationDate":"2024-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Somatosensory and Motor Research","FirstCategoryId":"3","ListUrlMain":"https://doi.org/10.1080/08990220.2023.2175811","RegionNum":4,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2023/2/7 0:00:00","PubModel":"Epub","JCR":"Q4","JCRName":"NEUROSCIENCES","Score":null,"Total":0}
Abstract
This study assesses human identification of vibrotactile patterns by using real-time discrete event-driven feedback. Previously acquired force and bend sensor data from a robotic hand were used to predict movement-type (stationary, flexion, contact, extension, release) and object-type (no object, hard object, soft object) states by using decision tree (DT) algorithms implemented in a field-programmable gate array (FPGA). Six able-bodied humans performed a 2- and 3-step sequential pattern recognition task in which state transitions were signaled as vibrotactile feedback. The stimuli were generated according to predicted classes represented by two frequencies (F1: 80 Hz, F2: 180 Hz) and two magnitudes (M1: low, M2: high) calibrated psychophysically for each participant, and they were applied by two actuators (Haptuators) placed on the upper arms. A soft/hard object was mapped to F1/F2, and manipulating it with low/high force was assigned to M1/M2 in the left actuator. Flexion/extension movement was mapped to F1/F2 in the right actuator, with movement in air as M1 and movement during object manipulation as M2. The DT algorithm performed better for the object-type (97%) than for the movement-type (88%) classification in real time. Participants could recognize feedback associated with 14 discrete-event sequences with low-to-medium accuracy. Performance was higher (76 ± 9% recall, 76 ± 17% precision, 78 ± 4% accuracy) for recognizing any one event in the sequences. The results show that FPGA implementation of classification for discrete event-driven vibrotactile feedback can be feasible in haptic devices when supplemented with additional cues in the physical context.
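The class-to-stimulus mapping described in the abstract can be illustrated with a short sketch. The Python code below is not the authors' FPGA implementation; it only shows, under the stated mapping (soft/hard object to F1/F2 and low/high force to M1/M2 on the left actuator; flexion/extension to F1/F2 with in-air as M1 and manipulation as M2 on the right actuator), how predicted classes could be translated into stimulus parameters in software. All names (Stimulus, object_feedback, movement_feedback) are hypothetical.

```python
# Minimal sketch of the class-to-stimulus mapping (hypothetical names, not the
# authors' FPGA implementation). Frequencies and magnitude labels follow the
# abstract; magnitudes would be calibrated psychophysically per participant.
from dataclasses import dataclass
from typing import Optional

F1_HZ = 80    # F1: soft object (left actuator) / flexion (right actuator)
F2_HZ = 180   # F2: hard object (left actuator) / extension (right actuator)

@dataclass
class Stimulus:
    actuator: str       # "left" or "right" upper-arm Haptuator
    frequency_hz: int   # F1 (80 Hz) or F2 (180 Hz)
    magnitude: str      # "M1" (low) or "M2" (high)

def object_feedback(object_type: str, grip_force: str) -> Optional[Stimulus]:
    """Left actuator: soft/hard object -> F1/F2; low/high force -> M1/M2."""
    if object_type == "no object":
        return None                      # nothing to signal for the no-object state
    freq = F1_HZ if object_type == "soft object" else F2_HZ
    mag = "M1" if grip_force == "low" else "M2"
    return Stimulus("left", freq, mag)

def movement_feedback(movement_type: str, manipulating_object: bool) -> Optional[Stimulus]:
    """Right actuator: flexion/extension -> F1/F2; in air -> M1, during manipulation -> M2."""
    if movement_type not in ("flexion", "extension"):
        return None                      # stationary/contact/release not mapped in this sketch
    freq = F1_HZ if movement_type == "flexion" else F2_HZ
    mag = "M2" if manipulating_object else "M1"
    return Stimulus("right", freq, mag)
```

For example, movement_feedback("flexion", manipulating_object=True) would yield an 80 Hz, high-magnitude stimulus on the right actuator, consistent with the mapping stated in the abstract.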
Journal description:
Somatosensory & Motor Research publishes original, high-quality papers that encompass the entire range of investigations related to the neural bases for somatic sensation, somatic motor function, somatic motor integration, and modeling thereof. Comprising anatomical, physiological, biochemical, pharmacological, behavioural, and psychophysical studies, Somatosensory & Motor Research covers all facets of the peripheral and central processes underlying cutaneous sensation, and includes studies relating to afferent and efferent mechanisms of deep structures (e.g., viscera, muscle). Studies of motor systems at all levels of the neuraxis are covered, but reports restricted to non-neural aspects of muscle generally would belong in other journals.