Mobile Phone Decoder of Two-Cell Contracted Braille for Filipino Words Using Convolutional Neural Networks

Marck Herzon C. Barrion, Lorenz Andre C. Fernando, M. Cabatuan, A. Bandala
DOI: 10.1109/TENSYMP55890.2023.10223615
Published in: 2023 IEEE Region 10 Symposium (TENSYMP)
Publication date: 2023-09-06
Citations: 0

Abstract

This paper presents the results of using MobileNetV2, EfficientNetV1, and EfficientNetV2 to decode 25 classes of two-cell contracted Filipino braille words. The top two models, EfficientNetV1B0 and EfficientNetV2B0, achieved test accuracies of 91.20% and 89.20% and F1-scores of 0.91 and 0.89, respectively. The CNN models were trained via transfer learning, with weights pre-trained on ImageNet. The dataset comprised 1,250 images, 50 per class; 70% of the images were allocated for training, 10% for validation, and 20% for testing, with an equal number of samples per class in each split. The model was deployed in an Android phone application through TensorFlow Lite to allow mobile decoding of the braille codes. This output was aimed at creating a reliable and portable platform to aid Special Education (SPED) teachers in the Philippines in studying and teaching two-cell contracted Filipino braille words.
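The per-class balanced 70/10/20 split described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' code: the file-naming scheme and random seed are assumptions, and only the split proportions (70/10/20 over 25 classes of 50 images, i.e. 35/5/10 per class) come from the paper.

```python
import random

CLASSES = 25
IMAGES_PER_CLASS = 50  # 25 classes x 50 images = 1,250 images total

def split_class(filenames, train_frac=0.70, val_frac=0.10, seed=0):
    """Split one class's images into train/val/test (70/10/20)."""
    files = list(filenames)
    random.Random(seed).shuffle(files)
    n_train = int(len(files) * train_frac)  # 35 per class
    n_val = int(len(files) * val_frac)      # 5 per class; remaining 10 go to test
    return (files[:n_train],
            files[n_train:n_train + n_val],
            files[n_train + n_val:])

# Splitting each class independently keeps all three sets class-balanced.
train, val, test = [], [], []
for c in range(CLASSES):
    names = [f"class{c:02d}_{i:02d}.jpg" for i in range(IMAGES_PER_CLASS)]
    tr, va, te = split_class(names)
    train += tr
    val += va
    test += te

print(len(train), len(val), len(test))  # 875 125 250
```

Because the split is applied per class rather than over the pooled 1,250 images, every class contributes exactly 35/5/10 images to train/validation/test, matching the equal-allocation arrangement the paper describes.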