Marck Herzon C. Barrion, Lorenz Andre C. Fernando, M. Cabatuan, A. Bandala
{"title":"基于卷积神经网络的菲文双单元缩略语手机译码器","authors":"Marck Herzon C. Barrion, Lorenz Andre C. Fernando, M. Cabatuan, A. Bandala","doi":"10.1109/TENSYMP55890.2023.10223615","DOIUrl":null,"url":null,"abstract":"This paper presents the results of using MobileNetV2, EfficientNetV1, and EfficientNetV2 for decoding 25 classes of two-cell contracted Filipino braille words. Test accuracies of 91.20% and 89.20%, and F1-scores of 0.91 and 0.89 for the top two models, the EfficientNetV1B0 and EfficientNetV2B0, respectively, were acquired. Transfer learning was used from these CNN models using the weights from ImageNet for pre-training. 1250 images were used for this research, with 50 images per class. For training the models, 70% of the images were allocated, 10% for validation, and 20% for testing. An equal number of samples were allocated for each class in this arrangement of the datasets. The model was implemented in an Android phone application through TensorFlow lite to allow mobile decoding of the braille codes. This output was aimed at creating a reliable and portable platform that will aid Special Education (SPED) teachers in the Philippines in studying and teaching two-cell contracted Filipino braille words.","PeriodicalId":314726,"journal":{"name":"2023 IEEE Region 10 Symposium (TENSYMP)","volume":"156 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-09-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Mobile Phone Decoder of Two-Cell Contracted Braille for Filipino Words Using Convolutional Neural Networks\",\"authors\":\"Marck Herzon C. Barrion, Lorenz Andre C. Fernando, M. Cabatuan, A. Bandala\",\"doi\":\"10.1109/TENSYMP55890.2023.10223615\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"This paper presents the results of using MobileNetV2, EfficientNetV1, and EfficientNetV2 for decoding 25 classes of two-cell contracted Filipino braille words. 
Test accuracies of 91.20% and 89.20%, and F1-scores of 0.91 and 0.89 for the top two models, the EfficientNetV1B0 and EfficientNetV2B0, respectively, were acquired. Transfer learning was used from these CNN models using the weights from ImageNet for pre-training. 1250 images were used for this research, with 50 images per class. For training the models, 70% of the images were allocated, 10% for validation, and 20% for testing. An equal number of samples were allocated for each class in this arrangement of the datasets. The model was implemented in an Android phone application through TensorFlow lite to allow mobile decoding of the braille codes. This output was aimed at creating a reliable and portable platform that will aid Special Education (SPED) teachers in the Philippines in studying and teaching two-cell contracted Filipino braille words.\",\"PeriodicalId\":314726,\"journal\":{\"name\":\"2023 IEEE Region 10 Symposium (TENSYMP)\",\"volume\":\"156 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2023-09-06\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2023 IEEE Region 10 Symposium (TENSYMP)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/TENSYMP55890.2023.10223615\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2023 IEEE Region 10 Symposium (TENSYMP)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/TENSYMP55890.2023.10223615","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Mobile Phone Decoder of Two-Cell Contracted Braille for Filipino Words Using Convolutional Neural Networks
This paper presents the results of using MobileNetV2, EfficientNetV1, and EfficientNetV2 for decoding 25 classes of two-cell contracted Filipino braille words. The top two models, EfficientNetV1B0 and EfficientNetV2B0, achieved test accuracies of 91.20% and 89.20% and F1-scores of 0.91 and 0.89, respectively. Transfer learning was applied, with each CNN pre-trained on ImageNet weights. The dataset comprised 1,250 images, 50 per class; 70% of the images were allocated for training, 10% for validation, and 20% for testing, with an equal number of samples per class in each split. The model was implemented in an Android phone application through TensorFlow Lite to allow mobile decoding of the braille codes. This output was aimed at creating a reliable and portable platform that will aid Special Education (SPED) teachers in the Philippines in studying and teaching two-cell contracted Filipino braille words.
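The balanced split described above works out to exactly 35 training, 5 validation, and 10 test images per class (35 + 5 + 10 = 50). A minimal sketch of such a per-class 70/10/20 split, with placeholder file names and a helper function that is an illustration rather than the authors' actual code:

```python
import random

# Dataset shape from the paper: 25 classes x 50 images each = 1,250 images.
NUM_CLASSES = 25
IMAGES_PER_CLASS = 50
TRAIN_FRAC, VAL_FRAC = 0.70, 0.10  # the remaining 20% is the test set

def split_per_class(image_paths, seed=0):
    """Shuffle one class's images and split them 35/5/10 (70/10/20).

    Hypothetical helper, not from the paper; splitting within each class
    guarantees every class contributes the same number of samples to
    each subset.
    """
    rng = random.Random(seed)
    paths = list(image_paths)
    rng.shuffle(paths)
    n_train = int(len(paths) * TRAIN_FRAC)      # 35 per class
    n_val = int(len(paths) * VAL_FRAC)          # 5 per class
    train = paths[:n_train]
    val = paths[n_train:n_train + n_val]
    test = paths[n_train + n_val:]              # 10 per class
    return train, val, test

# Build the balanced split over all 25 classes (file names are placeholders).
train_set, val_set, test_set = [], [], []
for c in range(NUM_CLASSES):
    imgs = [f"class{c:02d}/img{i:02d}.jpg" for i in range(IMAGES_PER_CLASS)]
    tr, va, te = split_per_class(imgs, seed=c)
    train_set += tr
    val_set += va
    test_set += te
```

With this arrangement the subsets total 875 training, 125 validation, and 250 test images, matching the 70/10/20 allocation of the 1,250-image dataset.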