Effectiveness of Teachable Machine, MobileNet, and YOLO for object detection: A comparative study on practical applications

Sheikh Muhammad Saqib, Muhammad Iqbal, Tehseen Mazhar, Tariq Shahzad, Khmaies Ouahada, Habib Hamam

Egyptian Informatics Journal, Volume 30, Article 100680 (published 2025-04-14). DOI: 10.1016/j.eij.2025.100680. Available at: https://www.sciencedirect.com/science/article/pii/S1110866525000738
Abstract
In this comparative study, the effectiveness of three prominent object detection models (Teachable Machine, MobileNet, and YOLO) was evaluated on a dataset of images from four categories: bird, horse, laptop, and sandwich. The objective was to identify the model that best balances accuracy, speed, and usability for practical applications such as self-driving vehicles, robotics, security systems, and augmented reality. The dataset was carefully curated and used to train each of the three models. The analysis indicated that the Teachable Machine model surpassed both MobileNet and YOLO, demonstrating superior accuracy and effectiveness in detecting objects across the specified categories. This research contributes to the field of artificial intelligence by providing detailed comparisons of model performance, offering a resource for further advances in object detection. The study showcases Teachable Machine's strength on multi-class detection problems and sets a benchmark for future work on improving object detection methodologies.
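The abstract does not reproduce the authors' code, but a minimal sketch of the kind of comparison described might look like the following. It runs a pretrained MobileNetV2 classifier (ImageNet weights) and a pretrained YOLOv8 detector (COCO weights) on a single test image, assuming the tensorflow and ultralytics packages are installed; the image path samples/bird.jpg is a hypothetical placeholder. Notably, all four study categories (bird, horse, laptop, sandwich) are COCO classes, so an off-the-shelf YOLO checkpoint can detect them without retraining. Teachable Machine is omitted here because it is trained through a browser interface rather than a Python API.

```python
# Illustrative sketch only, not the authors' pipeline.
import numpy as np
import tensorflow as tf
from ultralytics import YOLO

IMAGE_PATH = "samples/bird.jpg"  # hypothetical placeholder path

# --- MobileNetV2 classifier pretrained on ImageNet ---
mobilenet = tf.keras.applications.MobileNetV2(weights="imagenet")
img = tf.keras.utils.load_img(IMAGE_PATH, target_size=(224, 224))
x = tf.keras.applications.mobilenet_v2.preprocess_input(
    np.expand_dims(tf.keras.utils.img_to_array(img), axis=0)
)
preds = mobilenet.predict(x)
top = tf.keras.applications.mobilenet_v2.decode_predictions(preds, top=1)[0][0]
print(f"MobileNet top class: {top[1]} (score {top[2]:.2f})")

# --- YOLOv8 detector pretrained on COCO ---
# bird, horse, laptop, and sandwich are all COCO classes, so no
# retraining is needed for a quick qualitative comparison.
detector = YOLO("yolov8n.pt")  # downloads the nano checkpoint on first use
result = detector(IMAGE_PATH)[0]
for box in result.boxes:
    name = result.names[int(box.cls)]
    print(f"YOLO detected: {name} (conf {float(box.conf):.2f})")
```

A Teachable Machine model, by contrast, would typically be trained in the browser and then exported (for example, as a Keras or TFLite model) before it could be evaluated in a script like this.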
Journal Introduction
The Egyptian Informatics Journal is published by the Faculty of Computers and Artificial Intelligence, Cairo University. The Journal provides a forum for state-of-the-art research and development in computing, including computer science, information technology, information systems, operations research, and decision support. Submissions of innovative, previously unpublished work on subjects covered by the Journal are encouraged, whether from academic, research, or commercial sources.