Smart Vision Software Application using Machine Learning
Saravanan Alagarsamy, Prudhivi Deepak, Lavanya M, T. G. Reddy, M. Kedareswari, A. Senthil Kumar
DOI: 10.1109/ICAIS56108.2023.10073814
Published in: 2023 Third International Conference on Artificial Intelligence and Smart Energy (ICAIS)
Publication date: 2023-02-02
Citations: 0
Abstract
The premise of the Smart Vision Application is that many emerging technologies now excel in their respective fields. Among the models currently in use are human pose estimation, steering angle capture, lane detection, and object detection, all built with OpenPose and similar tools. Since each of these systems has its own characteristics, each must be built separately before its operation can be understood. Because no trial version exists for consumers to learn how a model works, these models were implemented from code and combined into a web application that lets students learn about and experience each model through the camera on their own device. The application also serves business professionals, who can run, deploy, and test their own models, not just end users. Every module in the list is related to autonomous navigation. These systems have been combined into a single web application so that students can easily experiment with them and see how they work in real time. The platform therefore offers students and enthusiastic learners an excellent opportunity to interact with the live demo and understand how each model functions. It is believed that the web application will serve as a valuable tool for students to experiment with and gain a feel for the operation of these computer vision models.
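The paper does not publish its implementation, but the architecture it describes, one web application dispatching camera frames to several independent vision models, can be sketched as a simple model registry. The sketch below is purely illustrative: the registry pattern, function names, and stub models are assumptions, and a real deployment would wrap actual pose-estimation, lane-detection, and object-detection models behind the same interface.

```python
# Illustrative sketch of a multi-model dispatcher for a vision web app.
# The model functions are stubs standing in for real OpenPose-style models;
# names and return shapes are hypothetical, not from the paper.

from typing import Any, Callable, Dict

MODEL_REGISTRY: Dict[str, Callable[[Any], dict]] = {}

def register_model(name: str):
    """Decorator that adds a model function to the shared registry."""
    def wrap(fn: Callable[[Any], dict]) -> Callable[[Any], dict]:
        MODEL_REGISTRY[name] = fn
        return fn
    return wrap

@register_model("pose")
def estimate_pose(frame: Any) -> dict:
    # A real implementation would return detected body keypoints.
    return {"model": "pose", "keypoints": []}

@register_model("lane")
def detect_lanes(frame: Any) -> dict:
    # A real implementation would return detected lane boundaries.
    return {"model": "lane", "lanes": []}

@register_model("object")
def detect_objects(frame: Any) -> dict:
    # A real implementation would return bounding boxes and labels.
    return {"model": "object", "boxes": []}

def run_model(name: str, frame: Any) -> dict:
    """Route a camera frame to the model the user selected in the web UI."""
    if name not in MODEL_REGISTRY:
        raise KeyError(f"unknown model: {name!r}")
    return MODEL_REGISTRY[name](frame)
```

A web front end would capture a frame from the device camera, send it to the server, and call `run_model` with the user's chosen model name; adding a new demo then only requires registering one more function.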