{"title":"Comparative analysis of manual and programmed annotations for crowd assessment and classification using artificial intelligence","authors":"","doi":"10.1016/j.dsm.2024.04.001","DOIUrl":null,"url":null,"abstract":"<div><div>Funding agencies play a pivotal role in bolstering research endeavors by allocating financial resources for data collection and analysis. However, the lack of detailed information regarding the methods employed for data gathering and analysis can obstruct the replication and utilization of the results, ultimately affecting the study’s transparency and integrity. The task of manually annotating extensive datasets demands considerable labor and financial investment, especially when it entails engaging specialized individuals. In our crowd counting study, we employed the web-based annotation tool SuperAnnotate to streamline the human annotation process for a dataset comprising 3,000 images. By integrating automated annotation tools, we realized substantial time efficiencies, as demonstrated by the remarkable achievement of 858,958 annotations. This underscores the significant contribution of such technologies to the efficiency of the annotation process.</div></div>","PeriodicalId":100353,"journal":{"name":"Data Science and Management","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2024-04-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Data Science and Management","FirstCategoryId":"1085","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S2666764924000250","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
Funding agencies play a pivotal role in bolstering research endeavors by allocating financial resources for data collection and analysis. However, the lack of detailed information about the methods used for data gathering and analysis can obstruct the replication and reuse of results, ultimately undermining a study's transparency and integrity. Manually annotating extensive datasets demands considerable labor and financial investment, especially when it requires engaging specialized personnel. In our crowd counting study, we employed the web-based annotation tool SuperAnnotate to streamline the human annotation process for a dataset of 3,000 images. By integrating automated annotation tools, we achieved substantial time savings, producing a total of 858,958 annotations. This underscores the significant contribution such technologies make to the efficiency of the annotation process.
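The abstract does not specify which automated annotation tools were integrated. As an illustration only, the following minimal sketch shows one way such a pre-annotation step could work: a pretrained person detector (here OpenCV's built-in HOG people detector, an assumed stand-in rather than the authors' pipeline) generates candidate head/body points per image, which human annotators would then correct in a tool such as SuperAnnotate. The folder names and JSON layout are hypothetical.

```python
# Hypothetical sketch: pre-annotating crowd images with OpenCV's built-in
# HOG person detector so that human annotators only correct the output.
# The paper does not state which automated tool was used; this is an
# illustrative assumption, not the authors' actual pipeline.
import json
from pathlib import Path

import cv2

IMAGE_DIR = Path("crowd_images")      # assumed folder holding the 3,000 images
OUTPUT_DIR = Path("pre_annotations")  # assumed output folder for JSON files
OUTPUT_DIR.mkdir(exist_ok=True)

# Default people detector shipped with OpenCV.
hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

for image_path in sorted(IMAGE_DIR.glob("*.jpg")):
    image = cv2.imread(str(image_path))
    if image is None:
        continue

    # Detect people; each box is (x, y, w, h) in pixel coordinates.
    boxes, _ = hog.detectMultiScale(image, winStride=(8, 8))

    # Store one point per detected person (box centre), a common
    # annotation format for crowd counting.
    points = [
        {"x": int(x + w / 2), "y": int(y + h / 2)}
        for (x, y, w, h) in boxes
    ]

    out_file = OUTPUT_DIR / (image_path.stem + ".json")
    out_file.write_text(
        json.dumps({"image": image_path.name, "points": points}, indent=2)
    )
```

In practice, a crowd-specific detector or density-estimation model would likely replace the generic HOG detector, and the generated points would be imported into the annotation tool as editable pre-annotations for human review.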