AutoML: A systematic review on automated machine learning with neural architecture search

Imrus Salehin, Md. Shamiul Islam, Pritom Saha, S.M. Noman, Azra Tuni, Md. Mehedi Hasan, Md. Abu Baten

Journal of Information and Intelligence, Vol. 2, No. 1, pp. 52-81, January 2024. DOI: 10.1016/j.jiixd.2023.10.002

Citations: 0
Abstract
AutoML (Automated Machine Learning) is an emerging field that aims to automate the process of building machine learning models. AutoML emerged to increase productivity and efficiency by automating, as far as possible, the repetitive and inefficient work that recurs whenever machine learning is applied. In particular, research has long been conducted on techniques that can efficiently develop high-quality models by minimizing developer intervention across the pipeline, from data preprocessing to algorithm selection and tuning. In this systematic review, we summarize the data processing requirements of AutoML approaches and explain them in detail. We place particular emphasis on neural architecture search (NAS), currently a highly popular sub-topic within AutoML. NAS methods use machine learning algorithms to search a large space of candidate architectures for the one that performs best on a given task. We summarize the performance achieved by representative NAS algorithms on CIFAR-10, CIFAR-100, ImageNet, and other well-known benchmark datasets. Additionally, we delve into several noteworthy research directions in NAS, including one/two-stage NAS, one-shot NAS, and joint hyperparameter and architecture optimization. We discuss how the size and complexity of the NAS search space can vary with the specific problem being addressed. To conclude, we examine several open problems in current state-of-the-art AutoML methods that warrant further investigation in future research.
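The core NAS loop described above — sampling candidates from a search space and keeping the best-performing one — can be sketched in miniature. This is a minimal, hypothetical illustration using random search over a toy search space; the space, the `evaluate` scoring stub, and all names are assumptions for illustration only, not the paper's method (real NAS systems train each candidate, or a weight-sharing proxy, to obtain validation accuracy).

```python
import random

# Hypothetical toy search space: each candidate architecture is a choice
# of depth, width, and convolution operation. Real NAS spaces are vastly
# larger and often graph-structured (e.g. cell-based spaces).
SEARCH_SPACE = {
    "depth": [2, 4, 8],
    "width": [16, 32, 64],
    "op": ["conv3x3", "conv5x5", "sep_conv"],
}

def sample_architecture(rng):
    """Draw one candidate architecture uniformly from the space."""
    return {key: rng.choice(options) for key, options in SEARCH_SPACE.items()}

def evaluate(arch):
    """Stand-in for the expensive step: training the candidate and
    measuring validation accuracy. Here we return a fake score so the
    sketch is runnable; deeper/wider nets score higher in this toy."""
    return arch["depth"] * 0.05 + arch["width"] * 0.001

def random_search_nas(n_trials=20, seed=0):
    """Sample n_trials architectures and return the best one found."""
    rng = random.Random(seed)
    best_arch, best_score = None, float("-inf")
    for _ in range(n_trials):
        arch = sample_architecture(rng)
        score = evaluate(arch)
        if score > best_score:
            best_arch, best_score = arch, score
    return best_arch, best_score

if __name__ == "__main__":
    best, score = random_search_nas()
    print("best architecture:", best, "score:", score)
```

Random search is only the simplest baseline; the one-shot and one/two-stage NAS methods surveyed in the paper replace this loop with reinforcement learning, evolutionary search, or differentiable relaxations to avoid training every candidate from scratch.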