Comparative Analysis of Process Scheduling Algorithm using AI models

Md. Moynul Asik Moni, Maharshi Niloy, Aquibul Haq Chowdhury, Farah Jasmin Khan, Md. Fahmid-Ul-Alam Juboraj, Amitabha Chakrabarty

2022 25th International Conference on Computer and Information Technology (ICCIT), 17 December 2022. DOI: 10.1109/ICCIT57492.2022.10055395
Process scheduling is an integral part of operating systems. The most widely used scheduling algorithm is Round Robin (RR), but its average waiting time is often quite long. This study proposes a new algorithm that reduces waiting time and process starvation by choosing an optimal time quantum based on predicted CPU burst times. For burst-time prediction, we use machine learning models: linear regression, decision tree, k-nearest neighbors, and a multi-layer perceptron neural network. For 10,000 processes with predicted burst times under the same configuration, we compare the average turnaround time, the average waiting time, and the number of context switches of the proposed algorithm against Traditional Round Robin, Modified Round Robin, Optimized Round Robin, and Self-Adjustment Round Robin. The proposed algorithm, Absolute Difference Based Time Quantum Round Robin (ADRR), is found to be almost twice as fast as the other algorithms when scheduling the heavily loaded dataset of processes used in this study.
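The abstract does not give ADRR's exact quantum formula, so the following minimal Python sketch is only illustrative: it simulates plain Round Robin over a set of predicted burst times and reports the three metrics the paper compares (average waiting time, average turnaround time, number of context switches). The adrr_quantum helper is a hypothetical stand-in for ADRR's "absolute difference" idea, and the toy burst values and switch-counting convention are assumptions, not the authors' setup.

```python
# Minimal sketch, not the paper's implementation: simulate Round Robin over
# predicted burst times and report the three metrics the abstract compares.
# adrr_quantum is a hypothetical stand-in for ADRR's "absolute difference"
# idea; the abstract does not state the actual formula.
from typing import List, Tuple

def round_robin(bursts: List[int], quantum: int) -> Tuple[float, float, int]:
    """RR for processes all arriving at t = 0 (an assumption).
    Returns (avg_waiting, avg_turnaround, context_switches)."""
    n = len(bursts)
    remaining = list(bursts)
    completion = [0] * n
    t, switches, done = 0, 0, 0
    while done < n:
        for i in range(n):
            if remaining[i] > 0:
                run = min(quantum, remaining[i])
                t += run
                remaining[i] -= run
                switches += 1  # counts every dispatch as a switch (an assumption)
                if remaining[i] == 0:
                    completion[i] = t
                    done += 1
    waiting = [c - b for c, b in zip(completion, bursts)]  # arrival time = 0
    return sum(waiting) / n, sum(completion) / n, switches

def adrr_quantum(predicted: List[int]) -> int:
    """Hypothetical quantum rule: mean absolute difference between
    consecutive sorted predicted bursts (illustrative only)."""
    s = sorted(predicted)
    diffs = [b - a for a, b in zip(s, s[1:])] or [s[0]]
    return max(1, sum(diffs) // len(diffs))

if __name__ == "__main__":
    bursts = [5, 17, 8, 23, 4]  # toy predicted burst times
    for name, q in [("fixed q=4", 4), ("adrr-style", adrr_quantum(bursts))]:
        wt, tat, cs = round_robin(bursts, q)
        print(f"{name}: avg_wait={wt:.1f}  avg_turnaround={tat:.1f}  switches={cs}")
```

In the paper's pipeline, the burst times fed to such a simulator would come from the trained predictors rather than being hard-coded; with scikit-learn, for instance, LinearRegression, DecisionTreeRegressor, KNeighborsRegressor, and MLPRegressor cover the four model families the abstract names.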