Optimizing neural network algorithms for submerged membrane bioreactor: A comparative study of OVAT and RSM hyperparameter optimization techniques

Syahira Ibrahim, Norhaliza Abdul Wahab
Water Science & Technology, published 2024-03-26 (Journal Article). DOI: 10.2166/wst.2024.099. Citations: 0

Abstract

Hyperparameter tuning is an important step in maximizing the performance of any neural network model. This study proposes a factorial design of experiments for screening, followed by response surface methodology (RSM), to optimize the hyperparameters of two artificial neural network algorithms. A feed-forward neural network (FFNN) and a radial basis function neural network (RBFNN) are applied to predict the permeate flux of palm oil mill effluent. The permeate pump setting and the transmembrane pressure of the submerged membrane bioreactor system are the input variables. Six hyperparameters of the FFNN model are optimized: four numerical factors (number of neurons, learning rate, momentum, and number of epochs) and two categorical factors (training algorithm and activation function). The RBFNN model has two numerical factors: the number of neurons and the spread. The proposed approach is compared with the conventional one-variable-at-a-time (OVAT) method in terms of optimization processing time and model accuracy. The results indicate that the optimal hyperparameters obtained by the proposed approach yield good accuracy with a smaller generalization error. The simulation results show an improvement of more than 65% in training performance, with fewer repetitions and less processing time. The proposed methodology can be applied to any type of neural network to find the optimum levels of different parameters.
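The RSM step described above can be illustrated with a minimal sketch: evaluate a model's validation error on a small factorial design over two hyperparameters, fit a second-order response surface by least squares, and solve for the stationary point of the fitted quadratic. The objective function, factor ranges, and design below are hypothetical stand-ins (the paper obtains each response by actually training the FFNN), not the authors' implementation.

```python
import numpy as np

# Hypothetical objective: validation error as a function of two
# hyperparameters (learning rate, number of neurons). In practice each
# evaluation would be one full training run; here we use a toy surface.
def val_error(lr, neurons):
    return (lr - 0.3) ** 2 + 0.01 * (neurons - 20) ** 2 + 0.05

# Three-level full factorial design over the two factors (assumed ranges)
lrs = [0.1, 0.3, 0.5]
neuron_counts = [10, 20, 30]

# Build the second-order RSM model matrix: 1, lr, n, lr*n, lr^2, n^2
X, y = [], []
for lr in lrs:
    for n in neuron_counts:
        X.append([1.0, lr, n, lr * n, lr ** 2, n ** 2])
        y.append(val_error(lr, n))

# Fit the response surface by ordinary least squares
beta, *_ = np.linalg.lstsq(np.array(X), np.array(y), rcond=None)
b0, b1, b2, b3, b4, b5 = beta

# Stationary point: set the gradient of the fitted quadratic to zero
#   d/d(lr) = b1 + b3*n + 2*b4*lr = 0
#   d/d(n)  = b2 + b3*lr + 2*b5*n = 0
A = np.array([[2 * b4, b3],
              [b3, 2 * b5]])
opt = np.linalg.solve(A, -np.array([b1, b2]))
print("optimal lr ~= %.3f, neurons ~= %.1f" % (opt[0], opt[1]))
```

Because the toy surface is itself quadratic, the fitted model recovers the true optimum (lr = 0.3, 20 neurons) from only nine evaluations; an OVAT search would instead vary one factor at a time around a fixed baseline and can miss interactions between factors, which is the trade-off the paper quantifies.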