Expert Systems: Latest Publications

Comparison of nature-inspired algorithms in finite element-based metaheuristic optimisation of laminated shells
IF 3.0 | Q4 | Computer Science
Expert Systems | Pub Date: 2024-05-14 | DOI: 10.1111/exsy.13620
Subham Pal, Kanak Kalita, Salil Haldar
Abstract: This work presents a unique technique for optimising composite laminates used as structural components, which is critical in situations where failure might have disastrous effects. Unlike traditional surrogate-based optimisation approaches, this methodology combines the accurate modelling capability of finite element (FE) analysis with the iterative refinement capacity of metaheuristic algorithms. By combining these two methodologies, the method improves the design process of laminated shell structures, where robustness and dependability are crucial. Compared with existing benchmark solutions, the current FE model shows <1% error for cylindrical and spherical shells. The prime objective of this study is to identify the optimum ply angles for attaining a high fundamental frequency. The problem is NP-hard because the possible ply angles span a wide range (±90°), making it difficult for optimisation algorithms to find a solution. Seven popular metaheuristic algorithms, namely the genetic algorithm (GA), ant lion optimisation (ALO), the arithmetic optimisation algorithm (AOA), the dragonfly algorithm (DA), grey wolf optimisation (GWO), the salp swarm algorithm (SSA), and the whale optimisation algorithm (WOA), are applied to and compared on a wide range of shell design problems. The study assesses parameter sensitivity, identifying the design variables that most influence dynamic behaviour. Convergence studies demonstrate the superior performance of the AOA, GWO, and WOA optimisers. Rigorous statistical comparisons assist practitioners in selecting the best optimisation technique. The FE-GWO, FE-DA, and FE-SSA methods surpass the other techniques as well as the layerwise optimisation strategy. The results obtained with the GWO, DA, and SSA optimisers demonstrate a ~3% improvement over the existing literature. Compared with conventional layup designs (cross-ply and angle-ply), the optimised designs are better by at least 0.43% and as much as 48.91%.
Citations: 0
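The ply-angle search the abstract describes can be sketched with a grey wolf optimiser, one of the seven algorithms compared. The objective below is a toy surrogate standing in for the paper's finite element frequency analysis, and the pack size, iteration count, and fitness function are illustrative assumptions, not the study's settings.

```python
import random

random.seed(0)  # deterministic for reproducibility

def fundamental_frequency(plies):
    # Hypothetical surrogate objective standing in for FE analysis:
    # it rewards balanced +/-45-degree-style layups (illustration only).
    return -sum((abs(a) - 45.0) ** 2 for a in plies)

def grey_wolf_optimise(obj, dim=4, wolves=20, iters=200, lo=-90.0, hi=90.0):
    pack = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(wolves)]
    for t in range(iters):
        pack.sort(key=obj, reverse=True)           # alpha, beta, delta lead
        alpha, beta, delta = pack[0], pack[1], pack[2]
        a = 2.0 * (1 - t / iters)                  # exploration decays to 0
        for i in range(3, wolves):
            new = []
            for d in range(dim):
                x = 0.0
                for leader in (alpha, beta, delta):
                    r1, r2 = random.random(), random.random()
                    A, C = a * (2 * r1 - 1), 2 * r2
                    x += leader[d] - A * abs(C * leader[d] - pack[i][d])
                new.append(min(hi, max(lo, x / 3.0)))  # clamp to +/-90 degrees
            pack[i] = new
    return max(pack, key=obj)

best = grey_wolf_optimise(fundamental_frequency)
```

The same loop structure accommodates any of the other population-based optimisers by swapping the position-update rule.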
Blockchain-enabled decentralized service selection for QoS-aware cloud manufacturing
IF 3.3 | Q4 | Computer Science
Expert Systems | Pub Date: 2024-05-10 | DOI: 10.1111/exsy.13602
Ke Meng, Zhiyong Wu, Muhammad Bilal, Xiaoyu Xia, Xiaolong Xu
Abstract: In recent years, cloud manufacturing has brought both opportunities and challenges to the manufacturing industry. Cloud manufacturing enables global manufacturing resources to be unified and shared, breaking down geographical constraints to enhance manufacturing capability and efficiency. However, with the explosive growth of manufacturing resources and user demands, traditional cloud manufacturing platforms face insufficient computing power, a lack of real-time data, and difficulties in securing user privacy during the service selection process. In this article, a blockchain-based decentralized cloud manufacturing service selection method is proposed, in which computing resources are deployed across multiple distributed nodes rather than on a traditional centralized platform, solving the problem of insufficient computing power. The credibility of users is evaluated based on their performance on contracts, and the PBFT consensus algorithm is improved using these credibility scores. In addition, a tri-chain blockchain data storage model is designed to ensure the security, timeliness, and transparency of data during the service selection process. Experimental results show that the method both speeds up the service selection process and improves the quality of the selection results, achieving a significant increase in manufacturing efficiency.
Citations: 0
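The abstract evaluates node credibility from contract performance and feeds it into an improved PBFT consensus. This sketch shows only the credibility bookkeeping and a credibility-weighted vote threshold; the update constants and the 2/3 threshold are illustrative assumptions, not the paper's exact scheme.

```python
class Node:
    def __init__(self, name, credibility=0.5):
        self.name = name
        self.credibility = credibility

    def record_contract(self, honoured):
        # Reward honoured contracts, penalise defaults; asymmetric on
        # purpose so that trust is lost faster than it is gained.
        if honoured:
            self.credibility = min(1.0, self.credibility + 0.1)
        else:
            self.credibility = max(0.0, self.credibility - 0.2)

def weighted_commit(nodes, votes, threshold=2.0 / 3.0):
    """Commit when credibility-weighted approval reaches the threshold."""
    total = sum(n.credibility for n in nodes)
    approving = sum(n.credibility for n, v in zip(nodes, votes) if v)
    return total > 0 and approving / total >= threshold

nodes = [Node("a"), Node("b"), Node("c"), Node("d")]
nodes[3].record_contract(False)   # a defaulting node loses voting weight
decision = weighted_commit(nodes, [True, True, True, False])
```

Weighting the vote by credibility means a node with a history of broken contracts contributes less to reaching consensus than a reliable one.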
Enterprise violation risk deduction combining generative AI and event evolution graph
IF 3.3 | Q4 | Computer Science
Expert Systems | Pub Date: 2024-05-09 | DOI: 10.1111/exsy.13622
Chao Zhong, Pengjun Li, Jinlong Wang, Xiaoyun Xiong, Zhihan Lv, Xiaochen Zhou, Qixin Zhao
Abstract: In both scientific research and commercial applications, inferring the risk of regulatory violations by publicly listed enterprises has attracted considerable attention. However, existing research on deducing and predicting the violation risk of listed enterprises has several problems: a lack of analysis of the causal logic linking violation events, low interpretability and effectiveness of the deduction, and a shortage of training data. To solve these problems, we propose a framework for enterprise violation risk deduction based on generative AI and event evolution graphs. First, generative AI is used to summarize lengthy and complex violation announcements into concise overviews of the violations. Second, by fine-tuning the generative AI model, an event entity and causality extraction framework based on automated data augmentation is proposed, and the UIE (Unified Structure Generation for Universal Information Extraction) model is used to extract event entities for listed enterprises' violations. A causality extraction model, CDDP-GAT (Event Causality Extraction Based on Chinese Dictionary and Dependency Parsing with GAT), is then proposed to identify and analyse the causal links between corporate breaches, deepening the understanding of the event logic. Similar events are then merged, and the causal correlation weights between violation-related events are evaluated. Finally, the listed enterprise's violation risk event evolution graph is constructed and the violation risk deduction is carried out, forming an expert system for financial violations. The deduction results show that the method can effectively reveal signs of enterprise violations and their adverse consequences.
Citations: 0
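The event evolution graph step can be illustrated in miniature: extracted (cause, effect) pairs are counted, and each edge is weighted by the conditional frequency of the effect given the cause. The event names and the weighting rule below are invented stand-ins for the output of the paper's CDDP-GAT extraction model.

```python
from collections import Counter, defaultdict

def build_event_graph(causal_pairs):
    """Edge weight = P(effect | cause), estimated from pair counts."""
    counts = Counter(causal_pairs)
    out_totals = defaultdict(int)
    for (cause, _), c in counts.items():
        out_totals[cause] += c
    return {(cause, effect): c / out_totals[cause]
            for (cause, effect), c in counts.items()}

# Hypothetical pairs, as if merged from many violation announcements.
pairs = [
    ("false disclosure", "regulator inquiry"),
    ("false disclosure", "regulator inquiry"),
    ("false disclosure", "stock suspension"),
    ("regulator inquiry", "fine"),
]
graph = build_event_graph(pairs)
```

Deduction then amounts to walking the highest-weight outgoing edges from an observed event to its most likely consequences.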
Lung cancer computed tomography image classification using Attention based Capsule Network with dispersed dynamic routing
IF 3.0 | Q4 | Computer Science
Expert Systems | Pub Date: 2024-05-08 | DOI: 10.1111/exsy.13607
Ramya Paramasivam, Sujata N. Patil, Srinivas Konda, K. L. Hemalatha
Abstract: Lung cancer remains one of the leading causes of cancer-related deaths, so effective diagnosis is a crucial step in saving patients. Moreover, diagnosis must reflect the severity of the disease, which can be addressed with an optimal classification approach. This research therefore introduces an Attention-based Capsule Network (A-Caps Net) with dispersed dynamic routing to perform in-depth classification of the disease-affected regions of an image, yielding better classification results. The attention layer with dispersed dynamic routing computes the digit capsules from the feature vector in a consistent manner. In the first stage, data are acquired from the Lung Nodule Analysis-16 (LUNA-16), The Cancer Imaging Archive (TCIA), and Lung Image Database Consortium and Image Database Resource Initiative (LIDC-IDRI) datasets. After acquisition, pre-processing enhances image resolution using a Generative Adversarial Network. The pre-processed output then undergoes feature extraction using GLCM and VGG-16, which extract low-level and high-level features respectively. Finally, lung cancer is categorized as benign or malignant using the A-Caps Net with dispersed dynamic routing. Experimental analysis shows that the proposed approach attained accuracies of 99.57%, 99.91%, and 99.29% on the LUNA-16, LIDC-IDRI, and TCIA datasets respectively. The 99.57% accuracy achieved on LUNA-16 is higher than that of DBN, 3D CNN, Squeeze Nodule Net, and 3D-DCNN with multi-layered filter, which reported accuracies of 99.16%, 97.17%, and 94.1% respectively.
Citations: 0
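Dynamic routing between capsules, the mechanism A-Caps Net builds on, can be sketched in a few lines. This shows plain routing-by-agreement for a single digit capsule, not the paper's dispersed, attention-based variant; vector dimensions and iteration count are illustrative.

```python
import math

def squash(v):
    # Capsule nonlinearity: keeps direction, maps length into [0, 1).
    n2 = sum(x * x for x in v)
    scale = n2 / (1.0 + n2) / math.sqrt(n2 + 1e-9)
    return [scale * x for x in v]

def route(predictions, iterations=3):
    """predictions: u_hat vectors from each lower capsule to one digit capsule."""
    logits = [0.0] * len(predictions)
    v = []
    for _ in range(iterations):
        m = max(logits)
        exp = [math.exp(b - m) for b in logits]
        c = [e / sum(exp) for e in exp]            # coupling coefficients
        s = [sum(ci * u[d] for ci, u in zip(c, predictions))
             for d in range(len(predictions[0]))]
        v = squash(s)
        # agreement u . v raises the logit of capsules that predicted v
        logits = [b + sum(ud * vd for ud, vd in zip(u, v))
                  for b, u in zip(logits, predictions)]
    return v

v = route([[1.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
```

After a few iterations the two agreeing lower capsules dominate the coupling, so the output vector leans toward their shared prediction.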
Doc-KG: Unstructured documents to knowledge graph construction, identification and validation with Wikidata
IF 3.0 | Q4 | Computer Science
Expert Systems | Pub Date: 2024-05-08 | DOI: 10.1111/exsy.13617
Muhammad Salman, Armin Haller, Sergio J. Rodríguez Méndez, Usman Naseem
Abstract: The exponential growth of textual data in the digital era underlines the pivotal role of Knowledge Graphs (KGs) in effectively storing, managing, and utilizing this vast reservoir of information. Despite the copious amounts of text available on the web, a significant portion remains unstructured, presenting a substantial barrier to the automatic construction and enrichment of KGs. To address this issue, we introduce an enhanced Doc-KG model, a sophisticated approach designed to transform unstructured documents into structured knowledge by generating local KGs and mapping these to a target KG, such as Wikidata. Our model innovatively leverages syntactic information to extract entities and predicates efficiently, integrating them into triples with improved accuracy. Furthermore, the Doc-KG model surpasses existing methodologies by utilizing advanced algorithms both for the extraction of triples and for their subsequent identification within Wikidata, employing Wikidata's Unified Resource Identifiers for precise mapping. This dual capability not only facilitates the construction of KGs directly from unstructured texts but also enhances the identification of triple mentions within Wikidata, marking a significant advancement in the domain. Our comprehensive evaluation, conducted using the renowned WebNLG benchmark dataset, reveals the Doc-KG model's superior performance in triple extraction tasks, achieving an accuracy of 86.64%. In the triple identification task, the model mapped 61.35% of the local KG to Wikidata, thereby contributing 38.65% of novel information for KG enrichment. A qualitative analysis based on a manually annotated dataset further confirms the model's strength, outperforming baseline methods in extracting high-fidelity triples. This research contributes to the field of knowledge extraction and management, offering a robust framework for the semantic structuring of unstructured data and paving the way for the next generation of KGs.
Citations: 0
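The identification step, mapping locally extracted triples onto Wikidata URIs and separating out novel facts for enrichment, can be illustrated with a toy lookup. The tables below hold a couple of real Wikidata identifiers but are invented stand-ins for a proper entity-linking service.

```python
# Tiny stand-in lookup tables (real Wikidata IDs, hypothetical coverage).
WIKIDATA_QIDS = {"Canberra": "Q3114", "Australia": "Q408"}
WIKIDATA_PIDS = {"capital of": "P1376"}

def identify(triples):
    """Split local triples into Wikidata-mapped ones and novel candidates."""
    mapped, novel = [], []
    for s, p, o in triples:
        if s in WIKIDATA_QIDS and p in WIKIDATA_PIDS and o in WIKIDATA_QIDS:
            mapped.append((WIKIDATA_QIDS[s], WIKIDATA_PIDS[p], WIKIDATA_QIDS[o]))
        else:
            novel.append((s, p, o))   # candidate new knowledge for enrichment
    return mapped, novel

triples = [
    ("Canberra", "capital of", "Australia"),
    ("Canberra", "hosts", "ANU"),      # not resolvable: stays novel
]
mapped, novel = identify(triples)
```

The mapped/novel split mirrors the paper's reported 61.35% identification versus 38.65% enrichment proportions at dataset scale.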
Nonlinear dynamical system approximation and adaptive control based on hybrid feed-forward recurrent neural network: Simulation and stability analysis
IF 3.0 | Q4 | Computer Science
Expert Systems | Pub Date: 2024-05-05 | DOI: 10.1111/exsy.13619
R. Shobana, Rajesh Kumar, Bhavnesh Jaint
Abstract: We propose an online identification and adaptive control framework for nonlinear dynamical systems using a novel hybrid feed-forward recurrent neural network (HFRNN) model. The HFRNN combines a feed-forward neural network (FFNN) with a local recurrent neural network (LRNN). We aim to leverage the simplicity of the FFNN and the effectiveness of the RNN to capture changing dynamics accurately and to design an indirect adaptive control scheme. The weight update equations are derived using the gradient-descent-based back-propagation (BP) technique, and the stability of the proposed learning strategy is proven using Lyapunov stability principles. In simulation examples, the proposed method is compared with a Jordan network-based controller (JNC) and a local recurrent network-based controller (LRNC). The results demonstrate that our approach performs satisfactorily, even in the presence of disturbance signals.
Citations: 0
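The hybrid structure pairs a feed-forward path with locally recurrent hidden units, each feeding back only its own previous activation. A minimal forward pass with hand-picked weights is sketched below; the paper's BP training and Lyapunov analysis are not reproduced, and all weight values are illustrative.

```python
import math

def hfrnn_step(x, h_prev, w_in, w_rec, w_out):
    # Local recurrence: hidden unit i sees only its own past activation
    # h_prev[i], unlike a fully recurrent layer.
    h = [math.tanh(wi * x + wr * hp)
         for wi, wr, hp in zip(w_in, w_rec, h_prev)]
    y = sum(wo * hi for wo, hi in zip(w_out, h))   # feed-forward readout
    return y, h

w_in, w_rec, w_out = [0.8, -0.5], [0.3, 0.2], [1.0, 0.5]
h = [0.0, 0.0]
outputs = []
for x in [1.0, 0.5, -1.0]:      # a short input sequence
    y, h = hfrnn_step(x, h, w_in, w_rec, w_out)
    outputs.append(y)
```

Because tanh bounds each hidden activation, the output magnitude is bounded by the L1 norm of the readout weights, which is convenient for the kind of stability argument the paper makes.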
Deep learning-based aggregate analysis to identify cut-off points for decision-making in pancreatic cancer detection
IF 3.3 | Q4 | Computer Science
Expert Systems | Pub Date: 2024-04-26 | DOI: 10.1111/exsy.13614
Gintautas Dzemyda, Olga Kurasova, Viktor Medvedev, Aušra Šubonienė, Aistė Gulla, Artūras Samuilis, Džiugas Jagminas, Kęstutis Strupas
Abstract: This study addresses the detection of pancreatic cancer by classifying computed tomography (CT) images into cancerous and non-cancerous classes using the proposed deep learning-based aggregate analysis framework. Applying deep learning, a branch of machine learning and artificial intelligence, to specific medical challenges can lead to earlier detection of disease, accelerating timely and effective intervention. Classification here rests on selecting an optimal cut-off point, which is used as a threshold for evaluating the model's outputs. The choice of this point is key to the efficient evaluation of classification results and directly affects diagnostic accuracy. A significant aspect of this research is the incorporation of private CT images from Vilnius University Hospital Santaros Klinikos, combined with publicly available datasets. To investigate the capabilities of the framework and to maximize pancreatic cancer diagnostic performance, experimental studies were carried out combining data from different sources. Classification accuracy metrics such as the Youden index, the (0, 1)-criterion, the Matthews correlation coefficient, the F1 score, LR+, LR−, balanced accuracy, and the g-mean were used to find the optimal cut-off point that balances sensitivity and specificity. By carefully analyzing and comparing the results, we aim to develop a reliable system that not only improves the accuracy of pancreatic cancer detection but also has wider application in the early diagnosis of other malignancies.
Citations: 0
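Of the listed cut-off criteria, the Youden index is the simplest to demonstrate: scan candidate thresholds and keep the one maximising J = sensitivity + specificity - 1. The scores and labels below are toy data, not the study's CT model outputs.

```python
def youden_cutoff(scores, labels):
    """Return the threshold maximising J = sensitivity + specificity - 1."""
    best_t, best_j = None, -1.0
    for t in sorted(set(scores)):
        tp = sum(1 for s, y in zip(scores, labels) if s >= t and y == 1)
        fn = sum(1 for s, y in zip(scores, labels) if s < t and y == 1)
        tn = sum(1 for s, y in zip(scores, labels) if s < t and y == 0)
        fp = sum(1 for s, y in zip(scores, labels) if s >= t and y == 0)
        sens = tp / (tp + fn) if tp + fn else 0.0
        spec = tn / (tn + fp) if tn + fp else 0.0
        j = sens + spec - 1.0
        if j > best_j:
            best_t, best_j = t, j
    return best_t, best_j

scores = [0.1, 0.2, 0.35, 0.4, 0.6, 0.7, 0.8, 0.9]   # model probabilities
labels = [0,   0,   0,    1,   0,   1,   1,   1]     # ground truth
t, j = youden_cutoff(scores, labels)
```

The other criteria in the abstract (F1, MCC, g-mean, and so on) fit the same scan-and-maximise loop with a different per-threshold formula.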
Prediction of Liaoning province steel import and export trade based on deep learning models
IF 3.3 | Q4 | Computer Science
Expert Systems | Pub Date: 2024-04-26 | DOI: 10.1111/exsy.13615
Limin Zhang
Abstract: In deep learning, time series forecasting, particularly of economic and trade data, is a critical area of research. This study introduces a hybrid of the autoregressive integrated moving average and gated recurrent unit models (ARIMA-GRU) to improve the prediction of steel import and export trade in Liaoning Province, addressing the limitations of traditional time series methods. Traditional models like ARIMA excel with linear data but often struggle with non-linear patterns and long-term dependencies. The ARIMA-GRU model combines ARIMA's linear analysis with the GRU's proficiency in non-linear pattern recognition, effectively capturing the complex dynamics of economic datasets. Our experiments show that this hybrid approach surpasses traditional models in accuracy and reliability for forecasting steel trade, providing valuable insights for economic planning and strategic decision-making. This approach not only advances economic forecasting but also demonstrates the potential of integrating deep learning techniques into complex data analysis.
Citations: 0
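The hybrid's core idea, a linear model for the trend plus a non-linear model for its residuals, can be shown in miniature. Here an AR(1) least-squares fit stands in for ARIMA, and a trivial last-residual carry stands in for the GRU; both substitutions are assumptions for illustration only.

```python
def fit_ar1(series):
    """Least-squares fit of y[t] = c + phi * y[t-1]."""
    xs, ys = series[:-1], series[1:]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    var = sum((a - mx) ** 2 for a in xs)
    phi = cov / var if var else 0.0
    return phi, my - phi * mx

def hybrid_forecast(series):
    # Linear part: one-step-ahead AR(1) prediction (ARIMA stand-in).
    phi, c = fit_ar1(series)
    linear = c + phi * series[-1]
    # Non-linear part: model the residuals the linear fit leaves behind.
    residuals = [y - (c + phi * x) for x, y in zip(series[:-1], series[1:])]
    nonlinear = residuals[-1]        # GRU placeholder: carry last residual
    return linear + nonlinear

forecast = hybrid_forecast([1.0, 2.0, 3.0, 4.0, 5.0])
```

On a purely linear series the residuals vanish and the hybrid reduces to the linear forecast, which is exactly the behaviour that motivates delegating only the residual structure to the GRU.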
Modelling of healthcare data analytics using optimal machine learning model in big data environment
IF 3.3 | Q4 | Computer Science
Expert Systems | Pub Date: 2024-04-26 | DOI: 10.1111/exsy.13612
Chelladurai Fancy, Nagappan Krishnaraj, K. Ishwarya, G. Raja, Shyamala Chandrasekaran
Abstract: Recent advances in wireless networking and big data technologies, namely Internet of Things (IoT) 5G networks, healthcare big data analytics, and other technologies in artificial intelligence (AI) and wearables, have supported the progress of intelligent disease diagnosis methods. Medical data covers all patient data, such as pharmacy texts, electronic health records (EHR), prescriptions, study data from medical journals, clinical photographs, and diagnostic reports. Big data is now central to the healthcare sector, with valuable datasets that are too complex, voluminous, and fast-changing for healthcare providers to interpret and compute with prevailing tools. This study combines deep learning (DL) and big data analytics in the medical field. This article develops a new healthcare data analytics approach using an optimal machine learning model in a big data environment (HDAOML-BDE). The presented HDAOML-BDE technique mainly aims to examine healthcare data for disease detection and classification in the big data environment. For handling big data, the HDAOML-BDE technique uses the Hadoop MapReduce environment. In addition, it uses a manta ray foraging optimization-based feature selection (MRFO-FS) technique to reduce high-dimensionality problems. Moreover, the HDAOML-BDE method uses a relevance vector machine (RVM) model for classification, and the arithmetic optimization algorithm (AOA) is utilized for parameter tuning of the RVM classifier. The simulation results of the HDAOML-BDE technique are tested on a healthcare dataset, and the outcomes show the improved performance of the HDAOML-BDE strategy over recent approaches on several measures.
Citations: 0
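The pipeline pairs metaheuristic feature selection (MRFO-FS) with an RVM classifier to fight high dimensionality. As a sketch, the code below shows only the wrapper objective such a search optimises, rewarding subsets that cover the informative features while penalising subset size; the toy space is small enough to enumerate exhaustively, whereas MRFO matters precisely when it is not. The informative-feature set and penalty weight are invented for illustration.

```python
from itertools import combinations

INFORMATIVE = {0, 2}   # hypothetical "truly useful" feature indices

def subset_score(subset):
    # Accuracy proxy (coverage of informative features) minus a size penalty.
    coverage = len(INFORMATIVE & subset) / len(INFORMATIVE)
    return coverage - 0.05 * len(subset)

def best_subset(n_features=6):
    # Exhaustive enumeration of all 2^n subsets; a metaheuristic like MRFO
    # replaces this loop when n_features makes enumeration infeasible.
    candidates = (frozenset(c)
                  for r in range(n_features + 1)
                  for c in combinations(range(n_features), r))
    return max(candidates, key=subset_score)

selected = best_subset()
```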
Optimization on selecting XGBoost hyperparameters using meta-learning
IF 3.0 | Q4 | Computer Science
Expert Systems | Pub Date: 2024-04-25 | DOI: 10.1111/exsy.13611
Tiago Lima Marinho, Diego Carvalho do Nascimento, Bruno Almeida Pimentel
Abstract: With the evolution of computing, machine learning algorithms have grown in number and become more complex and robust. A major challenge is finding faster and more practical ways to set the hyperparameters of each algorithm individually. This article uses meta-learning as a practical solution for recommending hyperparameters from similar datasets, identified through their meta-feature structures, rather than re-tuning XGBoost from scratch for each new database. This reduces computational cost, making real-time decision-making feasible and avoiding extra costs when new information arrives. Experimental results on 198 datasets attest to the success of applying this meta-learning heuristic to compare dataset structures. First, the datasets were characterized by combining three groups of meta-features (general, statistical, and information-theoretic), providing a way to compare the similarity between datasets and thus to apply meta-learning to recommend hyperparameters. The appropriate number of similar datasets for tuning XGBoost was then tested. The results were promising, showing improved XGBoost accuracy for k = 4-6 neighbours when using the average of their hyperparameter values; compared with the default grid-search hyperparameter set, the meta-learning methodology performed better on 78.28% of the datasets. This study therefore shows that meta-learning is a competitive alternative for generalizing the XGBoost model, yielding better statistical performance (e.g., accuracy) than adjusting to a single, particular model.
Citations: 0
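The recommendation step the abstract describes can be sketched directly: describe each known dataset by its meta-features, find the k nearest to a new dataset, and average their tuned hyperparameters. The meta-features chosen and the stored hyperparameter values below are invented for illustration, not the article's 198-dataset meta-base.

```python
import math

KNOWN = [
    # (meta-features: [n_rows, n_cols, class_entropy], tuned hyperparameters)
    ([1000, 10, 0.9], {"eta": 0.10, "max_depth": 4}),
    ([1200, 12, 0.8], {"eta": 0.05, "max_depth": 6}),
    ([50000, 300, 0.3], {"eta": 0.30, "max_depth": 10}),
]

def recommend(meta, k=2):
    """Average the hyperparameters of the k most similar known datasets."""
    nearest = sorted(KNOWN, key=lambda kv: math.dist(kv[0], meta))[:k]
    keys = nearest[0][1].keys()
    return {key: sum(hp[key] for _, hp in nearest) / k for key in keys}

# A new, unseen dataset described only by its meta-features.
params = recommend([1100, 11, 0.85])
```

In practice the meta-features would be normalised before computing distances, since raw row counts otherwise dominate the similarity; that normalisation is omitted here for brevity.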