{"title":"EKLI-Attention: An integrated attention mechanism for classifying citizen requests in government‒citizen interactions","authors":"Junpeng Zhang , Qian Geng , Jian Jin","doi":"10.1016/j.ipm.2025.104237","DOIUrl":null,"url":null,"abstract":"<div><div>Various regions in China mainland have implemented government‒citizen interaction boards on their government portals. Government staff assign citizen requests from these boards to departments for responses. With increasing request volume and departmental complexity, manual classification is excessively time-consuming and labor-intensive. The study of automatic classification for citizen requests has become more essential. Citizen requests contain governmental terms and limited text content, making them a typical example of short texts. In this study, an integrated attention mechanism model named EKLI-Attention (external knowledge and label information) is proposed to classify citizen requests by introducing external knowledge, such as relevant government matters and administrative region information, with labels corresponding to government departments. Particularly, a single-head cross-attention mechanism is designed to integrate text features with label information and generate an updated label feature representation, whereas a multihead self-attention mechanism is employed to integrate external knowledge to generate an updated text representation. Finally, multi-head cross-attention and two-stage convolution combine the updated label and text representations to generate the final classification. In the case study, two datasets containing over 84,000 citizen requests from Beijing and Shenzhen are investigated. The models are found to outperform the baseline models in various evaluation metrics, demonstrating their effectiveness and robustness. The application of Focal Loss improves the macro F1 score by 3.47 % and 4.04 % on the two datasets. It improves the efficiency of government agencies by ensuring that requests are routed to the correct departments efficiently. Moreover, it provides a valuable technical reference for short text classification.</div></div>","PeriodicalId":50365,"journal":{"name":"Information Processing & Management","volume":"62 6","pages":"Article 104237"},"PeriodicalIF":6.9000,"publicationDate":"2025-06-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Information Processing & Management","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0306457325001785","RegionNum":1,"RegionCategory":"管理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, INFORMATION SYSTEMS","Score":null,"Total":0}
Citations: 0
Abstract
Various regions in mainland China have implemented government‒citizen interaction boards on their government portals. Government staff assign citizen requests from these boards to departments for responses. With increasing request volume and departmental complexity, manual classification is excessively time-consuming and labor-intensive, making the automatic classification of citizen requests increasingly essential. Citizen requests contain governmental terms and limited text content, making them a typical example of short texts. In this study, an integrated attention mechanism model named EKLI-Attention (external knowledge and label information) is proposed to classify citizen requests by introducing external knowledge, such as relevant government matters and administrative region information, together with labels corresponding to government departments. In particular, a single-head cross-attention mechanism is designed to integrate text features with label information and generate an updated label representation, whereas a multi-head self-attention mechanism is employed to integrate external knowledge and generate an updated text representation. Finally, multi-head cross-attention and a two-stage convolution combine the updated label and text representations to produce the final classification. In the case study, two datasets containing over 84,000 citizen requests from Beijing and Shenzhen are investigated. The proposed model outperforms the baseline models on various evaluation metrics, demonstrating its effectiveness and robustness. The application of Focal Loss improves the macro F1 score by 3.47% and 4.04% on the two datasets. By ensuring that requests are routed to the correct departments, the approach improves the efficiency of government agencies and provides a valuable technical reference for short text classification.
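The abstract describes the architecture only at a high level: single-head cross-attention from labels to text, multi-head self-attention over text enriched with external knowledge, and multi-head cross-attention plus a two-stage convolution before classification, trained with Focal Loss. The following is a minimal, hypothetical PyTorch sketch of how such a pipeline could be wired together. All names and hyperparameters (EKLISketch, d_model, n_heads, the pooling choice, etc.) are assumptions for illustration, not the authors' implementation.

```python
# Hypothetical sketch of an EKLI-Attention-style pipeline, not the authors' code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class EKLISketch(nn.Module):
    def __init__(self, d_model=256, n_heads=8, n_labels=50):
        super().__init__()
        # Single-head cross-attention: label embeddings attend to text features.
        self.label_cross_attn = nn.MultiheadAttention(d_model, 1, batch_first=True)
        # Multi-head self-attention over text tokens plus external-knowledge tokens.
        self.knowledge_self_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        # Multi-head cross-attention combining updated label and text representations.
        self.fusion_cross_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        # Two convolution stages standing in for the paper's two-stage convolution.
        self.conv = nn.Sequential(
            nn.Conv1d(d_model, d_model, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv1d(d_model, d_model, kernel_size=3, padding=1), nn.ReLU(),
        )
        self.classifier = nn.Linear(d_model, n_labels)

    def forward(self, text_emb, knowledge_emb, label_emb):
        # text_emb: (B, T, d); knowledge_emb: (B, K, d); label_emb: (B, L, d)
        # 1) Updated label representation via single-head cross-attention over text.
        label_upd, _ = self.label_cross_attn(label_emb, text_emb, text_emb)
        # 2) Updated text representation enriched with external knowledge.
        mixed = torch.cat([text_emb, knowledge_emb], dim=1)
        text_upd, _ = self.knowledge_self_attn(mixed, mixed, mixed)
        text_upd = text_upd[:, : text_emb.size(1), :]
        # 3) Fuse both representations, convolve, pool, and classify.
        fused, _ = self.fusion_cross_attn(label_upd, text_upd, text_upd)
        pooled = self.conv(fused.transpose(1, 2)).mean(dim=-1)
        return self.classifier(pooled)  # (B, n_labels) logits


def focal_loss(logits, targets, gamma=2.0):
    # Standard multi-class Focal Loss: down-weights well-classified examples.
    ce = F.cross_entropy(logits, targets, reduction="none")
    pt = torch.exp(-ce)
    return ((1.0 - pt) ** gamma * ce).mean()
```

As a usage illustration, feeding random embeddings of shape (4, 64, 256) for text, (4, 8, 256) for external knowledge, and (4, 50, 256) for labels yields a (4, 50) logit tensor that can be passed to focal_loss together with department indices.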
Journal introduction:
Information Processing and Management is dedicated to publishing cutting-edge original research at the convergence of computing and information science. Our scope encompasses theory, methods, and applications across various domains, including advertising, business, health, information science, information technology, marketing, and social computing.
We aim to cater to the interests of both primary researchers and practitioners by offering an effective platform for the timely dissemination of advanced and topical issues in this interdisciplinary field. The journal places particular emphasis on original research articles, research survey articles, research method articles, and articles addressing critical applications of research. Join us in advancing knowledge and innovation at the intersection of computing and information science.