Improving LLM-based opinion expression identification with dependency syntax
Qiujing Xu, Peiming Guo, Fei Li, Meishan Zhang, Donghong Ji
Pattern Recognition Letters, Volume 197, Pages 81-87 (published 2025-07-24)
DOI: 10.1016/j.patrec.2025.07.012
Full text: https://www.sciencedirect.com/science/article/pii/S0167865525002648
Citations: 0
Abstract
Opinion expression identification (OEI), a crucial task in fine-grained opinion mining, has received sustained attention for decades. Recently, large language models (LLMs) have demonstrated substantial potential for this task. However, structure-aware syntax features, which have proven highly effective in encoder-based OEI models, remain difficult to exploit under the LLM paradigm. In this work, we introduce a novel approach that enhances LLM-based OEI with the aid of dependency syntax. We start with a well-formed prompt learning framework for OEI and then enrich the prompting text with syntax information from an off-the-shelf dependency parser. To mitigate the negative impact of irrelevant dependency structures, we employ a BERT-based CRF model as a retriever that selects only salient dependencies. Experiments on three benchmark datasets covering English, Chinese and Portuguese show that our method is highly effective, yielding significant improvements on all datasets. We also provide detailed analysis to understand our method in depth.
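To make the pipeline described above concrete, the sketch below shows one way a prompt for an LLM could be enriched with dependency arcs from an off-the-shelf parser. This is a minimal illustration under assumptions, not the authors' implementation: spaCy is used as a stand-in parser, the prompt wording is invented, and the simple keep-all-arcs filter merely marks where the paper's BERT-based CRF retriever would select only salient dependencies.

```python
import spacy

# Off-the-shelf dependency parser (assumed here; the paper does not specify spaCy).
nlp = spacy.load("en_core_web_sm")

def build_syntax_prompt(sentence: str) -> str:
    """Build an OEI prompt whose text is enriched with dependency triples."""
    doc = nlp(sentence)
    # Collect (dependent, relation, head) triples from the parse.
    triples = [
        f"({tok.text}, {tok.dep_}, {tok.head.text})"
        for tok in doc
        if tok.dep_ != "punct"
    ]
    # Placeholder: the paper filters these arcs with a BERT-based CRF retriever;
    # this sketch simply keeps every non-punctuation arc.
    return (
        "Identify all opinion expressions in the sentence.\n"
        f"Sentence: {sentence}\n"
        f"Dependency arcs: {'; '.join(triples)}\n"
        "Opinion expressions:"
    )

print(build_syntax_prompt("The service was absolutely wonderful."))
```

The enriched prompt would then be fed to the LLM within the paper's prompt learning framework; only the idea of appending selected dependency arcs to the prompting text is taken from the abstract.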
Journal description:
Pattern Recognition Letters aims at rapid publication of concise articles of broad interest in pattern recognition.
Subject areas include all the current fields of interest represented by the Technical Committees of the International Association of Pattern Recognition, and other developing themes involving learning and recognition.