{"title":"基于方面的情感分析快速微调大型语言模型","authors":"Chaelyn Lee, Jaesung Lee","doi":"10.1049/ell2.70411","DOIUrl":null,"url":null,"abstract":"<p>The method proposed in this study aims to reduce the execution time required for fine-tuning large language models in aspect-based sentiment analysis. To achieve efficient fine-tuning, the large-language model parameter tuning for new data is accelerated through rank decomposition. Experiments on the SemEval datasets demonstrated that our method consistently outperformed strong baselines such as GPT-ABSA and BART-ABSA across multiple metrics including accuracy, F1-score, precision, and recall while also reducing fine-tuning time by approximately 35%. The experimental results demonstrate a notable decrease in execution time with the proposed approach of the fine-tuning process while preserving the accuracy of polarity prediction.</p>","PeriodicalId":11556,"journal":{"name":"Electronics Letters","volume":"61 1","pages":""},"PeriodicalIF":0.8000,"publicationDate":"2025-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://ietresearch.onlinelibrary.wiley.com/doi/epdf/10.1049/ell2.70411","citationCount":"0","resultStr":"{\"title\":\"Fast Fine-Tuning Large Language Models for Aspect-Based Sentiment Analysis\",\"authors\":\"Chaelyn Lee, Jaesung Lee\",\"doi\":\"10.1049/ell2.70411\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p>The method proposed in this study aims to reduce the execution time required for fine-tuning large language models in aspect-based sentiment analysis. To achieve efficient fine-tuning, the large-language model parameter tuning for new data is accelerated through rank decomposition. Experiments on the SemEval datasets demonstrated that our method consistently outperformed strong baselines such as GPT-ABSA and BART-ABSA across multiple metrics including accuracy, F1-score, precision, and recall while also reducing fine-tuning time by approximately 35%. The experimental results demonstrate a notable decrease in execution time with the proposed approach of the fine-tuning process while preserving the accuracy of polarity prediction.</p>\",\"PeriodicalId\":11556,\"journal\":{\"name\":\"Electronics Letters\",\"volume\":\"61 1\",\"pages\":\"\"},\"PeriodicalIF\":0.8000,\"publicationDate\":\"2025-09-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://ietresearch.onlinelibrary.wiley.com/doi/epdf/10.1049/ell2.70411\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Electronics Letters\",\"FirstCategoryId\":\"5\",\"ListUrlMain\":\"https://ietresearch.onlinelibrary.wiley.com/doi/10.1049/ell2.70411\",\"RegionNum\":4,\"RegionCategory\":\"工程技术\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q4\",\"JCRName\":\"ENGINEERING, ELECTRICAL & ELECTRONIC\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Electronics Letters","FirstCategoryId":"5","ListUrlMain":"https://ietresearch.onlinelibrary.wiley.com/doi/10.1049/ell2.70411","RegionNum":4,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q4","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
Fast Fine-Tuning Large Language Models for Aspect-Based Sentiment Analysis
The method proposed in this study aims to reduce the execution time required for fine-tuning large language models in aspect-based sentiment analysis. To achieve efficient fine-tuning, parameter tuning of the large language model for new data is accelerated through rank decomposition. Experiments on the SemEval datasets demonstrated that the method consistently outperformed strong baselines such as GPT-ABSA and BART-ABSA across multiple metrics, including accuracy, F1-score, precision, and recall, while also reducing fine-tuning time by approximately 35%. The experimental results demonstrate a notable decrease in the execution time of the fine-tuning process with the proposed approach while preserving the accuracy of polarity prediction.
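The abstract gives no implementation details, but accelerating parameter tuning "through rank decomposition" is in the spirit of low-rank adaptation: rather than updating a full pretrained weight matrix W, one freezes W and trains only a low-rank correction BA, so each optimization step touches far fewer parameters. The sketch below is purely illustrative and not taken from the paper; the class name, rank r, and scaling factor alpha are assumptions chosen to show the general idea.

```python
import torch
import torch.nn as nn


class LowRankAdapter(nn.Module):
    """Illustrative LoRA-style wrapper (a sketch, not the paper's method).

    The effective weight is W + (alpha / r) * B @ A, where the pretrained
    W stays frozen and only the small factors A and B are trained.
    """

    def __init__(self, base: nn.Linear, r: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad = False  # freeze the pretrained weights
        # A is small random, B is zero, so the adapter starts as a no-op.
        self.A = nn.Parameter(torch.randn(r, base.in_features) * 0.01)
        self.B = nn.Parameter(torch.zeros(base.out_features, r))
        self.scale = alpha / r

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # base(x) uses the frozen weights; the low-rank path adds the update.
        return self.base(x) + self.scale * (x @ self.A.T) @ self.B.T


# Hypothetical usage: adapt one 768x768 projection of a sentiment model.
layer = nn.Linear(768, 768)
adapted = LowRankAdapter(layer, r=8)
x = torch.randn(4, 768)
print(adapted(x).shape)  # torch.Size([4, 768])
```

With r = 8 the trainable parameter count drops from 768 x 768 to 2 x 8 x 768 per adapted matrix, which is the kind of saving consistent with (though not necessarily the source of) the roughly 35% reduction in fine-tuning time reported above.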
Journal Introduction:
Electronics Letters is an internationally renowned peer-reviewed rapid-communication journal that publishes short original research papers every two weeks. Its broad and interdisciplinary scope covers the latest developments in all fields related to electronic engineering, including communication, biomedical, optical and device technologies. Electronics Letters also provides further insight into some of the latest developments through special features and interviews.
Scope
As a journal at the forefront of its field, Electronics Letters publishes papers covering all themes of electronic and electrical engineering. The major themes of the journal are listed below.
Antennas and Propagation
Biomedical and Bioinspired Technologies, Signal Processing and Applications
Control Engineering
Electromagnetism: Theory, Materials and Devices
Electronic Circuits and Systems
Image, Video and Vision Processing and Applications
Information, Computing and Communications
Instrumentation and Measurement
Microwave Technology
Optical Communications
Photonics and Opto-Electronics
Power Electronics, Energy and Sustainability
Radar, Sonar and Navigation
Semiconductor Technology
Signal Processing