Efficient classification of remote sensing images using DF-DNLSTM: a deep feature densenet bidirectional long short term memory model

Impact Factor: 1.6 | JCR Quartile: Q2 (Engineering, Multidisciplinary)
Monika Kumari, Ajay Kaul
{"title":"Efficient classification of remote sensing images using DF-DNLSTM: a deep feature densenet bidirectional long short term memory model","authors":"Monika Kumari, Ajay Kaul","doi":"10.1007/s13198-024-02466-w","DOIUrl":null,"url":null,"abstract":"<p>Scene classification in remote sensing is challenging due to high inter-class similarity and low intra-class similarity. Numerous techniques have been introduced, but accurately classifying scenes remains arduous. To address this challenge, To address this, we propose a hybrid framework, DF-DNLSTM, integrating DenseNet-121 for feature extraction and BiLSTM for sequential modeling, enhancing accuracy and contextual understanding. Second, a Conditional Generative Adversarial Network (CGAN) is employed for data augmentation, improving training data quantity and quality. Finally, the study introduces SwarmHawk, a hybrid optimization algorithm that combines particle swarm optimization (PSO) and Harris hawk optimization (HHO). SwarmHawk ensures the selection of informative features while concurrently eliminating duplicates and redundancies. It also reduces computational time to 4863 s. The proposed DF-DNLSTM model is rigorously assessed on three public datasets-UCM, AID, and NWPU. Results demonstrate its superior efficacy, achieving 99.87% accuracy on UCM, equivalent accuracy on NWPU, and sustaining 98.57% accuracy on AID. This study establishes DF-DNLSTM’s effectiveness, highlighting its potential contributions to advancing remote sensing scene classification.</p>","PeriodicalId":14463,"journal":{"name":"International Journal of System Assurance Engineering and Management","volume":"42 1","pages":""},"PeriodicalIF":1.6000,"publicationDate":"2024-08-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Journal of System Assurance Engineering and Management","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1007/s13198-024-02466-w","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"ENGINEERING, MULTIDISCIPLINARY","Score":null,"Total":0}
Citations: 0

Abstract

Scene classification in remote sensing is challenging due to high inter-class similarity and low intra-class similarity. Numerous techniques have been introduced, but accurately classifying scenes remains arduous. To address this challenge, we first propose a hybrid framework, DF-DNLSTM, integrating DenseNet-121 for feature extraction and BiLSTM for sequential modeling, enhancing accuracy and contextual understanding. Second, a Conditional Generative Adversarial Network (CGAN) is employed for data augmentation, improving training data quantity and quality. Finally, the study introduces SwarmHawk, a hybrid optimization algorithm that combines particle swarm optimization (PSO) and Harris hawk optimization (HHO). SwarmHawk selects informative features while eliminating duplicates and redundancies, and reduces computational time to 4863 s. The proposed DF-DNLSTM model is rigorously assessed on three public datasets: UCM, AID, and NWPU. Results demonstrate its superior efficacy, achieving 99.87% accuracy on UCM, equivalent accuracy on NWPU, and 98.57% accuracy on AID. This study establishes DF-DNLSTM's effectiveness, highlighting its potential contributions to advancing remote sensing scene classification.
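The abstract does not give implementation details, so the following is a minimal sketch of how DenseNet-121 features might feed a BiLSTM for scene classification. The class name `DenseNetBiLSTM`, the layer sizes, the use of ImageNet weights, and the treatment of the 7×7 feature grid as a patch sequence are illustrative assumptions rather than the authors' code; the CGAN augmentation and SwarmHawk feature-selection stages are not shown.

```python
# Minimal sketch (not the authors' code) of a DenseNet-121 -> BiLSTM classifier,
# assuming the spatial feature map is read as a sequence of patch vectors.
import torch
import torch.nn as nn
from torchvision import models


class DenseNetBiLSTM(nn.Module):
    def __init__(self, num_classes: int, lstm_hidden: int = 256):
        super().__init__()
        # DenseNet-121 backbone without its classifier head (1024 output channels).
        backbone = models.densenet121(weights="IMAGENET1K_V1")
        self.features = backbone.features
        # Bidirectional LSTM over the grid of 1024-dim feature vectors.
        self.bilstm = nn.LSTM(input_size=1024, hidden_size=lstm_hidden,
                              batch_first=True, bidirectional=True)
        self.classifier = nn.Linear(2 * lstm_hidden, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        f = self.features(x)                   # (B, 1024, 7, 7) for 224x224 input
        seq = f.flatten(2).transpose(1, 2)     # (B, 49, 1024): one vector per grid cell
        out, _ = self.bilstm(seq)              # (B, 49, 2 * lstm_hidden)
        return self.classifier(out[:, -1, :])  # classify from the final time step


if __name__ == "__main__":
    model = DenseNetBiLSTM(num_classes=21)     # e.g. the 21 classes of the UCM dataset
    logits = model(torch.randn(2, 3, 224, 224))
    print(logits.shape)                        # torch.Size([2, 21])
```

In this arrangement the BiLSTM reads the spatial grid row by row, so its final state summarizes the whole scene; the paper may order, pool, or select the features differently.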


Source journal
CiteScore: 4.30
Self-citation rate: 10.00%
Articles published: 252
Journal description: This Journal was established to cater to the increased awareness of high-quality research in the seamless integration of heterogeneous technologies to formulate bankable solutions to emergent complex engineering problems. Assurance engineering can be thought of as the provision of higher confidence in the reliable and secure implementation of a system's critical characteristic features through the espousal of a holistic approach using a wide variety of cross-disciplinary tools and techniques. The successful realization of sustainable and dependable products, systems, and services involves extensive adoption of reliability, quality, safety, and risk-related procedures for achieving high assurance levels of performance; also pivotal are the management issues related to risk and uncertainty that govern the practical constraints encountered in their deployment. It is our intention to provide a platform for the modeling and analysis of large engineering systems, among the other aforementioned allied goals of systems assurance engineering, leading to the enforcement of performance enhancement measures. Achieving a fine balance between theory and practice is the primary focus. The Journal only publishes high-quality papers that have passed the rigorous peer review procedure of an archival scientific Journal. The aim is an increasing number of submissions, wide circulation, and a high impact factor.