Night-Time Traffic Light Recognition Based on Enhancement-Guided Object Detection

IF 8.6 · CAS Tier 1 (Computer Science) · Q1 AUTOMATION & CONTROL SYSTEMS
Zikai Yao;Qiang Liu;Zhangzhen Zhao;Yuliang Qin;Jinglong Zhu;Tianzhi Xia;Bo Li;Lipo Wang
{"title":"Night-Time Traffic Light Recognition Based on Enhancement-Guided Object Detection","authors":"Zikai Yao;Qiang Liu;Zhangzhen Zhao;Yuliang Qin;Jinglong Zhu;Tianzhi Xia;Bo Li;Lipo Wang","doi":"10.1109/TSMC.2025.3552621","DOIUrl":null,"url":null,"abstract":"Traffic light recognition is crucial for autonomous driving. While significant progress has been made in favorable conditions, recognition performance in night-time scenes remains a challenge. One straightforward approach is to apply enhancement methods that improve degraded images prior to object detection. However, since most enhancement methods are tailored for human perception; they may not consistently improve recognition accuracy for machine learning techniques. To address this, we propose an enhancement-guided framework for night-time traffic light recognition, called EG-TLR. EG-TLR consists of a residual denoising module (RDM) and a mixed attention traffic light detection module (MATLDM). The RDM reduces noise in degraded night-time images while preserving essential traffic light features by extracting sparsity information and performing context aggregation. The MATLDM improves feature extraction and recognition performance in complex night-time scenes by incorporating a shadow detection layer (SDL) and a mixed attention module (MAM). Moreover, to address the lack of a dedicated night-time traffic light dataset, we construct the Night-TL dataset utilizing publicly available images. Extensive experiments on Night-TL and LISA datasets demonstrate that EG-TLR achieves an AP50 of 79.83% and an AP50:95 of 36.62%, with an inference speed of 5.9 ms and 16.9 GFLOPs, outperforming other state-of-the-art methods. Furthermore, ablation studies and visualization results validate the effectiveness of our proposed method. The Night-TL dataset can be downloaded from: <uri>https://github.com/feiqinaqian/Night-TL-dataset</uri>.","PeriodicalId":48915,"journal":{"name":"IEEE Transactions on Systems Man Cybernetics-Systems","volume":"55 6","pages":"4410-4422"},"PeriodicalIF":8.6000,"publicationDate":"2025-04-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on Systems Man Cybernetics-Systems","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/10947628/","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"AUTOMATION & CONTROL SYSTEMS","Score":null,"Total":0}
Citations: 0

Abstract

Traffic light recognition is crucial for autonomous driving. While significant progress has been made in favorable conditions, recognition performance in night-time scenes remains a challenge. One straightforward approach is to apply enhancement methods that improve degraded images prior to object detection. However, since most enhancement methods are tailored for human perception, they may not consistently improve recognition accuracy for machine learning techniques. To address this, we propose an enhancement-guided framework for night-time traffic light recognition, called EG-TLR. EG-TLR consists of a residual denoising module (RDM) and a mixed attention traffic light detection module (MATLDM). The RDM reduces noise in degraded night-time images while preserving essential traffic light features by extracting sparsity information and performing context aggregation. The MATLDM improves feature extraction and recognition performance in complex night-time scenes by incorporating a shadow detection layer (SDL) and a mixed attention module (MAM). Moreover, to address the lack of a dedicated night-time traffic light dataset, we construct the Night-TL dataset utilizing publicly available images. Extensive experiments on the Night-TL and LISA datasets demonstrate that EG-TLR achieves an AP50 of 79.83% and an AP50:95 of 36.62%, with an inference time of 5.9 ms and a computational cost of 16.9 GFLOPs, outperforming other state-of-the-art methods. Furthermore, ablation studies and visualization results validate the effectiveness of our proposed method. The Night-TL dataset can be downloaded from: https://github.com/feiqinaqian/Night-TL-dataset.
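EG-TLR as described in the abstract is a two-stage pipeline: the RDM first enhances the degraded night-time image, and the MATLDM then performs detection on the enhanced result. The snippet below is a minimal PyTorch sketch of that wiring only; the module internals (layer widths, the toy attention gates, the stand-in detection head, and the three-class assumption) are illustrative assumptions and do not reproduce the paper's architecture.

```python
# Minimal sketch of an enhancement-guided pipeline in the spirit of EG-TLR:
# a denoising stage feeding a detection head that uses mixed attention.
# All module internals below are illustrative assumptions, not the authors' code.
import torch
import torch.nn as nn


class ResidualDenoisingModule(nn.Module):
    """Toy RDM: estimate a noise residual and subtract it from the input image."""

    def __init__(self, width: int = 32):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, width, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(width, width, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(width, 3, 3, padding=1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Residual formulation: enhanced = input - predicted noise.
        return x - self.body(x)


class MixedAttention(nn.Module):
    """Toy mixed attention: channel gating followed by spatial gating."""

    def __init__(self, channels: int):
        super().__init__()
        self.channel_gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Conv2d(channels, channels, 1), nn.Sigmoid()
        )
        self.spatial_gate = nn.Sequential(
            nn.Conv2d(channels, 1, 7, padding=3), nn.Sigmoid()
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = x * self.channel_gate(x)   # reweight channels
        return x * self.spatial_gate(x)  # reweight spatial locations


class ToyDetectionHead(nn.Module):
    """Stand-in for MATLDM: backbone features -> mixed attention -> per-cell predictions."""

    def __init__(self, num_classes: int = 3, width: int = 64):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, width, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(width, width, 3, stride=2, padding=1), nn.ReLU(inplace=True),
        )
        self.attention = MixedAttention(width)
        # 4 box coordinates + 1 objectness score + class scores per grid cell.
        self.head = nn.Conv2d(width, 5 + num_classes, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.attention(self.backbone(x)))


class EnhancementGuidedDetector(nn.Module):
    """Denoise the night-time image first, then detect on the enhanced result."""

    def __init__(self, num_classes: int = 3):
        super().__init__()
        self.rdm = ResidualDenoisingModule()
        self.detector = ToyDetectionHead(num_classes)

    def forward(self, night_image: torch.Tensor) -> torch.Tensor:
        return self.detector(self.rdm(night_image))


if __name__ == "__main__":
    model = EnhancementGuidedDetector(num_classes=3)  # e.g., red / yellow / green
    preds = model(torch.randn(1, 3, 256, 256))        # dense per-cell predictions
    print(preds.shape)                                # torch.Size([1, 8, 64, 64])
```

The key design point the sketch illustrates is the enhancement-guided ordering: the detector only ever sees the denoised image, so the enhancement stage is optimized for the downstream recognition task rather than for human viewing.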
Source Journal
IEEE Transactions on Systems Man Cybernetics-Systems (AUTOMATION & CONTROL SYSTEMS; COMPUTER SCIENCE, CYBERNETICS)
CiteScore: 18.50
Self-citation rate: 11.50%
Articles published: 812
Review time: 6 months
Journal Description: The IEEE Transactions on Systems, Man, and Cybernetics: Systems encompasses the fields of systems engineering, covering issue formulation, analysis, and modeling throughout the systems engineering lifecycle phases. It addresses decision-making, issue interpretation, systems management, processes, and various methods such as optimization, modeling, and simulation in the development and deployment of large systems.