A Network That Balances Accuracy and Efficiency for Lane Detection

Ce Zhang, Yuhui Han, Dan Wang, Wei Qiao, Yier Lin
DOI: 10.1155/2021/1099434 (https://doi.org/10.1155/2021/1099434)
Journal: Mob. Inf. Syst., 2021, pp. 1099434:1–1099434:5
Published: 2021-12-07
Citations: 1

Abstract

In an automatic lane-keeping system (ALKS), the vehicle must detect the boundaries of its current lane stably and accurately in order to position itself precisely. Deep-learning-based lane detection algorithms have made a large leap in accuracy over traditional methods and recognize curves and occluded markings more reliably. However, it is difficult for mainstream algorithms to balance accuracy against efficiency. To address this, we propose a single-step method that directly outputs the parameters of a lane shape model. The method combines MobileNet v2 with a spatial CNN (SCNN) to build a network that quickly extracts lane features and learns global context information, and then uses deep polynomial regression to output a polynomial representing each lane marking in the image. The proposed method was validated on the TuSimple dataset. Compared with existing algorithms, it achieves a balance between accuracy and efficiency: experiments show that, under the same conditions, its recognition accuracy and detection speed both reach the level of mainstream algorithms while effectively balancing the two.
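The abstract's output representation — one polynomial per lane marking, regressed directly by the network — can be illustrated with a small decoding sketch. The function name, coefficient values, and the cubic degree below are illustrative assumptions (the paper does not specify them here); the sketch only shows how a coefficient vector is turned back into image-space lane points, with the TuSimple convention of marking out-of-image points as invalid (-2).

```python
# Hedged sketch: decoding per-lane polynomial coefficients into image points.
# Assumptions (not from the paper): a cubic polynomial x = c0 + c1*y + c2*y^2
# + c3*y^3 over a normalized vertical coordinate y in [0, 1], and the
# TuSimple-style convention of -2 for points falling outside the image.

def decode_lane(coeffs, y_samples, img_width):
    """Evaluate x = sum(c_k * y**k) at each sampled row y and return
    (x, y) pairs; x values outside [0, img_width) are replaced by -2."""
    points = []
    for y in y_samples:
        x = sum(c * (y ** k) for k, c in enumerate(coeffs))
        points.append((x if 0 <= x < img_width else -2, y))
    return points

# Illustrative coefficients for one lane (made up for this example).
coeffs = [100.0, 50.0, 20.0, -5.0]
ys = [i / 10 for i in range(11)]          # 11 evenly spaced rows, top to bottom
lane = decode_lane(coeffs, ys, img_width=1280)
```

A single-step detector of this kind skips the per-pixel segmentation and curve-fitting post-processing of two-stage pipelines, which is where the efficiency gain the abstract claims comes from: the network's final layer emits a few coefficients per lane rather than a dense mask.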