Deep reinforcement learning-based vehicle energy efficiency autonomous learning system

Xuewei Qi, Yadan Luo, Guoyuan Wu, K. Boriboonsomsin, M. Barth
{"title":"Deep reinforcement learning-based vehicle energy efficiency autonomous learning system","authors":"Xuewei Qi, Yadan Luo, Guoyuan Wu, K. Boriboonsomsin, M. Barth","doi":"10.1109/IVS.2017.7995880","DOIUrl":null,"url":null,"abstract":"To mitigate air pollution problems and reduce greenhouse gas emissions (GHG), plug-in hybrid electric vehicles (PHEV) have been developed to achieve higher fuel efficiency. The Energy Management System (EMS) is a very important component of a PHEV in achieving better fuel economy and it is a very active research area. So far, most of the existing EMS strategies just simple follow predefined rules that are not adaptive to changing driving conditions; other strategies as starting to incorporate accurate prediction of future traffic conditions. In this study, a deep reinforcement learning based PHEV energy management system is designed to autonomously learn the optimal fuel use from its own historical driving record. It is a fully data-driven and learning-enabled model that does not rely on any prediction or predefined rules. The experiment results show that the proposed model is able to achieve 16.3% energy savings comparing to conventional binary control strategies.","PeriodicalId":143367,"journal":{"name":"2017 IEEE Intelligent Vehicles Symposium (IV)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2017-06-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"49","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2017 IEEE Intelligent Vehicles Symposium (IV)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IVS.2017.7995880","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 49

Abstract

To mitigate air pollution problems and reduce greenhouse gas (GHG) emissions, plug-in hybrid electric vehicles (PHEVs) have been developed to achieve higher fuel efficiency. The Energy Management System (EMS) is a key component of a PHEV for achieving better fuel economy, and it is a very active research area. So far, most existing EMS strategies simply follow predefined rules that do not adapt to changing driving conditions; other strategies are starting to incorporate accurate prediction of future traffic conditions. In this study, a deep reinforcement learning based PHEV energy management system is designed to autonomously learn optimal fuel use from the vehicle's own historical driving record. It is a fully data-driven, learning-enabled model that does not rely on any prediction or predefined rules. The experimental results show that the proposed model achieves 16.3% energy savings compared to conventional binary control strategies.
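As a rough illustration of how such a learning-enabled EMS could be structured, the sketch below implements a generic DQN-style agent in PyTorch. It is not the authors' implementation: the state features (battery SOC, vehicle speed, power demand), the binary electric-only vs. engine-assist action space, the network sizes, and all hyperparameters are assumptions made purely for illustration.

```python
# Minimal sketch (assumed design, not the paper's code) of a DQN-style agent
# for PHEV energy management learned from logged driving data.
import random
from collections import deque

import torch
import torch.nn as nn
import torch.optim as optim

STATE_DIM = 3   # assumed features: battery SOC, vehicle speed, power demand
N_ACTIONS = 2   # assumed binary control: 0 = electric only, 1 = engine assist
GAMMA = 0.95    # discount factor (illustrative)

class QNetwork(nn.Module):
    """Small fully connected network approximating Q(state, action)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(STATE_DIM, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
            nn.Linear(64, N_ACTIONS),
        )

    def forward(self, x):
        return self.net(x)

q_net = QNetwork()
optimizer = optim.Adam(q_net.parameters(), lr=1e-3)
replay = deque(maxlen=10_000)   # experience replay over logged driving records

def select_action(state, epsilon=0.1):
    """Epsilon-greedy choice between the two control modes."""
    if random.random() < epsilon:
        return random.randrange(N_ACTIONS)
    with torch.no_grad():
        return int(q_net(torch.tensor(state).float()).argmax())

def train_step(batch_size=32):
    """One gradient step on the temporal-difference error."""
    if len(replay) < batch_size:
        return
    batch = random.sample(replay, batch_size)
    s, a, r, s2 = zip(*batch)
    s = torch.tensor(s).float()
    a = torch.tensor(a).long().unsqueeze(1)
    r = torch.tensor(r).float()
    s2 = torch.tensor(s2).float()
    q = q_net(s).gather(1, a).squeeze(1)
    with torch.no_grad():
        target = r + GAMMA * q_net(s2).max(1).values
    loss = nn.functional.mse_loss(q, target)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

In use, transitions (state, action, reward, next state) extracted from historical trips would be appended to `replay`, with the reward penalizing combined fuel and electricity consumption; the trained policy then replaces a fixed rule-based or binary control strategy.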