Towards a fossil-free urban transport system: An intelligent cross-type transferable energy management framework based on deep transfer reinforcement learning
Introduction
- (1) Even cutting-edge DRL algorithms such as SAC, when applied to energy management, still have inherent drawbacks, including difficult hyperparameter tuning and unstable convergence, which hinders improvements in energy efficiency. Moreover, SAC remains understudied in this context: only a few papers on SAC-based EMSs for FCHEVs have been published. Therefore, the practical value of SAC needs further exploration, and research on SAC-based EMSs remains to be enriched.
- (2) Existing DTRL-based EMSs for NEVs have been developed only for conventional engine-battery HEVs; the transferability and reusability of EMSs for FCHEVs have not yet been investigated. Besides, current DTRL methods for energy management are all designed on the basis of DDPG, which is inferior to cutting-edge DRL algorithms such as SAC. Therefore, superior DRL algorithms are needed to design more intelligent DTRL methods for developing transferable EMSs for FCHEVs.
- (3) Current DTRL-based cross-type EMSs transfer only the DNNs' mature parameters to initialize new DNNs, while ignoring the transfer of experience replay buffers, which contain abundant learned knowledge. As a result, each new energy management task starts with an empty replay buffer, which hinders the training efficiency of the transferred EMSs.
- (4) Most DRL-based EMSs for FCHEVs are trained on standard driving cycles that differ from real driving data. This usually leads to unsatisfactory performance, especially poor adaptability to real-world speed profiles. Since DRL-based EMSs are ultimately intended for online application, and adaptability is a prerequisite for online application, the adaptability of DRL-based EMSs needs to be verified in particular.
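The prioritized experience replay mentioned in gap (3) rests on a SumTree: leaves store per-sample priorities, each parent stores the sum of its children, so sampling proportional to priority takes logarithmic time. The sketch below is a minimal illustration of that standard PER mechanism, not the paper's implementation; the class names and the proportional-priority scheme (|TD error| + ε raised to an exponent α) are common conventions assumed here.

```python
import random

class SumTree:
    """Binary tree whose leaves hold sample priorities and whose
    internal nodes hold the sum of their children's priorities."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.tree = [0.0] * (2 * capacity - 1)  # internal nodes + leaves
        self.write = 0  # next leaf slot to overwrite

    def update(self, leaf, priority):
        idx = leaf + self.capacity - 1
        delta = priority - self.tree[idx]
        self.tree[idx] = priority
        while idx > 0:  # propagate the change up to the root
            idx = (idx - 1) // 2
            self.tree[idx] += delta

    def total(self):
        return self.tree[0]  # root = sum of all priorities

    def sample(self, value):
        """Descend to the leaf whose cumulative-priority interval
        contains `value`; return the leaf's data index."""
        idx = 0
        while 2 * idx + 1 < len(self.tree):
            left = 2 * idx + 1
            if value <= self.tree[left]:
                idx = left
            else:
                value -= self.tree[left]
                idx = left + 1
        return idx - (self.capacity - 1)

class PERBuffer:
    """Prioritized replay: new transitions get maximal priority so they
    are replayed at least once; sampling is proportional to priority^alpha."""
    def __init__(self, capacity, alpha=0.6):
        self.tree = SumTree(capacity)
        self.data = [None] * capacity
        self.alpha = alpha
        self.max_priority = 1.0
        self.size = 0

    def add(self, transition):
        leaf = self.tree.write
        self.data[leaf] = transition
        self.tree.update(leaf, self.max_priority ** self.alpha)
        self.tree.write = (leaf + 1) % self.tree.capacity
        self.size = min(self.size + 1, self.tree.capacity)

    def sample(self):
        value = random.uniform(0.0, self.tree.total())
        leaf = self.tree.sample(value)
        return leaf, self.data[leaf]

    def update_priority(self, leaf, td_error, eps=1e-6):
        p = abs(td_error) + eps
        self.max_priority = max(self.max_priority, p)
        self.tree.update(leaf, p ** self.alpha)
```

Because priorities live in the tree alongside the stored transitions, transferring both structures (as contribution (3) of this paper proposes) preserves not only the experiences but also their learned sampling distribution.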
- (1) An enhanced SAC algorithm, combining SAC with the PER mechanism, is innovatively formulated as a more intelligent DRL method that both accelerates convergence and improves the learning ability of SAC.
- (2) A novel DTRL method is designed by integrating the enhanced SAC algorithm with TL, and a cross-type transferable energy management framework is then proposed on the basis of this DTRL method to shorten the development cycle of DRL-based EMSs for different types of urban FCVs.
- (3) In contrast to previous research that transfers only the mature DNN parameters, this paper transfers not only the DNN parameters but also the PER buffer and its SumTree, so that the learned knowledge is fully reused.
- (4) Both the source domain and the target domain of the proposed transferable energy management framework are trained on large volumes of real-world driving data in stochastic training environments, so as to obtain a robust representation model in the source domain and ensure the adaptability of the compensation model in the target domain.
- (5) The adaptability of the proposed DTRL-based EMS is specifically verified through online testing under a synthetic driving cycle. The results indicate that the proposed EMS achieves 96.81% of the global-optimum fuel economy with impressive real-time performance for online application.
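Contribution (3) above can be sketched as a warm start of the target-domain agent: copy the source agent's mature actor/critic parameters and its entire replay buffer (with priorities) before fine-tuning on the new vehicle type. The dict-based agent representation, the `transfer_knowledge` name, and the learning-rate scaling below are illustrative assumptions, not the paper's actual code.

```python
import copy

def transfer_knowledge(source_agent, target_agent, lr_scale=0.1):
    """Cross-type transfer sketch: initialize the target EMS from the
    source EMS so training does not start from scratch.

    Transfers (per contribution (3)):
      1. mature actor/critic parameters  -> warm-started policy
      2. the PER buffer (with its SumTree priorities) -> non-empty,
         priority-ordered experience pool from the first update
    """
    # 1) parameter transfer: copy mature network weights
    target_agent["actor"] = copy.deepcopy(source_agent["actor"])
    target_agent["critic"] = copy.deepcopy(source_agent["critic"])
    # 2) buffer transfer: reuse stored transitions and their priorities
    target_agent["buffer"] = copy.deepcopy(source_agent["buffer"])
    # 3) shrink the learning rate for gentle fine-tuning in the target
    #    domain (an assumed, commonly used fine-tuning heuristic)
    target_agent["lr"] = source_agent["lr"] * lr_scale
    return target_agent
```

Deep copies are used so that subsequent fine-tuning of the target agent cannot corrupt the source agent's weights or buffer; in a real framework the same idea would apply to the DNN weight tensors and the SumTree arrays.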
Section snippets
Configuration and parameters
Preliminaries of transfer learning
Experimental setup for verification
Conclusion
- (1) An enhanced SAC algorithm is formulated by combining the standard SAC algorithm with the PER mechanism, which accelerates the convergence speed by …
CRediT authorship contribution statement
Declaration of competing interest
Acknowledgments