Author Affiliations: Pusan Natl Univ, Busan 46241, South Korea; Penn State Univ, University Pk, PA 16802, USA
Publication: IEEE TRANSACTIONS ON VEHICULAR TECHNOLOGY
Year/Volume/Issue: 2021, Vol. 70, No. 5
Pages: 4312-4323
Core Indexing:
Subject Classification: 0810 [Engineering - Information and Communication Engineering]; 0808 [Engineering - Electrical Engineering]; 08 [Engineering]; 0823 [Engineering - Transportation Engineering]
Funding: National Research Foundation of Korea - Ministry of Science and ICT [NRF-2019R1A2C1003103]
Keywords: Batteries; Cooling; Thermal management; Refrigerants; Heating systems; Energy consumption; Electric vehicles; Battery Thermal Management; Model Predictive Control; Dynamic Programming; Stochastic Dynamic Programming; Electric Vehicle
Abstract: The battery thermal management system of an electric vehicle consumes considerable energy when cooling the battery, which can reduce the driving range. To minimize the energy consumption of the battery cooling system, the controller design should be posed as an optimal control problem. Model predictive control can be applied to this optimal controller design; it can be implemented in real time at the cost of a small loss of optimality. The performance of a model predictive controller is affected by its cost structure, which is typically composed of the transition cost and the terminal cost. The transition cost is defined by the controller objective, energy consumption being one example. The terminal cost, however, is user defined and is the main design factor for controller performance. In model predictive control, the terminal cost is usually formulated to penalize state variations, which can cause a loss of optimality. In this study, the terminal cost is formulated to represent the cost from the end of the prediction horizon to infinity, called the cost-to-go. This approach is consistent with the formulation of the optimal control problem, and a controller with the cost-to-go terminal cost can achieve performance closer to the optimum than one that penalizes state variations. In the proposed model predictive controller, the cost-to-go is approximated by the optimal expected cost, which can be calculated using stochastic dynamic programming. The proposed controller significantly reduces energy consumption compared to a typical model predictive controller, without increasing the computing load.
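The sketch below is not the authors' implementation; it is a minimal toy illustration of the controller structure the abstract describes: a short-horizon MPC whose objective is the sum of transition (energy) costs plus a terminal cost taken as an approximate cost-to-go, with that cost-to-go precomputed offline by stochastic dynamic programming over a discretized state. The scalar thermal model, cost weights, grid sizes, heat-load distribution, and all numerical values are illustrative assumptions.

```python
# Toy sketch (assumed model and parameters, not the paper's implementation):
# MPC with an SDP-approximated cost-to-go as the terminal cost.
import itertools
import numpy as np

# --- Illustrative plant: scalar battery temperature dynamics (assumed) ---
DT = 60.0          # control step [s]
COOL_EFF = 0.01    # temperature drop per s per kW of cooling (assumed)
HEAT_GAIN = 0.02   # temperature rise per s per kW of heat load (assumed)

def step(temp, cool_kw, heat_kw):
    """One-step temperature update for the assumed first-order thermal model."""
    return temp + DT * (HEAT_GAIN * heat_kw - COOL_EFF * cool_kw)

# --- Transition (stage) cost: cooling energy plus a soft over-temperature penalty ---
T_MAX = 35.0       # desired upper temperature bound [deg C] (assumed)
W_TEMP = 50.0      # penalty weight on bound violation (assumed)

def stage_cost(temp, cool_kw):
    return cool_kw * DT / 3600.0 + W_TEMP * max(0.0, temp - T_MAX) ** 2

# --- Offline: approximate the cost-to-go V(T) by stochastic dynamic programming ---
TEMP_GRID = np.linspace(20.0, 45.0, 101)               # discretized state space
ACTIONS = np.linspace(0.0, 5.0, 11)                    # cooling power candidates [kW]
HEAT_SCENARIOS = [(1.0, 0.5), (2.0, 0.3), (4.0, 0.2)]  # (heat_kw, probability), assumed
GAMMA = 0.98                                           # discount factor for value iteration

def sdp_value_function(iters=300):
    V = np.zeros_like(TEMP_GRID)
    for _ in range(iters):
        V_new = np.empty_like(V)
        for i, temp in enumerate(TEMP_GRID):
            best = np.inf
            for u in ACTIONS:
                # Expected cost over the stochastic heat load
                exp_cost = 0.0
                for heat, prob in HEAT_SCENARIOS:
                    t_next = np.clip(step(temp, u, heat), TEMP_GRID[0], TEMP_GRID[-1])
                    exp_cost += prob * (stage_cost(t_next, u)
                                        + GAMMA * np.interp(t_next, TEMP_GRID, V))
                best = min(best, exp_cost)
            V_new[i] = best
        V = V_new
    return V

V_TABLE = sdp_value_function()

def cost_to_go(temp):
    """Terminal cost: interpolated optimal expected cost from the SDP table."""
    return float(np.interp(temp, TEMP_GRID, V_TABLE))

# --- Online: short-horizon MPC that uses the cost-to-go as its terminal cost ---
def mpc_control(temp, heat_forecast):
    """Brute-force enumeration over the horizon (only viable for this toy problem)."""
    best_u, best_cost = ACTIONS[0], np.inf
    for seq in itertools.product(ACTIONS, repeat=len(heat_forecast)):
        t, cost = temp, 0.0
        for u, heat in zip(seq, heat_forecast):
            t = step(t, u, heat)
            cost += stage_cost(t, u)
        cost += cost_to_go(t)          # terminal cost = approximate cost-to-go
        if cost < best_cost:
            best_cost, best_u = cost, seq[0]
    return best_u                      # apply only the first move (receding horizon)

if __name__ == "__main__":
    print("cooling command [kW]:", mpc_control(temp=33.0, heat_forecast=[2.0, 2.0, 4.0]))
```

The design point the sketch illustrates: replacing a state-variation penalty at the end of the horizon with a precomputed cost-to-go table shifts the expensive optimization offline, so the online MPC loop does no more work than a conventional one while its terminal cost reflects long-run expected energy use rather than an arbitrary regularizer.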