Proper power allocation among multiple energy sources is crucial for hybrid electric vehicles to achieve good energy economy. As a data-driven technique, offline deep reinforcement learning (DRL) trains an energy management strategy (EMS) solely from existing data, making it a promising solution for intelligent power allocation. However, current offline DRL-based strategies place high demands on dataset quality, and numerous high-quality samples are difficult to obtain in practice. To address this, a bootstrapping error accumulation reduction (BEAR)-based strategy is proposed to enhance energy-saving performance across different kinds of datasets. Furthermore, building on advanced V2X technology, a data-driven energy management updating framework is proposed to improve both the fuel economy and the adaptability of the EMS through multiple updates. Specifically, the framework deploys multiple V2X-enabled buses to collect real-time information and periodically updates the strategy, making full use of the offline data. The results show that the proposed BEAR-based EMS outperforms state-of-the-art offline EMSs in terms of fuel economy, achieving an improvement of 2.25% when trained on mixed datasets. It is also validated that the offline EMS with the updating mechanism can reduce energy costs step by step under two different kinds of initial datasets.
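The core mechanism behind BEAR is constraining the learned policy to stay close to the behavior policy that generated the dataset, typically via a maximum mean discrepancy (MMD) penalty on sampled actions, which limits bootstrapping error on out-of-distribution actions. The sketch below illustrates only that MMD constraint term with a Gaussian kernel; it is a minimal illustration under assumed shapes (batches of action vectors), not the implementation used in this work, and the function names are hypothetical:

```python
import numpy as np

def gaussian_kernel(a, b, sigma=1.0):
    # Pairwise Gaussian kernel between action sets a (n, d) and b (m, d).
    sq_dists = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq_dists / (2.0 * sigma ** 2))

def mmd_squared(pi_actions, data_actions, sigma=1.0):
    # Kernel MMD^2 estimate between actions proposed by the learned policy
    # and actions observed in the offline dataset:
    #   MMD^2 = E[k(pi, pi)] - 2 E[k(pi, data)] + E[k(data, data)].
    # BEAR-style actors add a penalty (or dual constraint) on this quantity
    # so the policy avoids out-of-distribution actions.
    k_pp = gaussian_kernel(pi_actions, pi_actions, sigma).mean()
    k_pd = gaussian_kernel(pi_actions, data_actions, sigma).mean()
    k_dd = gaussian_kernel(data_actions, data_actions, sigma).mean()
    return k_pp - 2.0 * k_pd + k_dd

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    data_actions = rng.normal(0.0, 1.0, (64, 2))      # actions in the dataset
    close_policy = data_actions + rng.normal(0.0, 0.05, (64, 2))
    far_policy = data_actions + 5.0                   # out-of-distribution shift
    # In-distribution actions incur a much smaller penalty than OOD ones.
    print(mmd_squared(close_policy, data_actions),
          mmd_squared(far_policy, data_actions))
```

In a full BEAR training loop this scalar would be weighted by a Lagrange multiplier and added to the actor loss, steering power-split decisions toward actions supported by the collected driving data.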