Abstract

In stochastic power systems, electric vehicle (EV) fast charging stations (FCSs) are being installed rapidly, which adversely impacts the distribution network. Consequently, improper offline charging control policies for EVs may increase voltage fluctuation and instability. To analyse these aspects, this paper investigates the problems associated with offline (dis)charging control for effective utilization of battery storage and grid power across different modes of operation. Further, the need for real-time charging control is identified to mitigate the adverse impacts of FCSs on the distribution network. Hence, an online controller based on reinforcement learning (RL) is designed to identify uncertainties in real time and to schedule the (dis)charging of an EV against these uncertainties according to its travel pattern. The RL-based online controller uses a deep neural network (DNN), with agents programmed to control the bidirectional power flow (V2G/G2V). The RL reward function is shaped by the different charging states of the battery. The performance of the online (dis)charging controller, which uses the DNN to operate at its optimal power-flow set points across all charging sessions, is examined in detail. Finally, the effectiveness of the online RL controller is validated through hardware results obtained on a real-time hardware-in-the-loop simulator.