Significant breakthroughs in Internet of Things (IoT) and 5G technologies have driven a wide range of smart healthcare activities, producing a flood of computationally intensive applications in smart healthcare networks. Mobile Edge Computing (MEC) is considered an efficient solution for providing powerful computing capabilities to latency- or energy-sensitive nodes. The low-latency and high-reliability requirements of healthcare application services can be met through optimal offloading and resource allocation for the nodes' computational tasks. In this study, we establish a system model consisting of two types of nodes, considering nondivisible computational tasks that trade off latency against energy consumption. To minimize the system's task-processing cost, we formulate the task-offloading problem as a Mixed-Integer Nonlinear Program (MINLP). We then decompose this problem into a task-offloading decision subproblem and a resource-allocation subproblem: the resource-allocation subproblem is solved with conventional optimization algorithms, while the offloading-decision subproblem is solved with deep reinforcement learning. Specifically, we propose an Online Offloading based on Deep Reinforcement Learning (OO-DRL) algorithm that employs parallel deep neural networks and a weight-sensitive experience replay mechanism. Simulation results show that, compared with several existing methods, the proposed algorithm performs real-time task offloading in a smart healthcare network under dynamically varying environments and reduces the system's task-processing cost.