This paper proposes reliable integrated binary synapses and neurons for the hardware implementation of binary neural networks. Thanks to the nonvolatile nature of the magnetic tunnel junction (MTJ) and the unique features of the carbon nanotube field-effect transistor (CNTFET), the proposed design does not require external memory to store weights and also consumes low static power. Moreover, because the proposed circuit structure contains no sequential elements, it is immune to soft errors. This matters because weights in binary neural networks are restricted to the two values '-1' and '1', so the occurrence of soft errors dramatically reduces network accuracy. Simulation results indicate that, compared with previous designs, the proposed design consumes at least 9% lower power, occupies 34% less area, and offers a 49% lower power-delay-area product (PDAP). Monte Carlo simulations have also been performed to study the effect of process variations on the network. They show that the proposed neuron exhibits no logical errors across 10,000 runs. Consequently, the accuracy of a network employing the proposed neuron equals that of the software-implemented network and does not degrade even in the presence of process variations.
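The abstract states that binary neural network weights take only the values '-1' and '1', and that a single soft-error-induced weight flip can therefore change a neuron's output. A minimal software sketch (not the paper's MTJ/CNTFET circuit; the function names are illustrative assumptions) shows this sensitivity:

```python
import numpy as np

def binary_neuron(inputs, weights):
    """Software model of a binary neuron: dot product of +/-1 inputs
    and +/-1 weights, followed by a sign (threshold) activation."""
    inputs = np.asarray(inputs)
    weights = np.asarray(weights)
    # In a BNN both activations and weights are restricted to {-1, +1}.
    assert set(np.unique(weights)).issubset({-1, 1})
    s = int(np.dot(inputs, weights))
    return 1 if s >= 0 else -1

# Nominal weights stored in nonvolatile cells.
x = [1, -1, 1]
w = [1, 1, -1]
out = binary_neuron(x, w)          # sum = -1  -> output -1

# A soft error flips one stored weight bit (w[2]: -1 -> +1);
# with only two weight levels, the flip can invert the output.
w_flipped = [1, 1, 1]
out_flipped = binary_neuron(x, w_flipped)  # sum = +1 -> output +1
```

The single-bit flip inverting the output illustrates why soft-error immunity, provided here by avoiding sequential elements, is critical for maintaining BNN accuracy.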