Information processing in biological systems is realized by the appropriate transmission of information flows over complex networks, such as gene regulatory, signal transduction, and neural networks. These information flows are affected by the input-signal characteristics and the structural properties of network systems, such as the network topology, regulation rules, and intrinsic and environmental noise. Biological networks frequently include several typical patterns called network motifs, which are considered to play important roles in biological functions. However, their information-theoretic properties, particularly the dependence of the information flows in each network on the input signal, remain poorly understood. In our previous study [Mori and Okada, Phys. Rev. Res. 2, 043432 (2020)], we developed a graphical expansion method to describe transfer entropy (TE), a measure of information flow, in Boolean networks in terms of multiple information pathways. There, the input signal was limited to a simple case, and the effect of the input-signal characteristics on TE was not clarified. In this paper, we improve our method to render it applicable to Boolean networks that receive input signals with arbitrary stochastic characteristics. Our formula expresses how TE is determined by the input-signal characteristics, the assignment of Boolean functions, and the noise magnitude. We find that, in both positive and negative feedback loops, TE hardly depends on the signal timescale. In contrast, coherent and incoherent feedforward loops show low- and high-pass filtering properties, respectively, for a time-varying signal, which is consistent with previous reports. The emergence of either low- or high-pass filtering is determined by the Fourier components of the Boolean functions on specific pathways transmitting information flows.
Thus, our formula reveals the mechanism of information transfer in network motifs and provides insights into the origin of information processing in biological networks.
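To make the quantities in the abstract concrete, the following is a minimal sketch (not the paper's graphical expansion method) of how TE can be estimated numerically for the simplest noisy Boolean element: a node that copies a binary Markov input signal, with each output bit flipped with probability eps (the noise magnitude) and with the signal timescale set by the input's flip probability. All function names and parameter values here are illustrative assumptions, not taken from the paper.

```python
import random
from math import log2
from collections import Counter

def binary_markov_signal(n, p_flip, rng):
    """Binary input signal: a Markov chain whose flip probability
    p_flip sets the signal timescale (small p_flip = slowly varying)."""
    s = [rng.random() < 0.5]
    for _ in range(n - 1):
        s.append(s[-1] ^ (rng.random() < p_flip))
    return [int(v) for v in s]

def noisy_copy_node(x, eps, rng):
    """Boolean node y with y[t+1] = x[t], each output bit flipped
    with probability eps (intrinsic noise); y[0] is random."""
    y = [rng.randrange(2)]
    for t in range(len(x) - 1):
        y.append(x[t] ^ (rng.random() < eps))
    return y

def transfer_entropy(x, y):
    """Plug-in estimator of TE(X -> Y) in bits:
    TE = sum p(y', y, x) * log2[ p(y'|y, x) / p(y'|y) ],
    where y' is the next state of y."""
    n = len(y) - 1
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))   # (y', y, x)
    pairs_yx = Counter(zip(y[:-1], x[:-1]))         # (y, x)
    pairs_yy = Counter(zip(y[1:], y[:-1]))          # (y', y)
    marg_y = Counter(y[:-1])                        # (y,)
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_joint = c / n
        p_cond_full = c / pairs_yx[(y0, x0)]        # p(y'|y, x)
        p_cond_self = pairs_yy[(y1, y0)] / marg_y[y0]  # p(y'|y)
        te += p_joint * log2(p_cond_full / p_cond_self)
    return te

rng = random.Random(0)
x = binary_markov_signal(200_000, p_flip=0.3, rng=rng)
te_low_noise = transfer_entropy(x, noisy_copy_node(x, eps=0.05, rng=rng))
te_high_noise = transfer_entropy(x, noisy_copy_node(x, eps=0.40, rng=rng))
print(f"TE at eps=0.05: {te_low_noise:.3f} bits")
print(f"TE at eps=0.40: {te_high_noise:.3f} bits")
```

For this single copy node, increasing the noise magnitude eps sharply reduces the estimated TE, illustrating one of the dependencies the formula in the paper captures; reproducing the motif-level results (feedback loops and feedforward-loop filtering) would require simulating the corresponding multi-node Boolean networks.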