Over the past decade, diffusion least-mean-squares (LMS) and real-time consensus (RTC) adaptive network algorithms have been widely studied and applied. However, their performance analyses have relied on white-regressor assumptions that are rarely satisfied in practice. Here, we consider both temporally correlated regressors and temporally correlated noise, and derive mean-square deviation (MSD) and excess mean-square error (EMSE) performance results for this setting for the first time. Furthermore, we employ a novel second-order, mixed-time-scale stochastic averaging approach to study the realization-wise fluctuations of the parameter estimation error. Critically, our new results show that temporal correlation in both the regressors and the noise affects the performance of the algorithms, even for small adaptation gains. Simulations illustrate the results. This paper is Part II of a companion work on stability. Together, the two parts show that the Adapt-Then-Combine (ATC), Combine-Then-Adapt (CTA), and RTC algorithms are strikingly similar in terms of stability and performance in the slow adaptation regime; in particular, ATC, CTA, and RTC have the same network MSD and EMSE to first order. Moreover, we conclude that existing MSD and EMSE results are insufficient even to first order when both the regressors and the noise are temporally correlated.
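For readers unfamiliar with the algorithms named above, the following is a minimal sketch of the standard ATC diffusion LMS recursion (adapt step followed by a combine step over a left-stochastic matrix). All variable names, dimensions, and the synthetic-data setup are illustrative assumptions, not the specific model analyzed in this paper.

```python
import numpy as np

def atc_diffusion_lms(d, U, A, mu):
    """Sketch of Adapt-Then-Combine (ATC) diffusion LMS.

    d:  (T, N) desired signals at N nodes over T iterations
    U:  (T, N, M) regressor vectors (M = parameter dimension)
    A:  (N, N) left-stochastic combination matrix (columns sum to 1)
    mu: step size (small, i.e., the slow adaptation regime)
    """
    T, N, M = U.shape
    w = np.zeros((N, M))      # current estimate at each node
    psi = np.zeros((N, M))    # intermediate (post-adapt) estimates
    for t in range(T):
        # Adapt: local LMS update at every node k
        for k in range(N):
            e = d[t, k] - U[t, k] @ w[k]
            psi[k] = w[k] + mu * e * U[t, k]
        # Combine: node k averages neighbors' intermediates,
        # w_k = sum_l A[l, k] * psi_l
        w = A.T @ psi
    return w
```

CTA differs only in the order of the two steps (combine first, then adapt), which is why, as stated above, the two variants coincide to first order for small step sizes.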