We consider the partially linear regression model $y_i = \boldsymbol{\zeta}_i^T \boldsymbol{\beta} + m(t_i) + \varepsilon_i$ $(i=1,\ldots,n)$, where the $(p \times 1)$-vector $\boldsymbol{\beta}$ and the function $m$ are unknown, $\boldsymbol{\zeta}_i$ and $t_i$ are design points (random and fixed, respectively), and the random errors $\{\varepsilon_i\}$ are dependent. We study the problem of testing (a) $H_{0\boldsymbol{\beta}}\colon \boldsymbol{\beta} = \boldsymbol{\beta}_0$, (b) $H_{0m}\colon m = m_0$, and (c) $H_{0m}^l\colon m \in \mathrm{span}\{f_1,\ldots,f_l\}$ (where the $f_j$ $(j=1,\ldots,l)$ are linearly independent functions), using distances based on kernel estimators. In case (a) we assume a strong mixing condition on the errors, while in cases (b) and (c) we assume an MA($\infty$) structure on the errors. The asymptotic distributions of the corresponding test statistics are derived under the null hypothesis and under local alternatives. Moreover, in case (a) and under fixed alternatives, the test statistic is shown to diverge to $\infty$.
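As an illustrative sketch (not the paper's procedure), the model and a kernel-distance statistic for a hypothesis of type (b) can be simulated as follows. All concrete choices here are hypothetical: the coefficient vector `beta_true`, the function `m`, the MA(1) errors (a truncated stand-in for the MA($\infty$) structure), the Gaussian kernel, and the bandwidth `h`.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 200, 2

# Fixed design points t_i and random design vectors zeta_i (illustrative choices)
t = (np.arange(1, n + 1) - 0.5) / n
zeta = rng.normal(size=(n, p))

beta_true = np.array([1.0, -0.5])        # hypothetical true coefficients
m = lambda s: np.sin(2 * np.pi * s)      # hypothetical true function m

# Dependent errors: an MA(1) process, a truncated instance of MA(infinity)
innov = rng.normal(scale=0.3, size=n + 1)
eps = innov[1:] + 0.5 * innov[:-1]

y = zeta @ beta_true + m(t) + eps

def nw_estimate(t_eval, t_obs, resid, h=0.05):
    """Nadaraya-Watson kernel estimate with a Gaussian kernel."""
    w = np.exp(-0.5 * ((t_eval[:, None] - t_obs[None, :]) / h) ** 2)
    return (w * resid).sum(axis=1) / w.sum(axis=1)

# Estimate m from the partial residuals y - zeta beta_0, here taking the
# true beta as the plugged-in value for simplicity
resid = y - zeta @ beta_true
m_hat = nw_estimate(t, t, resid)

# L2-type distance between the kernel estimate and a hypothesized m0
# (here m0 = m, so the statistic should be small under H_{0m})
T_n = np.mean((m_hat - m(t)) ** 2)
print(T_n)
```

In an actual test, $T_n$ would be centered and scaled so that its asymptotic null distribution (derived in the paper under the dependence assumptions above) yields critical values; this sketch only shows the distance being measured.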