The variants of the randomized Kaczmarz (RK) and randomized Gauss-Seidel (RGS) methods are distinct iterative algorithms for ridge regression, and recent work provided theoretical convergence rates for both through a simple side-by-side analysis. Since the greedy randomized Kaczmarz (GRK) algorithm converges faster than the RK algorithm both in theory and in experiments, in this paper we extend the GRK algorithm from ordinary least squares (OLS) regression to ridge regression and establish the convergence theory of the resulting algorithm. We further study the GRK algorithm with relaxation for ridge regression and prove that it also converges at an expected exponential rate. In addition, we propose an accelerated GRK algorithm with relaxation for ridge regression, which at each iteration simultaneously processes several rows corresponding to the largest entries of the residual vector. Numerical results show that the GRK algorithms with and without relaxation require far fewer iterations than the RK and RGS variants, and that the accelerated GRK algorithm with relaxation significantly outperforms all the other algorithms in terms of both iteration counts and computing times.
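To make the setting concrete, the following is a minimal sketch (not the paper's exact algorithm) of a greedy Kaczmarz iteration applied to ridge regression. It assumes the standard consistent reformulation used in the RK-for-ridge literature, (AAᵀ + λI)y = b with x = Aᵀy, and a Bai-Wu-style greedy threshold for selecting rows with large scaled residuals; the function name `grk_ridge` and all parameter defaults are illustrative choices, not from the source.

```python
import numpy as np

def grk_ridge(A, b, lam, iters=500, seed=0):
    """Sketch of a greedy randomized Kaczmarz iteration for ridge regression.

    Solves the consistent reformulation (A A^T + lam*I) y = b and returns
    x = A^T y, which equals the ridge solution (A^T A + lam*I)^{-1} A^T b
    by the push-through identity.
    """
    m, n = A.shape
    B = A @ A.T + lam * np.eye(m)            # m x m, symmetric positive definite
    row_norms = np.sum(B**2, axis=1)         # squared Euclidean norms of rows of B
    y = np.zeros(m)
    rng = np.random.default_rng(seed)
    for _ in range(iters):
        r = b - B @ y                        # residual of the reformulated system
        if r @ r < 1e-28:                    # already converged
            break
        ratios = r**2 / row_norms
        # Greedy criterion (Bai-Wu style): keep rows whose scaled residual is
        # at least halfway between the weighted average and the maximum.
        eps = 0.5 * (ratios.max() + (r @ r) / row_norms.sum())
        cand = np.where(ratios >= eps)[0]
        p = r[cand]**2 / np.sum(r[cand]**2)  # sample proportional to squared residual
        i = rng.choice(cand, p=p)
        y += (r[i] / row_norms[i]) * B[i]    # Kaczmarz projection onto hyperplane i
    return A.T @ y                           # recover the ridge solution

```

The accelerated variant described above would instead update with all candidate rows in `cand` at once (with a relaxation parameter), rather than sampling a single row.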