Let \(\Delta_0\) be the Laplace–Beltrami operator on the unit sphere \(\mathbb{S}^{d-1}\) of \(\mathbb{R}^d\). We show that the Hardy–Rellich inequality of the form $$\begin{aligned} \int_{\mathbb{S}^{d-1}} \left| f(x)\right|^2 \,\mathrm{d}\sigma(x) \le c_d \min_{e\in \mathbb{S}^{d-1}} \int_{\mathbb{S}^{d-1}} (1- \langle x, e\rangle) \left| (-\Delta_0)^{\frac{1}{2}}f(x) \right|^2 \,\mathrm{d}\sigma(x) \end{aligned}$$holds for \(d=2\) and \(d \ge 4\), but fails for \(d=3\) with any finite constant. The optimal constant is \(c_d = 8/(d-3)^2\) for \(d = 2, 4, 5\), and, under additional restrictions on the function space, for \(d \ge 6\). This inequality yields an uncertainty principle of the form $$\begin{aligned} \min_{e\in \mathbb{S}^{d-1}} \int_{\mathbb{S}^{d-1}} (1- \langle x, e\rangle) |f(x)|^2 \,\mathrm{d}\sigma(x) \int_{\mathbb{S}^{d-1}}\left| \nabla_0 f(x)\right|^2 \,\mathrm{d}\sigma(x) \ge c'_d \end{aligned}$$on the sphere for functions with zero mean and unit norm, which in turn can be used to establish another uncertainty principle without the zero-mean assumption; both uncertainty principles appear to be new.
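A numerical sanity check of the first inequality in the simplest case \(d=2\) (the circle) may be illustrative. This is a sketch under assumed conventions: we take \(f\) with zero mean, \(f(\theta)=\sum_k a_k \cos(k\theta)\), so that \((-\Delta_0)^{1/2}f(\theta)=\sum_k k\,a_k \cos(k\theta)\), parametrize \(e=(\cos\varphi,\sin\varphi)\) so that \(1-\langle x,e\rangle = 1-\cos(\theta-\varphi)\), and verify that the ratio of the two sides does not exceed the optimal constant \(c_2=8\). The grid sizes and the random coefficients are illustrative choices, not from the paper.

```python
import numpy as np

# Sanity check of the Hardy-Rellich inequality for d = 2 (assumed setup):
# f(theta) = sum_k a_k cos(k*theta) has zero mean, and
# (-Delta_0)^{1/2} f(theta) = sum_k k * a_k cos(k*theta).
theta = np.linspace(0.0, 2.0 * np.pi, 4001)[:-1]  # uniform periodic grid
dtheta = theta[1] - theta[0]

rng = np.random.default_rng(0)
ks = np.arange(1, 6)            # harmonics k = 1..5 (illustrative)
a = rng.normal(size=ks.size)    # random coefficients (illustrative)

f = sum(ak * np.cos(k * theta) for ak, k in zip(a, ks))
g = sum(k * ak * np.cos(k * theta) for ak, k in zip(a, ks))  # (-Delta_0)^{1/2} f

# Left side: ||f||_{L^2}^2 on the circle.
lhs = np.sum(f**2) * dtheta

# Right-side integral, minimized over e = (cos(phi), sin(phi)) on a grid,
# with weight 1 - <x, e> = 1 - cos(theta - phi).
phis = np.linspace(0.0, 2.0 * np.pi, 720, endpoint=False)
rhs_int = min(np.sum((1.0 - np.cos(theta - phi)) * g**2) * dtheta
              for phi in phis)

ratio = lhs / rhs_int
print(ratio)  # should not exceed the optimal constant c_2 = 8
```

For a single harmonic \(f(\theta)=\cos(k\theta)\) the weighted integral is independent of \(e\) and the ratio reduces to \(1/k^2\); mixing harmonics makes the cross terms, and hence the minimization over \(e\), nontrivial.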