The results of a series of theoretical studies are reported, examining the rate of convergence of different approximate representations of $\alpha$-stable distributions. Although they play a key role in modelling random processes with jumps and discontinuities, the use of $\alpha$-stable distributions in inference often leads to analytically intractable problems. The LePage series, the probabilistic representation employed in this work, is used to transform an intractable, infinite-dimensional inference problem into a finite-dimensional (conditionally Gaussian) parametric problem. A major component of our approach is the approximation of the tail of this series by a Gaussian random variable. Standard statistical techniques, such as expectation-maximization (EM), Markov chain Monte Carlo, and particle filtering, can then be readily applied. In addition to establishing the asymptotic normality of the tail of the series, we derive explicit, nonasymptotic bounds on the approximation error. The proofs follow classical Fourier-analytic arguments based on Esseen's smoothing lemma. Specifically, we consider the distance between the distributions of: $(i)$ the tail of the series and an appropriate Gaussian; $(ii)$ the full series and the truncated series; and $(iii)$ the full series and the truncated series with an added Gaussian term. In all three cases, sharp bounds are established, and the theoretical results are compared with the actual distances (computed numerically) in specific examples of symmetric $\alpha$-stable distributions. This analysis facilitates the selection of appropriate truncation levels in practice and offers theoretical guarantees for the accuracy of the resulting estimates. One of the main conclusions is that, for the purposes of inference, using a truncated series together with an approximately Gaussian error term has superior statistical properties and is likely the preferable choice in practice.
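
As a concrete sketch of the objects being compared (under standard assumptions, not necessarily the exact normalization or truncation rule adopted in the paper), in the symmetric case a LePage-type representation and its truncated, Gaussian-corrected approximation take the form
\[
X \;\stackrel{d}{=}\; \sum_{i=1}^{\infty} \Gamma_i^{-1/\alpha} W_i
\;\approx\; \sum_{i:\,\Gamma_i \le c} \Gamma_i^{-1/\alpha} W_i
\;+\; \Big( \tfrac{\alpha}{2-\alpha}\, c^{\,1-2/\alpha}\, \mathbb{E}\big[W_1^2\big] \Big)^{1/2} Z,
\qquad 0 < \alpha < 2,
\]
where $X$ is symmetric $\alpha$-stable with a scale determined by $\alpha$ and the law of $W_1$, the $\Gamma_i$ are the arrival times of a unit-rate Poisson process, the $W_i$ are i.i.d. symmetric weights with $\mathbb{E}\lvert W_1\rvert^{\alpha} < \infty$ and $\mathbb{E}[W_1^2] < \infty$, and $Z \sim \mathcal{N}(0,1)$ is independent of everything else; by Campbell's theorem, the variance of the Gaussian term matches that of the discarded tail $\sum_{i:\,\Gamma_i > c} \Gamma_i^{-1/\alpha} W_i$. In this notation, $(i)$–$(iii)$ compare, respectively, the tail with its Gaussian surrogate, the full series with the truncated sum alone, and the full series with the truncated sum plus the Gaussian term.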