High Peak-to-Average Power Ratio (PAPR) of the transmitted signal is a well-known major drawback of Orthogonal Frequency-Division Multiplexing (OFDM): the High-Power Amplifier (HPA) must operate in its linear region, i.e. with a large back-off between the operating input power and the saturation region, since otherwise it introduces not only in-band distortion but also adjacent-channel interference. In the Long-Term Evolution (LTE) downlink in particular, some form of PAPR reduction, such as clipping, must be employed. In many practical situations, however, measuring PAPR demands complex test equipment, such as a Vector Signal Analyzer (VSA), which might not be available. In this paper, we therefore develop a simple Bit-Error-Rate (BER) based model for estimating the (residual) PAPR by applying link abstraction: the easily measurable BER degradation due to HPA non-linearity is treated as if it were caused by a corresponding level of additive white Gaussian noise (AWGN) abstracting the HPA distortion. We assume a high Signal-to-Noise Ratio (SNR) and a sufficiently long cyclic prefix (CP), thus neglecting the (real) additive noise and time dispersion (i.e. multipath fading). Moreover, out-of-service BER testing, which requires the network operator to interrupt its revenue-generating traffic, can be replaced by in-service BER estimation from the in-phase and quadrature-phase eye-diagram closures, measured on live traffic by means of a simple oscilloscope. The analytical model is verified by appropriate Monte-Carlo simulations.
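To make the PAPR and clipping notions concrete, the following is a minimal Monte-Carlo sketch, not taken from the paper: it generates random QPSK-modulated OFDM symbols, measures the per-symbol PAPR of the time-domain signal, and applies simple amplitude clipping as a PAPR-reduction scheme. The subcarrier count, symbol count, and clipping ratio are illustrative assumptions.

```python
import numpy as np

# Illustrative parameters (assumptions, not from the paper)
rng = np.random.default_rng(0)
N = 64            # number of OFDM subcarriers
num_symbols = 1000  # Monte-Carlo OFDM symbols

def papr_db(x):
    """Peak-to-Average Power Ratio of a complex baseband signal, in dB."""
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

# Random QPSK symbols on each subcarrier -> time-domain signal via IFFT
bits = rng.integers(0, 2, size=(num_symbols, N, 2))
X = (2 * bits[..., 0] - 1 + 1j * (2 * bits[..., 1] - 1)) / np.sqrt(2)
x = np.fft.ifft(X, axis=1)

# Per-symbol PAPR before clipping
papr_orig = np.array([papr_db(row) for row in x])

# Amplitude clipping: limit the envelope to clip_level while keeping phase.
# The clipping ratio of 1.5 (about 3.5 dB above the RMS level) is an assumption.
clip_level = 1.5 * np.sqrt(np.mean(np.abs(x) ** 2))
mag = np.abs(x)
scale = np.minimum(1.0, clip_level / np.maximum(mag, 1e-12))
x_clipped = x * scale
papr_clip = np.array([papr_db(row) for row in x_clipped])

print(f"mean PAPR before clipping: {papr_orig.mean():.2f} dB")
print(f"mean PAPR after  clipping: {papr_clip.mean():.2f} dB")
```

Clipping trades PAPR for exactly the in-band distortion discussed above, which is why the resulting BER degradation can serve as a proxy for the residual PAPR.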