Electromagnetic waves propagating in a layered superconductor with momentum at an arbitrary angle with respect to the main crystallographic directions exhibit an unavoidable mixing between longitudinal and transverse degrees of freedom. Here we show that this basic physical mechanism explains the emergence of a well-defined absorption peak in the in-plane optical conductivity when light propagates at small tilting angles relative to the stacking direction in layered cuprates. More specifically, we show that this peak, often interpreted as a spurious leakage of the c-axis Josephson plasmon, is instead a signature of the true longitudinal plasma mode occurring at larger momenta. By combining a classical approach based on Maxwell's equations with a full quantum derivation of the plasma modes based on modeling the superconducting phase degrees of freedom, we provide an analytical expression for the absorption peak as a function of the tilting angle and light polarization. We suggest that an all-optical measurement in a tilted geometry can be used as an alternative way to access the plasma-wave dispersion, usually measured by means of large-momentum scattering techniques such as resonant inelastic X-ray scattering (RIXS) or electron energy loss spectroscopy (EELS).