The transition to the edge-cloud era makes ultra-high-data-rate signals indispensable for meeting the immense and still-growing traffic demand, and this ecosystem likewise calls for power-efficient optical modules to deliver the enormous data load. The continuous evolution of optical communication systems has kept pace with rising capacity needs across all network segments, scaling from short-reach links serving data-center, fiber-to-the-home, and 5G/B5G services to metro and long-haul transoceanic cables. Nevertheless, Kerr-induced nonlinearities in long-haul systems and dispersion-induced power fading in short-reach links remain intractable problems that severely degrade high-symbol-rate transmission; they must therefore be addressed in a power-efficient way to cope with ever-increasing traffic requirements.

In this paper, we review our recent work on machine learning and neuromorphic processing in the optical domain for the mitigation of transmission impairments at very high symbol rates. Post-detection techniques based on bidirectional recurrent neural networks for nonlinearity compensation, and neuromorphic recurrent optical spectrum slicers for power-fading mitigation and self-coherent detection, emerge as promising solutions for mid-term deployment in long-haul and short-reach communication systems, respectively. The present work provides new results in both fields, focusing on multi-channel detection in the coherent long-haul domain and on a cost/consumption/performance assessment of neuromorphic photonic processing based on recurrent spectrum slicing against the state of the art. A thorough analysis of other state-of-the-art techniques in both domains is also provided, revealing the merits and shortcomings of recurrent neural networks and neuromorphic photonic processing in high-speed optical communication systems.