We present a conceptual framework and a mathematical model for understanding and mitigating self-interference in generalized frequency division multiplexing (GFDM). Using a vectorized representation, we decompose the self-interference inherent to GFDM into two orthogonal components. In particular, we characterize these components in terms of the prototype filter parameters in the frequency domain. This characterization yields explicit analytical expressions and leads to an optimal filter design that mitigates self-interference distortion in GFDM systems. Our analysis reveals a relationship between the bandwidth required per subcarrier and the sub-symbol configuration of the proposed optimal prototype filter, which allows the filter to use the available spectrum efficiently. Through an analytical study of the bit error rate (BER) of GFDM, we show that the proposed optimal filter outperforms existing designs reported in the literature. Computer simulations validate the analysis, showing close agreement between the analytical predictions and the simulated results.
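For illustration, the minimal sketch below builds a small GFDM modulation matrix and measures the self-interference that a matched-filter receiver would see; the Gaussian prototype filter, the block dimensions, and the signal/interference split are our own illustrative assumptions, not the paper's decomposition or its optimal filter design.

# Illustrative sketch only (not code from the paper): build a small GFDM
# modulation matrix A and inspect A^H A, whose off-diagonal entries are the
# self-interference seen by a matched-filter receiver. The Gaussian prototype
# and the (K, M) sizes are arbitrary choices for the demonstration.
import numpy as np

K, M = 4, 5                      # subcarriers, sub-symbols
N = K * M                        # GFDM block length

# Arbitrary non-orthogonal prototype filter, normalized to unit energy.
n = np.arange(N)
g = np.exp(-np.pi * ((n - N / 2) / K) ** 2)
g /= np.linalg.norm(g)

# Column (k, m) of A: prototype circularly shifted by m*K samples and
# modulated onto subcarrier k.
A = np.zeros((N, K * M), dtype=complex)
for m in range(M):
    for k in range(K):
        A[:, m * K + k] = np.roll(g, m * K) * np.exp(2j * np.pi * k * n / K)

# Matched-filter output is (A^H A) d: the diagonal carries the desired
# symbols, the off-diagonal entries carry self-interference (ICI + ISI).
G = A.conj().T @ A
signal = np.sum(np.abs(np.diag(G)) ** 2)
interference = np.sum(np.abs(G) ** 2) - signal
print(f"self-interference-to-signal ratio: {interference / signal:.3f}")

Under these assumptions, a prototype filter that drives the off-diagonal energy of A^H A toward zero is one that suppresses self-interference, which is the design goal the abstract describes.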