Abstract
This paper presents improvements to two-stage algorithms for estimating the short-time Fourier transform (STFT) phase from only the amplitude by using deep neural networks (DNNs). The phase is difficult to reconstruct directly owing to its sensitivity to waveform shifts and the wrapping issue. To mitigate these problems, two-stage approaches estimate the phase indirectly through its derivatives, i.e., the instantaneous frequency (IF) and group delay (GD). In the first stage, the IF and GD are estimated from the amplitude using DNNs; in the second stage, the phase is reconstructed so as to be consistent with the estimated IF/GD. Conventional methods for the second stage either ignore the importance of high-amplitude time–frequency bins (e.g., the least-squares-based method) or lack a solid statistical model (e.g., the average-based method). To address these problems, we propose improvements to the second stage of two-stage algorithms by using von Mises distribution-based maximum likelihood and weighted least squares. We also provide theoretical discussions of the phase reconstruction, including investigations of the properties of the GD and the roles of the IF/GD information in the inverse STFT. On the basis of this analysis, we propose a new phase-based feature, the inter-frequency phase difference (IFPD), and demonstrate its application in two-stage phase reconstruction algorithms. We conducted subjective and objective experiments comparing the proposed and conventional methods. The results confirm that the proposed method using the IFPD outperforms the other methods on all metrics.
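As a rough illustration of the second-stage idea described above (a minimal NumPy sketch, not the paper's exact algorithm), the snippet below reconstructs a toy time–frequency phase from its finite differences along time (IF-like) and frequency (GD-like) by amplitude-weighted least squares. The grid sizes, weighting by the geometric mean of adjacent amplitudes, and the omission of phase wrapping are simplifying assumptions for the demo; since phase derivatives only determine the phase up to a global constant, one bin is pinned to fix the offset.

```python
import numpy as np

rng = np.random.default_rng(0)
T, K = 8, 6                                       # toy time x frequency grid
phase = rng.uniform(-np.pi, np.pi, size=(T, K))   # ground-truth phase (toy)
amp = rng.uniform(0.1, 1.0, size=(T, K))          # stand-in for STFT amplitudes

# "Oracle" phase derivatives via finite differences (wrapping ignored for clarity)
ifreq = phase[1:, :] - phase[:-1, :]    # IF-like differences along time
gdelay = phase[:, 1:] - phase[:, :-1]   # GD-like differences along frequency

n = T * K
idx = lambda t, k: t * K + k            # flatten (t, k) to a vector index
rows, rhs = [], []

# One weighted equation per time difference: w * (phi[t+1,k] - phi[t,k]) = w * ifreq
for t in range(T - 1):
    for k in range(K):
        w = np.sqrt(amp[t, k] * amp[t + 1, k])   # assumed amplitude weighting
        row = np.zeros(n)
        row[idx(t + 1, k)], row[idx(t, k)] = w, -w
        rows.append(row)
        rhs.append(w * ifreq[t, k])

# One weighted equation per frequency difference: w * (phi[t,k+1] - phi[t,k]) = w * gdelay
for t in range(T):
    for k in range(K - 1):
        w = np.sqrt(amp[t, k] * amp[t, k + 1])
        row = np.zeros(n)
        row[idx(t, k + 1)], row[idx(t, k)] = w, -w
        rows.append(row)
        rhs.append(w * gdelay[t, k])

# Pin the global phase offset, which the derivatives cannot determine
row = np.zeros(n)
row[idx(0, 0)] = 1.0
rows.append(row)
rhs.append(phase[0, 0])

phi, *_ = np.linalg.lstsq(np.stack(rows), np.array(rhs), rcond=None)
phi = phi.reshape(T, K)
err = np.max(np.abs(phi - phase))   # tiny: consistent derivatives pin down the phase
```

With exact (consistent) derivatives the weighted least-squares solution recovers the phase up to numerical precision; with DNN-estimated IF/GD, the amplitude weights let high-energy bins dominate the fit, which is the motivation the abstract gives for replacing the unweighted least-squares second stage.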
Published in: IEEE/ACM Transactions on Audio, Speech, and Language Processing