In this work, to obtain the inverses of time-variant complex matrices, four new recurrent neural network models, named modified finite-time convergent complex-valued zeroing neural network (MFTCVZNN) models, are proposed by constructing three different error functions. In addition, a high-performance finite-time activation function (HFTAF) is applied to the four MFTCVZNN models, improving their comprehensive performance. Analytical discussion indicates that the states of these MFTCVZNN models converge to the time-variant solutions in finite time, with upper bounds on the convergence time derived, and that the HFTAF significantly improves convergence efficiency. Simulation results validate the theoretical analysis and confirm that the MFTCVZNN models are effective for finding the inverses of time-variant complex matrices. Furthermore, comparative experiments demonstrate the superiority of the MFTCVZNN models over existing models in terms of convergence speed.
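To make the underlying mechanism concrete, the following is a minimal sketch of a generic zeroing neural network (ZNN) for time-variant complex matrix inversion, not the paper's MFTCVZNN models or HFTAF. It uses the standard error function E(t) = A(t)X(t) - I with the evolution law dE/dt = -gamma * phi(E), activated by the well-known sign-bi-power function (a common finite-time activation); the test matrix A(t) is a hypothetical example chosen to be invertible for all t, and Euler integration is used purely for illustration.

```python
import numpy as np

def A(t):
    # Hypothetical time-variant complex matrix, invertible for all t
    # (det = (2 + sin t)^2 - cos^2 t > 0).
    return np.array([[2 + np.sin(t), 1j * np.cos(t)],
                     [-1j * np.cos(t), 2 + np.sin(t)]], dtype=complex)

def A_dot(t, h=1e-6):
    # Numerical central-difference time derivative of A(t).
    return (A(t + h) - A(t - h)) / (2 * h)

def sbp(E, p=0.5):
    # Sign-bi-power activation, applied elementwise to a complex matrix:
    # phi(e) = (|e|^p + |e|^(1/p)) * e/|e|, a standard finite-time activation
    # (stand-in for the HFTAF, whose exact form is not given here).
    mag = np.abs(E)
    safe = np.where(mag == 0, 1.0, mag)
    unit = np.where(mag > 0, E / safe, 0)
    return (mag ** p + mag ** (1.0 / p)) * unit

def znn_inverse(t_end=2.0, h=1e-4, gamma=10.0):
    # From E = A X - I and dE/dt = -gamma * phi(E), differentiating gives
    # A X_dot + A_dot X = -gamma * phi(A X - I), so
    # X_dot = A^{-1} (-gamma * phi(A X - I) - A_dot X).
    t = 0.0
    X = np.eye(2, dtype=complex)  # arbitrary initial state
    while t < t_end:
        E = A(t) @ X - np.eye(2)
        X_dot = np.linalg.solve(A(t), -gamma * sbp(E) - A_dot(t) @ X)
        X = X + h * X_dot  # forward-Euler step
        t += h
    return t, X

t, X = znn_inverse()
residual = np.linalg.norm(A(t) @ X - np.eye(2))
print(residual)  # small residual: X(t) tracks A(t)^{-1}
```

In practice the residual shrinks rapidly because the sign-bi-power term dominates near E = 0, which is the mechanism behind the finite-time convergence bounds analyzed in the paper; the gain `gamma` and step size `h` here are illustrative choices.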