ABSTRACT This paper focuses on the large-scale composite optimization problem of minimizing the sum of a finite number of smooth (possibly nonconvex) functions and a convex (possibly nonsmooth) function. A common method for this problem is the proximal incremental aggregated gradient (PIAG) method (N. D. Vanli, M. Gurbuzbalaban and A. Ozdaglar, SIAM Journal on Optimization, 2018, 28(2): 1282-1300), which exploits gradient information from previous iterations to approximate the gradient at the current point. However, like many first-order algorithms, it may suffer from slow convergence. To address this issue, this paper proposes an inexact variable-metric incremental aggregated forward-backward (iVMFB-IAG) method. This method incorporates a variable-metric strategy to accelerate the PIAG method and introduces an inexact criterion to enhance its practicality. Under the Kurdyka-Łojasiewicz (KL) property and the uniformly bounded positive definiteness of the scaling matrices, the sequence generated by the iVMFB-IAG method converges globally to a critical point of the problem. In addition, the convergence rate is established within the KL framework, and under certain more restrictive conditions the method exhibits local superlinear convergence. Numerical experiments on image reconstruction, in which two distinct images are restored, demonstrate the effectiveness of the iVMFB-IAG method.