As an effective optimization algorithm widely used to solve practical problems, the differential evolution (DE) algorithm has few parameters, yet its performance is strongly affected by them. To address this issue, this paper presents a novel DE variant called DACDE, which combines data fusion-based parameter adaptation with a complementary mutation strategy. Most success-history-based parameter adaptation methods consider only the mean of successful parameters and ignore their dispersion. In DACDE, the distribution of successful parameters is recorded and described by both its mean and its variance. Data fusion is then used to combine these records into an estimated distribution, which parameterizes a Gaussian distribution from which new parameters are sampled. Inspired by opposition-based learning, we introduce a complementary mutation strategy that employs a symmetric selection mechanism to match the different search abilities the algorithm requires at different stages. The new variant is evaluated on 32 single-objective functions from the CEC 2011 and CEC 2014 benchmark suites, and the results show that DACDE is competitive with 29 other evolutionary algorithms.
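The parameter-adaptation idea above can be illustrated with a minimal sketch. The function names, the inverse-variance fusion rule, and the clipping bounds here are assumptions for illustration only, not the paper's actual method: each generation's successful scale factors F are summarized as a (mean, variance) record, the records are fused into one estimated Gaussian, and new F values are drawn from it.

```python
import random
import statistics

def record_generation(successful_f):
    """Summarize one generation's successful F values as (mean, variance).
    (Hypothetical helper; the paper's record format may differ.)"""
    mu = statistics.fmean(successful_f)
    var = statistics.pvariance(successful_f)
    return mu, max(var, 1e-6)  # floor the variance to keep fusion stable

def fuse_records(records):
    """Fuse (mean, variance) records by inverse-variance weighting,
    one common data-fusion rule (assumed here, not confirmed by the source)."""
    weights = [1.0 / var for _, var in records]
    total = sum(weights)
    mu = sum(w * m for w, (m, _) in zip(weights, records)) / total
    var = 1.0 / total
    return mu, var

def sample_f(records, low=0.0, high=1.0):
    """Draw a new scale factor F from the fused Gaussian, clipped to [low, high]."""
    mu, var = fuse_records(records)
    f = random.gauss(mu, var ** 0.5)
    return min(max(f, low), high)

# Example: two generations of successful F values produce two records,
# which are fused before sampling the next generation's parameter.
records = [record_generation([0.4, 0.5, 0.6]),
           record_generation([0.7, 0.8, 0.9])]
new_f = sample_f(records)
```

Tracking variance as well as the mean lets the sampler widen or narrow its search over F automatically, which is the dispersion information the abstract says mean-only methods discard.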