The integration of Distributed Energy Resources (DERs) into distribution systems becomes problematic when the penetration level exceeds the system's DER hosting capacity. Although several studies propose methods to quantify DER hosting capacity, methodologies to maximize it in distribution systems are scarce. Some approaches propose installing var compensators, resulting in costly solutions; others propose lower-cost solutions such as network reconfiguration or optimized Volt–Var curves. However, the combination of these approaches has not been explored. Therefore, this paper proposes the simultaneous network reconfiguration and optimized setting of the grid-tie inverter Volt–Var curve to maximize DER hosting capacity and minimize power losses. Uncertainties are addressed using Monte Carlo simulation, while the Non-Dominated Sorting Genetic Algorithm II (NSGA-II) handles the multi-objective problem. Simulations are performed using Matlab and OpenDSS on the IEEE 33-bus test system. Finally, the results demonstrate the superiority of the proposed approach over previous studies that performed network reconfiguration and Volt–Var optimization separately.
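To illustrate the multi-objective machinery mentioned above, the following is a minimal sketch of the non-dominated sorting step at the core of NSGA-II, assuming both objectives are cast as minimization (power losses, and hosting capacity negated). The objective values and helper names are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of extracting the first Pareto front, the key step of
# NSGA-II's non-dominated sorting. Objective vectors are hypothetical:
# (power losses in kW, negated hosting capacity in kW), both minimized.

def dominates(a, b):
    """True if solution a Pareto-dominates solution b (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated_front(population):
    """Return the objective vectors not dominated by any other member."""
    return [p for p in population
            if not any(dominates(q, p) for q in population if q != p)]

# Illustrative candidate solutions (e.g., different reconfiguration /
# Volt-Var settings evaluated by a power-flow run):
pop = [(120.0, -800.0), (150.0, -950.0), (200.0, -700.0), (110.0, -780.0)]
front = non_dominated_front(pop)
```

In a full NSGA-II implementation, successive fronts are peeled off in the same way and a crowding-distance measure breaks ties within a front; here only the dominance test is shown.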