Phase-contrast MRI (PC-MRI) is used clinically to measure velocities in the body, but systematic background phase errors caused by magnetic field imperfections corrupt the velocity measurements with offsets that limit clinical utility. This work aims to minimize systematic background phase errors in PC-MRI and thereby maximize the accuracy of velocity measurements. The MRI scanner's background phase errors arising from eddy currents and mechanical oscillations were modeled using the gradient impulse response function (GIRF). Gradient waveforms were then numerically optimized using the GIRF to create pulse sequences that minimize background phase errors. The pulse sequences were tested in a static phantom, where the predicted response could be compared directly to the measured background velocity. The optimized acquisitions were then tested in human subjects, where flow rates and background errors were compared to conventional PC-MRI. With the GIRF-optimized gradient waveforms, the predicted background phase was within 0.6 mm/s [95% CI: -3.4, 5.4 mm/s] of the measured background phase in the static phantom. Excellent agreement was seen for in vivo blood flow values (flow rate agreement = 0.96), and the background phase was reduced by 78.8 ± 18.7%. This work shows that using a GIRF to model the effects of magnetic field imperfections, combined with numerically optimized gradient waveforms, enables PC-MRI waveforms to be designed that produce minimal background phase in the most time-efficient manner. The methodology could be adapted to other MRI sequences in which similar magnetic field errors affect measurements.
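To make the modeling step concrete, the following is a minimal sketch of the relations assumed in a standard linear time-invariant GIRF formulation; the notation is illustrative and not taken from this work. Per gradient axis \(k\), the gradient actually produced by the scanner is the nominal waveform filtered by the GIRF (a frequency-domain product),
\[
\tilde{G}^{\mathrm{act}}_{k}(\omega) = \mathrm{GIRF}_{k}(\omega)\,\tilde{G}^{\mathrm{nom}}_{k}(\omega),
\]
with analogous zeroth-order (\(B_0\)) response terms describing field offsets induced by gradient activity. The background phase at position \(\mathbf{r}\) is then the residual phase difference between the two velocity encodings,
\[
\Delta\phi_{\mathrm{bg}}(\mathbf{r}) = \gamma \int_{0}^{T_E} \Big[\Delta B_{0}^{(1)}(t)-\Delta B_{0}^{(2)}(t) + \big(\Delta\mathbf{G}^{(1)}(t)-\Delta\mathbf{G}^{(2)}(t)\big)\cdot\mathbf{r}\Big]\,dt,
\]
where \(\Delta B_0\) and \(\Delta\mathbf{G}\) denote the GIRF-predicted deviations from the nominal fields for each encoding. Dividing by the velocity-encoding sensitivity \(\gamma\,\Delta m_1\), with \(\Delta m_1\) the first-moment difference between the two encodings, converts this phase offset into the background velocity error that the gradient waveform optimization seeks to minimize.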