Many problems in time-dependent metrology can be phrased mathematically as deconvolution problems. In such a problem, measured data are modeled as the convolution of a known system response function with an unknown source signal, and the goal of deconvolution is to estimate the unknown source signal given knowledge of the system response function. A well-studied method for computing this estimate is Tikhonov-regularized deconvolution, which attempts to balance the bias of the estimated solution (its average deviation from the true source signal) against the variance of the estimated solution. In this article we study this so-called bias-variance tradeoff in the context of estimating a source measured by a high-speed oscilloscope. By assuming bounds on the true source's Fourier coefficients and a structural model for the uncertainties in the system response function, we derive pointwise-in-time confidence intervals for the true signal based on the estimated signal. We demonstrate the new technique with simulations relevant to the high-speed measurement context.
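As a concrete illustration of the method named above (not part of the article itself), Tikhonov-regularized deconvolution can be sketched in the Fourier domain: the data spectrum is divided by the response spectrum, damped by a regularization parameter λ that trades bias against variance. All names, signal shapes, and parameter values below are illustrative assumptions, not the article's actual simulation setup.

```python
import numpy as np

def tikhonov_deconvolve(y, h, lam):
    """Estimate the source x from y = h * x (circular convolution) via
    frequency-domain Tikhonov regularization:
        X_hat(f) = conj(H(f)) * Y(f) / (|H(f)|^2 + lam)
    Larger lam reduces variance (noise amplification) but increases bias."""
    Y = np.fft.fft(y)
    H = np.fft.fft(h, n=len(y))
    X_hat = np.conj(H) * Y / (np.abs(H) ** 2 + lam)
    return np.real(np.fft.ifft(X_hat))

# Toy example: a Gaussian system response blurring a two-pulse source.
n = 256
t = np.arange(n)
x_true = np.zeros(n)
x_true[[60, 150]] = 1.0                    # two unit impulses
h = np.exp(-0.5 * ((t - n // 2) / 4.0) ** 2)
h /= h.sum()                               # unit-area response
h = np.roll(h, -n // 2)                    # center the response at t = 0

rng = np.random.default_rng(0)
y = np.real(np.fft.ifft(np.fft.fft(x_true) * np.fft.fft(h)))
y += rng.normal(scale=1e-3, size=n)        # additive measurement noise

x_hat = tikhonov_deconvolve(y, h, lam=1e-4)
```

Choosing `lam` controls the tradeoff the abstract describes: as `lam` shrinks toward zero the estimate approaches an unbiased but noise-amplifying inverse filter, and as it grows the estimate becomes smoother but increasingly biased toward zero.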