This brief presents a novel approach for predicting the bounds of the time-domain response of a linear system subject to multiple bounded uncertain input parameters. The method leverages the framework of Taylor models in conjunction with the numerical inversion of the Laplace transform (NILT). Different formulations of the NILT are reviewed, and their advantages and limitations are discussed. An implementation relying on an inverse fast Fourier transform proves to be the most efficient and accurate alternative. The technique is validated on three diverse application examples, namely a control loop, a lossy transmission-line network, and an active low-pass filter.