Within the framework of linear stability theory, a simple non-stationary boundary condition is developed to simulate the far-field asymptotic behavior of linear wave packets propagating in a compressible boundary layer. This condition allows us to skip the linear stage of instability evolution in direct numerical simulations (DNS) of laminar-turbulent transition. We also suggest performing numerical simulations of the linear stage without recalculating the Jacobian matrix in the Newton iteration procedure. The robustness and efficiency of both approaches are evaluated by computing the first-mode wave packet propagating in the boundary layer on a flat plate at freestream Mach number 2. DNS of the nonlinear stage show that the wave packet quickly breaks down into a turbulent spot if its hump amplitude reaches u′_max ≥ 0.1U∞. This supports an amplitude-based criterion for transition onset. The spectral analysis indicates that the breakdown mechanism differs from the oblique, fundamental, and subharmonic resonance mechanisms and requires further study.
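The frozen-Jacobian idea mentioned above (evaluate the Jacobian once and reuse it across Newton corrections, often called the chord method) can be sketched for a generic nonlinear system. The function `F`, its Jacobian `J`, and the tolerances below are illustrative placeholders, not the paper's actual stability solver:

```python
import numpy as np

def chord_newton(F, J, x0, tol=1e-10, max_iter=50):
    """Newton iteration with a frozen Jacobian (chord method):
    the Jacobian is evaluated once at the initial guess x0 and
    reused for every subsequent correction, trading quadratic
    convergence for a much cheaper linear solve per step."""
    x = np.asarray(x0, dtype=float)
    J0 = J(x)                              # Jacobian computed only once
    for _ in range(max_iter):
        r = F(x)
        if np.linalg.norm(r) < tol:
            break
        x = x - np.linalg.solve(J0, r)     # reuse the frozen Jacobian J0
    return x

# Illustrative 2-D test system with a root at (1, 1)
F = lambda x: np.array([x[0]**2 + x[1]**2 - 2.0,
                        x[0] - x[1]])
J = lambda x: np.array([[2.0 * x[0], 2.0 * x[1]],
                        [1.0,        -1.0]])

root = chord_newton(F, J, np.array([2.0, 0.5]))  # converges to the root near (1, 1)
```

Freezing the Jacobian sacrifices the quadratic convergence of full Newton for a linear rate, but each iteration avoids reassembling and refactoring the matrix, which is typically the dominant cost in large discretized stability problems.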