The mixture innovation (MI) model places a spike-and-slab mixture distribution on the innovations of time-varying regression coefficients, permitting flexible patterns of time variation while allowing for dynamic shrinkage. Despite its appeal, the standard Bayesian algorithm for block sampling the vector of 0/1 mixture indicators at each time t must evaluate the model likelihood under all 2^K configurations for a regression model with K regressors, and becomes impractical as K grows. As an alternative, a new specification of the MI model is proposed in which the 0/1 mixture indicators of the original MI model are approximated by a logistic function of latent continuous variables. The model likelihood then needs to be evaluated only twice in a Metropolis-Hastings step that block updates the latent variables, and hence the approximated mixture indicators, at each time t, offering a large gain in computational efficiency while retaining the benefits of the MI model. An efficient MCMC algorithm is developed to estimate the new model. A simulation study shows that the new model achieves the same level of estimation accuracy as the original MI model at a much smaller computational cost. The new model is further tested in two empirical applications in which block sampling the mixture indicators at each time t under the original MI model is practically infeasible.
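The computational contrast at the heart of the abstract can be illustrated with a minimal sketch. The code below is purely illustrative (the function names and the standard logistic form are assumptions, not the paper's implementation): it counts the 2^K likelihood evaluations required to enumerate every 0/1 indicator configuration at a single time t, against the two evaluations (current and proposed latent vector) that a Metropolis-Hastings block update of the logistic-approximated indicators requires, independent of K.

```python
import itertools
import math


def block_sampler_evals(K):
    """Likelihood evaluations needed to block sample the 0/1 indicator
    vector at one time t by enumerating all 2**K configurations."""
    return sum(1 for _ in itertools.product([0, 1], repeat=K))


def logistic(z):
    """Smooth approximation of a 0/1 indicator by a logistic function
    of a latent continuous variable z (illustrative form)."""
    return 1.0 / (1.0 + math.exp(-z))


# A Metropolis-Hastings block update of the latent variables evaluates
# the likelihood twice per time t: once at the current draw and once at
# the proposed draw, regardless of the number of regressors K.
MH_EVALS_PER_T = 2

for K in (5, 10, 20):
    print(f"K={K}: enumeration={block_sampler_evals(K)}, MH={MH_EVALS_PER_T}")
```

For K = 20 the exhaustive block sampler already needs over a million likelihood evaluations per time point, while the MH step stays at two, which is the efficiency gain the abstract describes.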