State-space models (SSMs) are commonly used to model time series data where the observations depend on an unobserved latent process. However, inference on the model parameters of an SSM can be challenging, especially when the likelihood of the data given the parameters is not available in closed form. One approach is to jointly sample the latent states and model parameters via Markov chain Monte Carlo (MCMC) and/or sequential Monte Carlo approximation. These methods can be inefficient, mixing poorly when there are many highly correlated latent states or parameters, or when there is a high rate of sample impoverishment in the sequential Monte Carlo approximations. We propose a novel block proposal distribution for Metropolis-within-Gibbs sampling on the joint latent state and parameter space. The proposal distribution is informed by a deterministic hidden Markov model (HMM), defined such that the usual theoretical guarantees of MCMC algorithms apply. We discuss how the HMMs are constructed, the generality of the approach arising from its tuning parameters, and how these tuning parameters can be chosen efficiently in practice. We demonstrate that the proposed algorithm using HMM approximations provides an efficient alternative for fitting state-space models, even those that exhibit near-chaotic behavior.
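To make the role of the HMM approximation concrete, the following is a minimal sketch in notation of our own choosing (latent states $x_{1:T}$, observations $y_{1:T}$, parameters $\theta$, and a state block $x_{a:b}$; none of these symbols are fixed by the abstract). If $q$ denotes the block proposal distribution built from the deterministic HMM, a proposed block $(x'_{a:b}, \theta')$ would be accepted with the standard Metropolis--Hastings probability
\[
\alpha \;=\; \min\!\left\{ 1,\;
\frac{p\!\left(x'_{a:b},\,\theta' \mid y_{1:T},\, x_{\setminus(a:b)}\right)\,
      q\!\left(x_{a:b},\,\theta \mid x'_{a:b},\,\theta'\right)}
     {p\!\left(x_{a:b},\,\theta \mid y_{1:T},\, x_{\setminus(a:b)}\right)\,
      q\!\left(x'_{a:b},\,\theta' \mid x_{a:b},\,\theta\right)}
\right\},
\]
where $x_{\setminus(a:b)}$ collects the latent states outside the block. In this standard construction the HMM enters only through $q$; since the acceptance ratio is evaluated under the exact target density $p$, the usual MCMC convergence guarantees hold regardless of the quality of the approximation, which is the generic mechanism by which guarantees of this kind follow.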