Today, the field of practical estimation is dominated by linear-Bayesian estimators, as these familiar techniques are computationally efficient and can be quickly tailored to many systems of interest. However, beyond the well-known sensitivity of linear estimators to non-Gaussian noise and nonlinear transformations, Bayes' rule itself lacks robustness, as it inherently assumes that all statistical models are exactly known. Therefore, in lieu of filters outfitted with linear-Bayesian updates, this work proposes a method to derive nonlinear, non-Bayesian updates from first principles using the theory of generalized variational inference (GVI), an optimization-based approach to information fusion. By selecting different divergence measures, loss functions, and feasible distributions, a wide variety of robust and conservative filter updates can be quickly built and tested using numerical optimization. From these GVI candidates, a suitable update that uses the robust γ-loss function is selected on the basis of its desirable estimation behavior. Via analytical optimization and Gaussian mixture modeling, the γ-loss GVI update is realized as a nonlinear, closed-form filter that is well-suited to practical estimation.
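As a sketch of the optimization problem the abstract refers to (the symbols below are illustrative and follow the standard GVI formulation, not notation fixed by this abstract), a GVI update selects the posterior by minimizing an expected loss regularized by a divergence to the prior:

\[
q^{*}(\theta) \;=\; \operatorname*{arg\,min}_{q \in \Pi} \; \mathbb{E}_{q(\theta)}\!\left[\ell(\theta, y)\right] \;+\; D\!\left(q \,\middle\|\, \pi\right),
\]

where $\pi(\theta)$ is the prior (predicted) density, $y$ is the measurement, $\ell$ is the loss function (e.g., the γ-loss in place of the negative log-likelihood), $D$ is the chosen divergence, and $\Pi$ is the feasible set of distributions. Standard Bayesian updating is recovered as the special case $\ell(\theta, y) = -\log p(y \mid \theta)$, $D$ the Kullback-Leibler divergence, and $\Pi$ the set of all distributions.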