Bayesian optimization (BO) is a global optimization algorithm well suited to multimodal functions that are costly to evaluate, e.g., quantities derived from computationally expensive simulations. Recent advances have made it possible to scale BO to high-dimensional functions and to accelerate its convergence by incorporating derivative information. These developments lay the groundwork for a productive interplay between BO and adjoint solvers, which cheaply compute gradients of an objective function with respect to the tunable parameters of a simulated physical system. In thermoacoustics, adjoint-based optimization has previously been applied to Helmholtz solvers and low-order network models to find optimally stable combustor configurations. These studies used conjugate gradient or quasi-Newton optimizers, which can become stuck in local optima and may require many evaluations of the underlying model to find a good optimum. In this paper, we propose using gradient-augmented BO to optimize adjoint models. We consider two test cases from the thermoacoustics literature: optimizing design parameters in a 1D adjoint Helmholtz model of a Rijke tube, and geometry optimization in a low-order network model of a longitudinal combustor. We show that, compared with BFGS, a standard quasi-Newton method, our gradient-enhanced BO finds multiple, more stable configurations using considerably fewer solver evaluations. This approach holds great promise for efficient thermoacoustic stabilization when designing with expensive 3D adjoint Helmholtz solvers.
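The core idea, a Gaussian process surrogate conditioned on both function values and adjoint gradients, can be illustrated with a minimal sketch. The toy objective, RBF hyperparameters, and lower-confidence-bound acquisition below are illustrative choices of our own, not the paper's solvers or BO implementation; the sketch only shows how gradient observations enter the GP via kernel derivatives.

```python
import numpy as np

# Hypothetical 1D stand-in for an expensive adjoint solver: like an adjoint
# code, it returns both the objective and its exact gradient.
def objective(x):
    return np.sin(3 * x) + 0.5 * x**2

def gradient(x):
    return 3 * np.cos(3 * x) + x

LENGTH = 0.5  # RBF length scale (fixed by hand for this sketch, not tuned)

def joint_kernel(xa, xb):
    """Covariance between stacked observations [f(xa); f'(xa)] and
    [f(xb); f'(xb)] under an RBF prior k = exp(-(x - x')^2 / (2 l^2))."""
    d = xa[:, None] - xb[None, :]
    k = np.exp(-d**2 / (2 * LENGTH**2))
    k_x2 = d / LENGTH**2 * k                        # dk/dx'
    k_x1 = -d / LENGTH**2 * k                       # dk/dx
    k_x1x2 = (1 / LENGTH**2 - d**2 / LENGTH**4) * k  # d^2 k / dx dx'
    return np.block([[k, k_x2], [k_x1, k_x1x2]])

def gp_posterior(X, y, Xs):
    """Posterior mean/std of f at Xs, given y = [f(X); f'(X)]."""
    K = joint_kernel(X, X) + 1e-4 * np.eye(2 * len(X))  # jitter for stability
    d = Xs[:, None] - X[None, :]
    k = np.exp(-d**2 / (2 * LENGTH**2))
    Ks = np.hstack([k, d / LENGTH**2 * k])  # Cov(f(Xs), [f(X); f'(X)])
    mean = Ks @ np.linalg.solve(K, y)
    var = 1.0 - np.sum(Ks * np.linalg.solve(K, Ks.T).T, axis=1)
    return mean, np.sqrt(np.maximum(var, 1e-12))

# BO loop: each solver call contributes a value AND a gradient observation.
grid = np.linspace(-2.0, 2.0, 401)
idx = [50, 300]  # initial designs at x = -1.5 and x = 1.0
for _ in range(15):
    X = grid[idx]
    y = np.concatenate([objective(X), gradient(X)])
    mean, std = gp_posterior(X, y, grid)
    acq = mean - 2.0 * std        # lower confidence bound (minimization)
    acq[idx] = np.inf             # do not re-sample visited grid points
    idx.append(int(np.argmin(acq)))

X = grid[idx]
best = X[np.argmin(objective(X))]
```

Because each evaluation adds a gradient observation alongside the function value, the surrogate tightens much faster per solver call than a value-only GP, which is the mechanism behind the evaluation savings reported above.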