Abstract
When faced with multiple minima of an inner-level convex optimization problem, the convex bilevel optimization problem selects an optimal solution that also minimizes an auxiliary outer-level convex objective of interest. Bilevel optimization requires a different approach than single-level optimization, since the set of minimizers of the inner-level objective is not given explicitly. In this paper, we propose a new projection-free conditional gradient method for convex bilevel optimization that requires only a linear optimization oracle over the base domain. We establish $O(t^{-1/2})$ convergence rate guarantees for our method in terms of both the inner- and outer-level objectives, and demonstrate how additional assumptions such as quadratic growth and strong convexity yield accelerated rates of up to $O(t^{-1})$ and $O(t^{-2/3})$ for the inner and outer levels, respectively. Lastly, we conduct a numerical study to demonstrate the performance of our method.
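To make the setting concrete, the following is a minimal sketch of a projection-free (conditional gradient / Frank-Wolfe) iteration for a bilevel-style problem, assuming the base domain is the probability simplex so the linear optimization oracle has a closed form. The way the inner and outer gradients are combined here (a decaying weight `sigma_t`) and the step-size choice are illustrative assumptions, not the algorithm proposed in the paper.

```python
import numpy as np

def lmo_simplex(grad):
    """Linear optimization oracle over the simplex: argmin_{s in simplex} <grad, s>."""
    s = np.zeros_like(grad)
    s[np.argmin(grad)] = 1.0  # optimum is attained at a vertex
    return s

def conditional_gradient_bilevel(grad_inner, grad_outer, x0, iters=1000):
    """Frank-Wolfe on the weighted objective g(x) + sigma_t * f(x) (illustrative scheme)."""
    x = x0.copy()
    for t in range(iters):
        sigma_t = 1.0 / np.sqrt(t + 1)           # decaying outer-level weight (assumption)
        grad = grad_inner(x) + sigma_t * grad_outer(x)
        s = lmo_simplex(grad)                    # only a linear oracle over the domain is needed
        gamma = 2.0 / (t + 2)                    # standard open-loop step size
        x = (1 - gamma) * x + gamma * s          # convex combination stays feasible, no projection
    return x

if __name__ == "__main__":
    # Inner objective ||Ax - b||^2 has many minimizers (underdetermined system);
    # the outer objective ||x||^2 selects a small-norm solution among them.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((5, 20))
    b = A @ (np.ones(20) / 20)
    grad_inner = lambda x: 2 * A.T @ (A @ x - b)
    grad_outer = lambda x: 2 * x
    x = conditional_gradient_bilevel(grad_inner, grad_outer, np.ones(20) / 20)
    print("inner residual:", np.linalg.norm(A @ x - b), " outer value:", x @ x)
```

The key point this sketch illustrates is that each iteration touches the feasible set only through a linear minimization step, so no projection onto the domain is ever computed.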