Example-based texture synthesis algorithms have gained widespread popularity for their ability to take a single input image and create a perceptually similar non-periodic texture. However, previous methods rely on single input exemplars that can capture only a limited band of spatial scales. For example, synthesizing a continent-like appearance at a variety of zoom levels would require an impractically high input resolution. In this paper, we develop a multiscale texture synthesis algorithm. We propose a novel example-based representation, which we call an exemplar graph, that requires only a few low-resolution input exemplars at different scales. Moreover, by allowing loops in the graph, we can create infinite zooms and infinitely detailed textures that are impossible with current example-based methods. We also introduce a technique that ameliorates inconsistencies in the user's input, and show that it yields improved interscale coherence and higher visual quality. We demonstrate optimizations for both CPU and GPU implementations of our method, and use them to produce animations with zooming and panning at multiple scales, as well as static gigapixel-sized images with features spanning many spatial scales.
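
To make the exemplar graph idea concrete, the following is a minimal, illustrative sketch of such a structure: nodes hold low-resolution exemplars at different scales, directed edges indicate which exemplar supplies the finer-scale detail when zooming in, and a self-loop makes a scale self-similar so the zoom can continue indefinitely. All names here (ExemplarNode, ExemplarGraph, add_edge, zoom_chain) are hypothetical and not the paper's actual data structures; this is only a sketch of the representation described above, not the synthesis algorithm itself.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

import numpy as np


@dataclass
class ExemplarNode:
    name: str
    image: np.ndarray                      # low-resolution exemplar (H x W x 3)
    # outgoing edges: (child node name, zoom factor between the two exemplars)
    edges: List[Tuple[str, float]] = field(default_factory=list)


class ExemplarGraph:
    def __init__(self) -> None:
        self.nodes: Dict[str, ExemplarNode] = {}

    def add_node(self, name: str, image: np.ndarray) -> None:
        self.nodes[name] = ExemplarNode(name, image)

    def add_edge(self, parent: str, child: str, zoom: float) -> None:
        # A loop (e.g. parent == child) makes the texture self-similar at that
        # scale, which is what enables an unbounded ("infinite") zoom.
        self.nodes[parent].edges.append((child, zoom))

    def zoom_chain(self, start: str, depth: int) -> List[str]:
        """List the exemplars a synthesizer would consult while zooming in
        `depth` times, following the first outgoing edge at each step."""
        chain, current = [start], start
        for _ in range(depth):
            if not self.nodes[current].edges:
                break
            current = self.nodes[current].edges[0][0]
            chain.append(current)
        return chain


# Hypothetical usage: a coarse "continent" exemplar refines into a "coastline"
# exemplar, which loops on itself so the zoom can continue indefinitely.
g = ExemplarGraph()
g.add_node("continent", np.zeros((128, 128, 3)))
g.add_node("coastline", np.zeros((128, 128, 3)))
g.add_edge("continent", "coastline", zoom=4.0)
g.add_edge("coastline", "coastline", zoom=4.0)   # loop -> infinite zoom
print(g.zoom_chain("continent", depth=5))
```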