Abstract
Patterns on real-world objects are often due to variations in geometry across the surface. Height fields and other common parametric methods cannot synthesize many forms of geometric surface texture such as thorns, scales, and bark. We present an example-based technique for synthesizing a variety of geometric textures on a model's surface. The applied textures can be from models specifically created for this purpose, or may be drawn from user-specified regions of an example model. We extend existing neighborhood-based texture synthesis algorithms to operate on volumetric models. Similar to image analogies [11], given a training pair of unfiltered and filtered source models and an unfiltered destination model (all volumetric grids), we synthesize a filtered fourth model that exhibits the desired geometric texture. The user defines vector fields to specify the direction of texture anisotropy on the source and destination models. The vector field defines a coordinate frame on the destination object's surface that is used to sample the voxel density values in the neighborhood near a given voxel, which then gives a feature vector that is matched to the neighborhoods in the source model. Destination voxels are visited in an order that is dictated by the vector field. We show geometric synthesis results on a variety of models using textures such as pits, grooves, thru-holes and thorns.
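The following sketch illustrates the neighborhood-matching step described above: a local frame derived from the user-supplied vector field orients a voxel-density neighborhood around a destination voxel, and that feature vector is compared against source neighborhoods, image-analogies style, to copy a filtered density value. This is a minimal illustration under assumed conventions, not the paper's implementation; the helper names (`local_frame`, `neighborhood_feature`, `synthesize_voxel`), the nearest-neighbor sampling, and the brute-force L2 search are all simplifications introduced here.

```python
import numpy as np

def local_frame(direction):
    """Build an orthonormal frame (t, b, n) from the vector field's direction
    at a surface voxel. The choice of the remaining two axes is an arbitrary
    but consistent convention (illustrative helper, not from the paper)."""
    t = direction / np.linalg.norm(direction)
    helper = np.array([0.0, 0.0, 1.0]) if abs(t[2]) < 0.9 else np.array([1.0, 0.0, 0.0])
    b = np.cross(t, helper)
    b /= np.linalg.norm(b)
    n = np.cross(t, b)
    return np.stack([t, b, n])  # rows are the frame axes

def neighborhood_feature(volume, center, frame, radius=2):
    """Sample voxel densities on a (2r+1)^3 block of offsets rotated into the
    local frame; nearest-voxel lookup keeps the sketch simple."""
    offs = np.arange(-radius, radius + 1)
    feat = []
    for i in offs:
        for j in offs:
            for k in offs:
                p = np.asarray(center, float) + frame.T @ np.array([i, j, k], float)
                idx = np.clip(np.round(p).astype(int), 0, np.array(volume.shape) - 1)
                feat.append(volume[tuple(idx)])
    return np.array(feat)

def synthesize_voxel(A, A_filtered, B, b_voxel, b_frame, a_samples, radius=2):
    """Match the destination neighborhood in B against candidate source
    neighborhoods in A and copy the corresponding filtered density from A'.
    `a_samples` is a list of (source_voxel, source_frame) pairs; a real system
    would accelerate this search rather than scan it exhaustively."""
    query = neighborhood_feature(B, b_voxel, b_frame, radius)
    best, best_d = None, np.inf
    for a_voxel, a_frame in a_samples:
        cand = neighborhood_feature(A, a_voxel, a_frame, radius)
        d = np.sum((query - cand) ** 2)  # L2 distance on density neighborhoods
        if d < best_d:
            best, best_d = a_voxel, d
    return A_filtered[tuple(best)]
```

In this reading, `A`/`A_filtered` play the role of the unfiltered and filtered source grids, `B` the unfiltered destination grid, and the per-voxel frames encode the texture anisotropy direction; destination voxels would be visited in the order dictated by the vector field, each assigned the filtered density returned by `synthesize_voxel`.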