Abstract

Understanding how line drawings convey three-dimensionality is of fundamental importance in explaining surface perception when photometry is either uninformative or too complex to model analytically. We put forward here a computational model for interpreting line drawings as three-dimensional surfaces, based on constraints on local surface orientation along extremal and discontinuity boundaries. Specific techniques are described for two key processes: recovering the three-dimensional conformation of a space curve (e.g., a surface boundary) from its two-dimensional projection in an image, and interpolating smooth surfaces from orientation constraints along extremal boundaries. The relevance of the model to a general theory of low-level vision is discussed.
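The first of the two processes named above, recovering a space curve from its projection, is underconstrained: infinitely many 3-D curves project to the same image curve, so some preference must break the tie. A minimal illustrative sketch of one such preference is given below. It is not the paper's algorithm; it assumes orthographic projection, treats the image x-y coordinates as fixed, and recovers per-point depths by minimizing the variation of discrete curvature along the 3-D curve (a preference for uniformly curved interpretations). The function names `curvature_variation` and `recover_depth`, the tilt angle, and the sampling are all choices made for this example.

```python
import numpy as np
from scipy.optimize import minimize


def curvature_variation(z, xy):
    """Variation of discrete curvature along the 3-D curve (xy fixed, z free)."""
    p = np.column_stack([xy, z])              # candidate 3-D curve
    d2 = p[2:] - 2.0 * p[1:-1] + p[:-2]       # discrete second differences
    k = np.linalg.norm(d2, axis=1)            # local curvature magnitude (proxy)
    return float(np.sum(np.diff(k) ** 2))     # penalize non-uniform curvature


def recover_depth(xy, z_init):
    """Optimize interior depths; endpoint depths stay at their initial values."""
    z0, z1 = z_init[0], z_init[-1]

    def objective(z_interior):
        z = np.concatenate([[z0], z_interior, [z1]])
        return curvature_variation(z, xy)

    res = minimize(objective, z_init[1:-1], method="L-BFGS-B")
    return np.concatenate([[z0], res.x, [z1]])


# Demo: a circular arc tilted out of the image plane, viewed under
# orthographic projection (its image is an elliptical arc). The true
# depths are hidden; we start from a noisy guess and let the
# uniform-curvature preference pull the curve back toward a circle.
t = np.linspace(0.0, np.pi, 25)
tilt = np.pi / 4
xy = np.column_stack([np.cos(t), np.sin(t) * np.cos(tilt)])  # image curve
z_true = np.sin(t) * np.sin(tilt)                            # hidden depth

rng = np.random.default_rng(0)
z = recover_depth(xy, z_true + 0.002 * rng.standard_normal(t.size))
```

In this setup the planar (all-depths-zero) reading of the elliptical image curve has non-uniform curvature, while the tilted circular interpretation has constant curvature, so the optimizer prefers the latter; this is one concrete way an "infinitely ambiguous" projection becomes a definite 3-D percept.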
