Abstract

Three-dimensional (3D) modeling of non-linear objects from stylized sketches is a challenge even for computer graphics experts. Extracting object parameters from a stylized sketch is a complex and cumbersome task. In the present study, we propose a broker system that transforms a stylized sketch of a tree into a complete 3D model by mediating between a modeler and a 3D modeling software. The input sketches do not need to be accurate or detailed: they need only contain a rudimentary outline of the tree that the modeler wishes to model in 3D. Our approach relies on a well-defined convolutional deep neural network architecture, called TreeSketchNet (TSN), capable of predicting Weber and Penn [1995] parameters from a simple sketch of a tree. These parameters are then interpreted by the modeling software, which generates the 3D model of the tree pictured in the sketch. The training dataset consists of synthetically generated sketches paired with Weber–Penn parameters, produced by a dedicated Blender add-on. The accuracy of the proposed method is demonstrated by testing the TSN with both synthetic and hand-made sketches. Finally, we provide a qualitative analysis of our results by evaluating the coherence of the predicted parameters with several distinguishing features.
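The core idea of the abstract, regressing a small vector of procedural tree parameters directly from a sketch image with a convolutional network, can be illustrated with a toy forward pass. The sketch below is purely illustrative and assumes nothing about the actual TSN architecture: the filter counts, the three output slots (hypothetically labeled branch levels, trunk ratio, and curvature), and all weights are random placeholders, not the trained network or the real Weber–Penn parameter set.

```python
import numpy as np

# Toy convolution-plus-dense "regressor" mapping a grayscale sketch image to a
# small vector of Weber-Penn-style parameters. All layer sizes, weights, and
# output labels are hypothetical placeholders, not the actual TreeSketchNet.

rng = np.random.default_rng(0)

def conv2d(img, kernel):
    """Valid 2D cross-correlation of a single-channel image."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def predict_parameters(sketch):
    """Map an (H, W) sketch to a 3-element parameter vector."""
    kernels = rng.standard_normal((4, 3, 3))   # 4 random 3x3 filters
    feats = np.stack([np.maximum(conv2d(sketch, k), 0)  # conv + ReLU
                      for k in kernels])
    pooled = feats.mean(axis=(1, 2))           # global average pooling
    w = rng.standard_normal((3, 4))            # dense regression head
    b = np.zeros(3)
    return w @ pooled + b                      # [levels, ratio, curvature]

sketch = rng.random((32, 32))                  # stand-in for a sketch image
params = predict_parameters(sketch)
print(params.shape)  # (3,)
```

In the paper's pipeline the analogous predicted vector is handed to the modeling software (Blender, via a Weber–Penn tree-generation add-on), which interprets the parameters and builds the 3D mesh; the network never outputs geometry directly.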
