Abstract

Tree-ring analysis is widely used in many fields of science. Among other things, the information carried by annual tree rings allows determining the rates of environmental change and the timing of events. The analysis of tree rings requires prior detection of tree-ring boundaries, which is traditionally performed manually with the use of a stereoscope, a moving table, and a data recorder. This is, however, time-consuming and very cumbersome, especially in the case of long tree-ring series. Several approaches to automatic detection of tree-ring boundaries exist; however, they rely on basic image-processing techniques. As a result, their accuracy is limited, and their application is restricted mainly to conifer wood, where the tree-ring boundaries are clearly defined. Commercial software also exists, but none of it is fully reliable, as it fails when applied to the ring-porous wood type. Therefore, this paper proposes DeepDendro, an automatic tree-ring boundary detector built upon the U-Net convolutional network. To the best of the authors' knowledge, this is the first study to apply ConvNets to automatic detection of tree rings. The performance of the proposed approach was tested on a dataset of images of wood cores of three species representing the ring-porous type of anatomical structure (Quercus sp., Fraxinus excelsior L., and Ulmus sp.). The testing dataset contained over 2500 tree-ring boundaries, 96% of which were determined correctly by the proposed method. The corresponding precision of 0.97 confirms that only a few false boundaries were introduced by the DeepDendro approach. The results were obtained automatically without any user interaction.
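The evaluation described above matches detected boundaries against manually determined reference positions and reports recall (96%) and precision (0.97). The following minimal sketch (not the authors' code) illustrates one way such a pipeline could work after the segmentation step: collapse a per-pixel boundary probability map from a network such as U-Net into a 1-D profile along the core axis, threshold it to locate boundary positions, and score them against reference positions within a pixel tolerance. The function names and the `threshold`/`tolerance` parameters are hypothetical assumptions for illustration.

```python
def detect_boundaries(profile, threshold=0.5):
    """Return one boundary index per contiguous above-threshold run in a
    1-D boundary-probability profile (the run's midpoint).

    `profile` stands in for a U-Net probability map already averaged
    across the core's width; this post-processing choice is illustrative.
    """
    boundaries, run_start = [], None
    for i, p in enumerate(profile):
        if p >= threshold and run_start is None:
            run_start = i                       # a boundary run begins
        elif p < threshold and run_start is not None:
            boundaries.append((run_start + i - 1) // 2)  # midpoint of the run
            run_start = None
    if run_start is not None:                   # run extends to the profile end
        boundaries.append((run_start + len(profile) - 1) // 2)
    return boundaries


def precision_recall(detected, reference, tolerance=2):
    """Score detections against reference boundaries: a detection is a true
    positive if it lies within `tolerance` pixels of a not-yet-matched
    reference boundary."""
    matched, tp = set(), 0
    for d in detected:
        for j, r in enumerate(reference):
            if j not in matched and abs(d - r) <= tolerance:
                matched.add(j)
                tp += 1
                break
    precision = tp / len(detected) if detected else 0.0
    recall = tp / len(reference) if reference else 0.0
    return precision, recall


# Toy example: three above-threshold runs, two of which match references.
profile = [0, 0, 0.9, 0.8, 0, 0, 0, 0.7, 0, 0, 0.6, 0]
found = detect_boundaries(profile)              # → [2, 7, 10]
p, r = precision_recall(found, reference=[2, 8, 20])
```

In this toy run, two of the three detections fall within the tolerance of a reference boundary, giving precision and recall of 2/3 each; the paper's reported figures would correspond to the same kind of tolerance-based matching at dataset scale.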
