Abstract

Sawmills rely on locating the wood pith at the end surface of a stem, either to automate processing and maximize yield or to assess the quality of wood planks. Previously, the pith location was approximated by applying image-processing algorithms together with geometric relations between the annual rings in a cross-sectional image of the wood and its pith. This approach fails when the wood stem deviates from a circular shape. In this paper, we propose applying YOLO (You Only Look Once) object detection to the wood pith identification problem. A 23-layer Tiny-YOLO network was trained using a transfer learning approach on 345 wood cross-sectional images with pith annotations. Experiments with varying ratios between the training and testing datasets and varying numbers of training iterations were carried out to produce checkpoints, which were then evaluated against the testing datasets. We found that the minimum loss value, commonly used to indicate the best-trained model in other deep neural network (DNN) object detectors, is not effective for this application. Therefore, we used detection accuracy, i.e., the percentage of piths detected when inferencing the trained model on its testing dataset, together with the average distance error, i.e., the average distance between the detected piths and their ground truths, as the accuracy measurements. The experimental results show that a training-to-testing ratio of 70:30 at 170,000 training iterations yields the maximum detection accuracy of 76.3% with a 16.6-pixel average distance error. Moreover, compared with the state-of-the-art non-DNN-based pith detection approach, it produces only 25% of the average distance error ($\frac{1034}{4176}$) and 50% of the standard deviation ($\frac{2533}{5395}$).
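The two evaluation metrics described above can be sketched as follows. This is a minimal illustration, not the authors' code: the function name, the list-based input pairing, and the convention of marking a missed pith with `None` are all assumptions made for the example.

```python
import math

def evaluate_pith_detections(detections, ground_truths):
    """Hypothetical evaluation sketch for the two metrics in the abstract:
    detection accuracy (fraction of piths that were detected at all) and
    average distance error (mean Euclidean pixel distance between each
    detected pith and its ground-truth location).

    detections[i] is an (x, y) pixel coordinate, or None if the model
    failed to detect a pith in image i; ground_truths[i] is the annotated
    (x, y) pith location for the same image.
    """
    distances = []
    detected = 0
    for det, gt in zip(detections, ground_truths):
        if det is None:
            continue  # missed pith: lowers detection accuracy only
        detected += 1
        distances.append(math.hypot(det[0] - gt[0], det[1] - gt[1]))
    detection_accuracy = detected / len(ground_truths)
    # Average distance error is computed over detected piths only.
    avg_distance_error = (sum(distances) / len(distances)
                          if distances else float("nan"))
    return detection_accuracy, avg_distance_error
```

For example, with three test images where one pith is missed and the other two are found at 0 and 5 pixels from their ground truths, the function reports a detection accuracy of 2/3 and an average distance error of 2.5 pixels.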
