Abstract

Background

High-quality 3D information on microscopic plant tissue morphology—the spatial organization of cells and intercellular spaces in tissues—helps in understanding physiological processes in a wide variety of plants and tissues. X-ray micro-computed tomography (micro-CT) is a valuable tool, increasingly available in plant research, for obtaining 3D microstructural information on the intercellular pore space and on individual pore sizes and shapes. However, individual cell morphology is difficult to retrieve from micro-CT, as cells cannot be segmented properly due to negligible density differences at cell-to-cell interfaces. To address this, deep learning-based models were trained and tested to segment individual cells using X-ray micro-CT images of parenchyma tissue samples from apple and pear fruit with different cell and porosity characteristics.

Results

The best segmentation model achieved an Aggregated Jaccard Index (AJI) of 0.86 and 0.73 for apple and pear tissue, respectively, an improvement over the current benchmark method, which achieved AJIs of 0.73 and 0.67. Furthermore, the neural network was able to detect other plant tissue structures, such as vascular bundles and stone cell clusters (brachysclereids), the latter of which were shown to strongly influence the spatial organization of pear cells. Based on the AJIs, apple tissue was found to be easier to segment, as its pore space has a higher porosity and a lower specific surface area than that of pear tissue. Moreover, samples with lower pore network connectivity proved very difficult to segment.

Conclusions

The proposed method can be used to automatically quantify 3D cell morphology of plant tissue from micro-CT, avoiding laborious manual annotation or less accurate segmentation approaches. When fruit tissue porosity or pore network connectivity is too low, or the specific surface area of the pore space is too high, native X-ray micro-CT cannot provide proper marker points for cell outlines, and more elaborate contrast-enhancing scan protocols are required.
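The scores above are Aggregated Jaccard Index values, an instance-segmentation metric that jointly penalizes missed, merged, and spurious cells. For readers unfamiliar with the metric, below is a minimal NumPy sketch of how AJI is commonly computed for two labeled instance masks; the function name, tie-breaking, and edge-case handling are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def aggregated_jaccard_index(gt: np.ndarray, pred: np.ndarray) -> float:
    """Illustrative AJI for labeled instance maps (0 = background,
    positive integers = instance IDs); a sketch, not the paper's code."""
    gt_ids = [i for i in np.unique(gt) if i != 0]
    pred_ids = [int(j) for j in np.unique(pred) if j != 0]
    used = set()           # predicted instances matched to some ground truth
    intersection_sum = 0   # numerator: overlaps of best-matched pairs
    union_sum = 0          # denominator: unions of matched pairs + leftovers

    for i in gt_ids:
        g = gt == i
        # predicted labels overlapping this ground-truth cell
        overlapping = np.unique(pred[g])
        overlapping = overlapping[overlapping != 0]
        best_j, best_iou = None, 0.0
        best_inter, best_union = 0, int(g.sum())  # no match: union = |G_i|
        for j in overlapping:
            s = pred == j
            inter = int(np.logical_and(g, s).sum())
            union = int(np.logical_or(g, s).sum())
            iou = inter / union
            if iou > best_iou:
                best_j, best_iou = int(j), iou
                best_inter, best_union = inter, union
        intersection_sum += best_inter
        union_sum += best_union
        if best_j is not None:
            used.add(best_j)

    # predicted cells never matched to any ground-truth cell inflate
    # the denominator, penalizing over-segmentation
    for j in pred_ids:
        if j not in used:
            union_sum += int((pred == j).sum())

    return intersection_sum / union_sum if union_sum > 0 else 0.0
```

Under this definition, a perfect one-to-one segmentation yields an AJI of 1, while both over-segmentation (splitting one cell into several) and under-segmentation (merging neighboring cells) lower the score, which is why it is a stricter measure than per-pixel overlap.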
