Abstract
A few methods and tools are available for the quantitative measurement of brain volume, targeting mainly brain volume loss. However, several factors, such as clinical condition, time of day, type of MRI scanner, brain volume artifacts, pseudoatrophy, and variations among acquisition protocols, produce extreme variations that can lead to misdiagnosis of brain atrophy. Since brain white matter loss is a characteristic lesion of neurodegeneration, the main objective of this study was to create a computational tool for high-precision measurement of structural brain changes based on the fractal dimension (FD). The validation of the BrainFD software is based on T1-weighted MRI images from the Open Access Series of Imaging Studies (OASIS)-3 brain database, in which each participant has multiple MRI scan sessions. The software is written in Python and Java, and its main functionality is the calculation of the FD with the box-counting algorithm over the same brain regions across different subjects, with high accuracy and resolution. It offers the ability to compare brain regions from different subjects and across multiple sessions, creating distinct imaging profiles based on the participants' Clinical Dementia Rating (CDR) scores. Two experiments were executed. The first was a cross-sectional study in which the data were separated into two CDR classes. In the second experiment, a model was trained on multiple heterogeneous data, and the FD was evaluated for each participant of the OASIS-3 database across multiple sessions. The results suggest that FD variation efficiently describes the structural complexity of the brain and the related cognitive decline. Additionally, the FD efficiently discriminates the two classes, achieving 100% accuracy, and this classification outperforms currently existing methods in terms of accuracy and dataset size. Therefore, FD calculation for identifying intracranial brain volume loss could be applied as a potential low-cost personalized imaging biomarker. Furthermore, the possibility of measuring different brain areas and subregions could provide physicians and radiologists with robust evidence of even the slightest variations in imaging data obtained from repeated measurements.
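As an illustration of the box-counting principle named above, the following short Python sketch estimates the FD of a 2D binary mask. The function name box_counting_fd, the choice of box sizes, and the synthetic demo mask are assumptions made for this example only; the sketch does not reproduce the BrainFD software or the OASIS-3 preprocessing pipeline.

    # Minimal 2D box-counting sketch for a pre-segmented binary MRI slice.
    # Illustrative only; region selection and thresholds used by BrainFD
    # are not reproduced here.
    import numpy as np

    def box_counting_fd(mask, box_sizes=(2, 4, 8, 16, 32)):
        """Estimate the fractal dimension of a 2D binary mask.

        For each box size s, the mask is tiled with non-overlapping s x s
        boxes (no gaps, no overlaps) and the boxes containing at least one
        foreground voxel are counted. The FD is the slope of
        log N(s) versus log(1/s).
        """
        counts = []
        for s in box_sizes:
            # Trim the mask so it divides evenly into s x s boxes.
            h, w = (mask.shape[0] // s) * s, (mask.shape[1] // s) * s
            trimmed = mask[:h, :w]
            # Mark each s x s box that contains any foreground voxel.
            boxes = trimmed.reshape(h // s, s, w // s, s).any(axis=(1, 3))
            counts.append(boxes.sum())
        # Linear fit of log N(s) against log(1/s); the slope estimates the FD.
        slope, _ = np.polyfit(np.log(1.0 / np.array(box_sizes)),
                              np.log(np.array(counts)), 1)
        return slope

    if __name__ == "__main__":
        # Synthetic demo mask; a real run would use a segmented T1 slice.
        rng = np.random.default_rng(0)
        demo_mask = rng.random((256, 256)) > 0.5
        print(box_counting_fd(demo_mask))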
Highlights
The quantitative measurement of human brain volumes using segmentation software is highly correlated with the monitoring of neurodegenerative disorders (Jack et al., 2000; Kovacevic et al., 2009; Alexiou et al., 2017, 2019, 2020; Mantzavinos and Alexiou, 2017; Chatzichronis et al., 2019).
A measured brain volume decline can be characterized as reliable and unbiased only if the loss is large enough (Narayanan et al., 2020), while brain lesions and brain atrophy associated with mild cognitive impairment (MCI) exceed the expected decline per year in non-demented older adults.
Recent clinical studies suggest that cortical functional connectivity networks show fractal properties and that fluctuations in the fractal dimension (FD) of brain gray and white matter are highly correlated with cognitive decline (Ha et al., 2005; Im et al., 2006; Li et al., 2007; King et al., 2009; Mustafa et al., 2012; Varley et al., 2020).
Summary
The quantitative measurement of human brain volumes using segmentation software is highly correlated with the monitoring of neurodegenerative disorders (Jack et al., 2000; Kovacevic et al., 2009; Alexiou et al., 2017, 2019, 2020; Mantzavinos and Alexiou, 2017; Chatzichronis et al., 2019). Biological structures such as the brain gray matter usually have rough surfaces and are characterized by heterogeneity and self-similarity. This mathematical self-similarity is the repetition of the whole structure at progressively smaller scales. If a fractal shape has a dimension of 2.3, it is simpler than a 3D cube but more complex than a 2D square. This non-Euclidean approach can be applied to minimize errors in the visualization of brain gray matter. The abovementioned procedure is the traditional box-counting method, in which the covering squares leave no gaps and do not overlap (Yadav and Nishikanta, 2010).
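For reference, the standard box-counting (Minkowski-Bouligand) definition of the fractal dimension underlying this description can be written as follows, where N(ε) is the number of boxes of side length ε needed to cover the structure; this is the textbook formulation, not a quotation from the BrainFD documentation:

    D = \lim_{\varepsilon \to 0} \frac{\log N(\varepsilon)}{\log\left(1/\varepsilon\right)}

Under this definition, a gray-matter surface with D ≈ 2.3 fills space more than a plane (D = 2) but less than a solid volume (D = 3), matching the comparison with the square and the cube above.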