Abstract

Different task-based and resting-state imaging datasets provide complementary information about the organization of the human brain. Brain parcellations based on single datasets will therefore be biased towards the particular type of information present in each dataset. To overcome this limitation, we propose here a hierarchical Bayesian framework that can learn a probabilistic brain parcellation across numerous task-based and resting-state datasets, exploiting their combined strengths. The framework is partitioned into a spatial arrangement model that defines the probability of each voxel belonging to a specific parcel (the probabilistic group atlas), and a set of dataset-specific emission models that define the probability of the observed data given the parcel of the voxel. Using the human cerebellum as an example, we show that the framework optimally combines information from different datasets to achieve a new population-based atlas that outperforms atlases based on single datasets. Furthermore, we demonstrate that using only 10 min of individual data, the framework is able to generate individual brain parcellations that outperform group atlases.
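The abstract implies a generative factorization: a shared arrangement model over parcel assignments, multiplied by independent emission models for each dataset. A minimal sketch of that factorization is given below; the symbols U (voxel-wise parcel assignments), Y^{(s)} (data from dataset s), and θ_A, θ_E^{(s)} (arrangement and emission parameters) are our own illustrative labels, not notation taken from the paper.

% Hypothetical sketch of the hierarchical factorization described above.
% U: parcel assignment of each voxel; Y^{(s)}: observed data from dataset s, s = 1..S.
% theta_A: arrangement-model parameters; theta_E^{(s)}: emission parameters for dataset s.
\[
  p\bigl(U, Y^{(1)}, \dots, Y^{(S)}\bigr)
  \;=\;
  \underbrace{p\bigl(U \mid \theta_{A}\bigr)}_{\text{spatial arrangement model}}
  \;\prod_{s=1}^{S}
  \underbrace{p\bigl(Y^{(s)} \mid U,\, \theta_{E}^{(s)}\bigr)}_{\text{dataset-specific emission models}}
\]

Under this reading, the group atlas corresponds to the marginal parcel probabilities implied by the arrangement model, while an individual parcellation follows from conditioning on that individual's data through the emission models.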