Abstract

Understanding how the human brain works has attracted increasing attention in both neuroscience and machine learning. Previous studies use autoencoders and generative adversarial networks (GANs) to improve the quality of stimulus image reconstruction from functional magnetic resonance imaging (fMRI) data. However, these methods mainly focus on acquiring relevant features between two different modalities of data, i.e., stimulus images and fMRI, while ignoring the temporal information in fMRI data, which leads to suboptimal performance. To address this issue, in this article, we propose a temporal information-guided GAN (TIGAN) to reconstruct visual stimuli from human brain activities. Specifically, the proposed method consists of three key components: 1) an image encoder that maps the stimulus images into a latent space; 2) a long short-term memory (LSTM) generator for fMRI feature mapping, which captures the temporal information in fMRI data; and 3) a discriminator for image reconstruction, which makes the reconstructed image more similar to the original image. In addition, to better measure the relationship between the two modalities of data (i.e., fMRI and natural images), we leverage a pairwise ranking loss that ranks stimulus image–fMRI pairs so that strongly associated pairs appear at the top and weakly related ones at the bottom. Experimental results on real-world data sets suggest that the proposed TIGAN achieves better performance than several state-of-the-art image reconstruction approaches.
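The abstract does not spell out the form of the pairwise ranking loss. As a point of reference only, a minimal sketch of a standard margin-based bidirectional ranking loss over paired image/fMRI embeddings is given below; the function name, the cosine-similarity scoring, and the margin value are illustrative assumptions and may differ from the paper's exact formulation.

```python
import torch
import torch.nn.functional as F

def pairwise_ranking_loss(img_emb, fmri_emb, margin=0.2):
    """Illustrative margin-based pairwise ranking loss (not the paper's
    exact formulation) over a batch of paired (stimulus image, fMRI)
    embeddings.

    img_emb, fmri_emb: (batch, dim) tensors; row i of each tensor
    corresponds to the same stimulus. Matched (diagonal) pairs are
    pushed to score higher than mismatched pairs by at least `margin`.
    """
    # Cosine similarity between every image and every fMRI sample.
    img_emb = F.normalize(img_emb, dim=1)
    fmri_emb = F.normalize(fmri_emb, dim=1)
    scores = img_emb @ fmri_emb.t()               # (batch, batch)

    pos = scores.diag().view(-1, 1)               # matched-pair scores

    # Hinge terms: mismatched pairs should trail matched pairs by `margin`.
    cost_img = (margin + scores - pos).clamp(min=0)       # rank fMRI per image
    cost_fmri = (margin + scores - pos.t()).clamp(min=0)  # rank images per fMRI

    # Zero out the diagonal so matched pairs are not penalized.
    mask = torch.eye(scores.size(0), device=scores.device).bool()
    cost_img = cost_img.masked_fill(mask, 0)
    cost_fmri = cost_fmri.masked_fill(mask, 0)

    return cost_img.mean() + cost_fmri.mean()
```

A loss of this kind pushes strongly associated image–fMRI pairs toward the top of the ranking and weakly related pairs toward the bottom, which matches the role the abstract assigns to the pairwise ranking term.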
