This paper proposes a method for discriminating between normal apples and watercore apples using visible/near-infrared spectroscopy, combining Gramian Angular Field (GAF) encoding with the ConvNeXt deep learning network. Existing feature extraction methods for visible/near-infrared spectral data do not mine deeper information from the extracted features, so the quality of the resulting model is entirely determined by those features; moreover, building a classification model for visible/near-infrared spectral data is complex and time-consuming, and the accuracy of the resulting models is often low. To address these issues, the experimental visible/near-infrared spectral data of apples were first transformed into two-dimensional images using Gramian Angular Summation Fields (GASF) and Gramian Angular Difference Fields (GADF) at sizes of 64, 128, 256, and 512. These images were then input into the ConvNeXt network, and the performance of the different encoding methods and sizes was compared. The results showed that, under the conditions of this paper, GADF encoding at a size of 256 achieved the highest classification accuracy, 98.48%. For comparison, ResNet, EfficientNet, and RegNet deep learning networks were also used to classify the encoded images under the same conditions. These results indicate that the watercore discrimination method based on GAF encoding and the ConvNeXt network, combined with visible/near-infrared spectroscopy, can mine deep information from visible/near-infrared spectral data and provides a relatively simple way to establish qualitative classification models for such data. The method achieves an excellent discrimination effect between normal apples and watercore apples.
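The GASF/GADF transform described above can be sketched as follows. This is a minimal illustration, not the paper's exact pipeline: the resampling step (linear interpolation to the target image size) and the min-max rescaling to [-1, 1] are common choices assumed here, since the abstract does not specify the preprocessing details.

```python
import numpy as np

def gramian_angular_field(spectrum, size=256, method="gadf"):
    """Encode a 1-D spectrum as a size x size Gramian Angular Field image.

    Sketch under assumed preprocessing: resample, rescale to [-1, 1],
    map to polar angles, then form the summation or difference field.
    """
    x = np.asarray(spectrum, dtype=float)
    # Resample the spectrum to the target image size (assumed: linear interpolation).
    x = np.interp(np.linspace(0.0, len(x) - 1.0, size), np.arange(len(x)), x)
    # Min-max rescale to [-1, 1] so the arccos below is well defined.
    x = 2.0 * (x - x.min()) / (x.max() - x.min()) - 1.0
    phi = np.arccos(np.clip(x, -1.0, 1.0))  # polar-angle representation
    if method == "gasf":
        return np.cos(phi[:, None] + phi[None, :])  # summation field (symmetric)
    return np.sin(phi[:, None] - phi[None, :])      # difference field (antisymmetric)

# Example: encode a synthetic 1000-point "spectrum" as a 256 x 256 GADF image.
img = gramian_angular_field(np.sin(np.linspace(0.0, 6.0, 1000)), size=256, method="gadf")
print(img.shape)  # (256, 256)
```

The resulting 2-D array can then be saved or stacked as an image tensor and fed to a CNN such as ConvNeXt; GASF is symmetric about the diagonal while GADF is antisymmetric, which is why the two encodings can carry different information.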