With the development of Convolutional Neural Networks (CNNs), demand for their deployment on edge devices is growing. At the same time, Compute-In-Memory (CIM) technology has gained significant attention in edge CNN applications because it minimizes data movement between memory and computing units. However, deploying complex deep neural network models on edge devices with restricted hardware resources remains challenged by insufficient storage for intermediate-layer data. In this article, we propose an optimized JPEG Lossless Compression (JPEG-LS) algorithm that combines serial context-parameter updating with parallel encoding. This method is designed for the global prediction and efficient compression of intermediate-layer data in neural networks employing CIM techniques. The results show average compression ratios of 6.44× for VGG16, 3.62× for ResNet34, 1.67× for MobileNetV2, and 2.31× for InceptionV3. Moreover, the implementation achieves a data throughput of 32 bits per cycle at 600 MHz on a TSMC 28 nm process, at a hardware cost of 122 K gates.
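To make the prediction step concrete: standard JPEG-LS predicts each sample from its left, above, and upper-left neighbors using the median edge detector (MED), and encodes only the residual. The sketch below illustrates that baseline predictor in Python; it is a minimal illustration of standard JPEG-LS behavior, not the optimized serial/parallel scheme proposed in this article, and the zero-padding of border neighbors is a simplifying assumption rather than the standard's exact border rule.

```python
def med_predict(a, b, c):
    """JPEG-LS median edge detector (MED).
    a = left neighbor, b = above neighbor, c = upper-left neighbor."""
    if c >= max(a, b):
        return min(a, b)      # horizontal or vertical edge above/left
    elif c <= min(a, b):
        return max(a, b)
    else:
        return a + b - c      # smooth region: planar prediction

def residuals(img):
    """Compute MED prediction residuals for a 2-D list of samples.
    Out-of-image neighbors are taken as 0 (a simplification)."""
    h, w = len(img), len(img[0])
    res = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            a = img[y][x - 1] if x > 0 else 0
            b = img[y - 1][x] if y > 0 else 0
            c = img[y - 1][x - 1] if (x > 0 and y > 0) else 0
            res[y][x] = img[y][x] - med_predict(a, b, c)
    return res
```

On smooth feature maps the residuals concentrate near zero, which is what makes the subsequent context modeling and entropy coding effective; for example, a constant 2×2 block yields residuals that are all zero except the first sample.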