Graph Contrastive Learning (GCL), which learns node or graph representations from supervision signals derived from the graph data itself, has recently attracted extensive research attention and achieved great success. Remarkably, most existing GCL encoders essentially perform low-frequency filtering on the graph, which limits their expressive power on heterophilic graphs, where dissimilar nodes tend to be connected. This raises an interesting question: can high-frequency information be informative for GCL? In this work, we experimentally study the influence of high-frequency signals on GCL and find that incorporating high-frequency signals into the contrast improves GCL performance. This motivates us to design a more general GCL framework that goes beyond low-pass filtering by simultaneously performing low-pass and high-pass signal contrasts, so as to capture both low- and high-frequency information in general graphs. Furthermore, to make representation learning aware of neighbor diversity in heterophilic graphs, we propose a novel graph contrastive loss, termed Adap-infoNCE, which automatically determines the weights of negative samples based on the feature representations of neighboring nodes. Two types of neighbors are considered, namely spatial neighbors and featural neighbors, whose effectiveness is verified through empirical studies on synthetic datasets. Extensive experiments demonstrate that our method brings significant and consistent improvements over the base GCL approach and exceeds multiple state-of-the-art results on several unsupervised benchmarks, even surpassing the performance of supervised baselines.
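To make the two ingredients above concrete, the following minimal sketch illustrates (i) low-pass versus high-pass graph filtering and (ii) an InfoNCE loss with per-negative weights. This is not the paper's implementation: the specific filter forms (normalized adjacency and its Laplacian-style complement) and the uniform negative weights in the toy call are assumptions standing in for the actual Adap-infoNCE weighting rule.

```python
# Illustrative sketch only: filter choices and the negative-weighting rule
# are assumptions, not the paper's Adap-infoNCE definition.
import numpy as np

def normalized_adjacency(A):
    """Symmetrically normalized adjacency with self-loops: D^{-1/2} (A + I) D^{-1/2}."""
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt

def low_pass(A, X):
    """Smooths features over neighbors (keeps low-frequency components)."""
    return normalized_adjacency(A) @ X

def high_pass(A, X):
    """Identity minus the normalized adjacency: emphasizes differences
    between a node and its neighbors (keeps high-frequency components)."""
    n = A.shape[0]
    return (np.eye(n) - normalized_adjacency(A)) @ X

def weighted_infonce(z_anchor, z_pos, z_neg, neg_weights, tau=0.5):
    """InfoNCE for a single anchor, with per-negative weights.

    z_anchor:    (d,)    anchor embedding
    z_pos:       (d,)    positive embedding (the other view of the anchor)
    z_neg:       (k, d)  negative embeddings
    neg_weights: (k,)    weights on negatives; in the paper these would be
                         decided from neighboring nodes' representations
                         (here they are left as a user-supplied assumption)
    """
    def cos(a, b):
        return a @ b.T / (np.linalg.norm(a) * np.linalg.norm(b, axis=-1) + 1e-12)

    pos = np.exp(cos(z_anchor, z_pos[None, :]) / tau)[0]
    neg = np.exp(cos(z_anchor, z_neg) / tau)
    return -np.log(pos / (pos + (neg_weights * neg).sum()))

# Toy usage on a 4-node path graph with 2-dimensional features.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = np.random.randn(4, 2)
Z_low, Z_high = low_pass(A, X), high_pass(A, X)      # two filtered views
loss = weighted_infonce(Z_low[0], Z_high[0], Z_low[1:], np.ones(3))
```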