Point cloud representations of real-world objects have attracted surging interest recently, with widespread applications in augmented reality, virtual reality, and autonomous driving. However, acquiring, compressing, transmitting, and processing point cloud data often introduces unwanted distortions, motivating the need for effective no-reference quality assessment methods for distorted point clouds. In this paper, we propose a Wavelet Point Cloud Transformer (Wave-PCT) to assess the quality of point clouds without reference data. We first perform patch sampling on a source point cloud, generating an extensive set of patches. These patches and their 2D projection images are processed with wavelet transforms to produce multiscale local spectral features; in addition, we extract global features of the patches using PointNet++. Finally, a self-attention module fuses these local and global features into a distortion-sensitive quality score. We evaluate Wave-PCT on several point cloud quality assessment datasets using various metrics, and the experimental results show that the proposed framework achieves outstanding and reliable performance compared with existing point cloud quality assessment methods.
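The multiscale local spectral features mentioned above can be illustrated with a small sketch. This is a hypothetical example, not the paper's actual feature extractor: it applies a simple 2D Haar wavelet decomposition (the paper does not specify its wavelet basis here) to a projection image of a patch and pools per-subband statistics into a feature vector.

```python
import numpy as np

def haar2d_step(x):
    """One level of a 2D Haar wavelet transform on an even-sized image.

    Returns the approximation (LL) subband and the three detail
    subbands (LH, HL, HH).
    """
    a = x[0::2, 0::2]
    b = x[0::2, 1::2]
    c = x[1::2, 0::2]
    d = x[1::2, 1::2]
    ll = (a + b + c + d) / 4.0
    lh = (a - b + c - d) / 4.0
    hl = (a + b - c - d) / 4.0
    hh = (a - b - c + d) / 4.0
    return ll, (lh, hl, hh)

def wavelet_spectral_features(projection, levels=3):
    """Pool mean/std of subband magnitudes at each scale into one vector.

    Illustrative stand-in for Wave-PCT's local spectral features; the
    actual pooling and wavelet choice are assumptions.
    """
    feats = []
    ll = np.asarray(projection, dtype=np.float64)
    for _ in range(levels):
        ll, details = haar2d_step(ll)
        for band in details:
            mag = np.abs(band)
            feats.extend([mag.mean(), mag.std()])
    # Coarsest approximation subband statistics.
    feats.extend([ll.mean(), ll.std()])
    return np.asarray(feats, dtype=np.float32)

# Example: a synthetic 64x64 "depth projection" of a patch.
rng = np.random.default_rng(0)
proj = rng.random((64, 64))
f = wavelet_spectral_features(proj)  # 3 levels x 3 bands x 2 stats + 2 = 20 features
```

In a full pipeline, such per-patch spectral vectors would be concatenated with global PointNet++ features before the self-attention fusion step.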