One-dimensional time series classification has long been an important research direction, playing an irreplaceable role in many fields. With the rapid development of deep learning, most traditional time series classification methods have gradually been replaced by neural-network-based approaches. At present, the best-performing models in time series classification are mostly based on Convolutional Neural Networks (CNNs), whereas Transformer-based models, despite their outstanding performance in natural language processing and computer vision, have not stood out. To change this situation, we investigate the feasibility of training models based on a single Transformer architecture directly on small datasets without additional data processing. Furthermore, we propose a new model, CTCTime, which combines the Transformer architecture with CNNs to address one-dimensional time series classification. We compare CTCTime with 13 traditional algorithms on 44 datasets from the UCR archive, a classic benchmark for one-dimensional time series classification, and with 7 advanced methods on 85 datasets. The experimental results demonstrate the feasibility, accuracy, and scalability of our approach.