In recent years, accurate recognition of traffic scenes has played a key role in autonomous vehicle operation. However, most works in this area do not address the domain shift problem, in which classification performance degrades when the distributions of the source and target images differ due to weather changes. Moreover, the lack of sparsity in current models leads to high sample complexity and overfitting. To mitigate these challenges, this paper proposes a novel Sparse Adversarial Domain Adaptation (SADA) model for traffic scene classification. Our objective is to learn a scene classifier on a source domain with sunny weather and transfer its knowledge to a target domain with different weather conditions (i.e., cloudy, rainy, and snowy) to enhance classification performance on the target. First, a sparse representation of the source traffic scenes is learned via nonlinear dictionary learning in the latent space of a deep classifier. Then, a conditional generative adversarial network is devised to capture the distribution of the source sparse codes. Finally, a domain-invariant sparse feature extractor is developed via a minimax game that aligns the sparse codes of the target-domain images with those of the source, thereby providing domain adaptation for traffic scene classification with a deep neural network. Experimental results on a real-world dataset collected by the Honda Research Institute (HRI) demonstrate the superior performance of the proposed SADA compared with current traffic scene classification methods and state-of-the-art domain adaptation frameworks.
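As an illustrative sketch only (not the authors' implementation), the sparse-coding step in the pipeline above can be approximated with the classical ISTA algorithm, which alternates a gradient step on the reconstruction error with soft-thresholding to enforce an l1 sparsity penalty. The dictionary `D` and latent features `X` below are random placeholders standing in for the learned dictionary and the deep classifier's latent representations.

```python
import numpy as np

def ista_sparse_codes(X, D, lam=0.1, n_iter=100):
    """Sparse codes A minimizing 0.5*||X - D A||^2 + lam*||A||_1 via ISTA.

    X: (d, n) feature matrix (columns are samples), D: (d, k) dictionary.
    """
    L = np.linalg.norm(D, ord=2) ** 2        # Lipschitz constant of the gradient
    A = np.zeros((D.shape[1], X.shape[1]))   # start from the all-zero code
    for _ in range(n_iter):
        grad = D.T @ (D @ A - X)             # gradient of the reconstruction term
        A = A - grad / L                     # gradient descent step
        # Soft-thresholding: proximal operator of the l1 penalty, yields exact zeros
        A = np.sign(A) * np.maximum(np.abs(A) - lam / L, 0.0)
    return A

# Toy data: unit-norm random dictionary and features built from 3 active atoms
rng = np.random.default_rng(0)
D = rng.standard_normal((8, 16))
D /= np.linalg.norm(D, axis=0)
true_codes = np.zeros((16, 5))
true_codes[rng.integers(0, 16, 3), :] = 1.0
X = D @ true_codes

A = ista_sparse_codes(X, D, lam=0.05, n_iter=200)
```

Because the soft-thresholding step sets small coefficients exactly to zero, the recovered codes are genuinely sparse, which is the property the abstract argues reduces sample complexity and overfitting.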