Seasonal variation in systemic immunity has been reported. This study evaluated whether seasonality affects the efficacy of anticancer immunotherapy. A total of 604 patients with lung cancer receiving single-agent anti-programmed cell death (ligand) 1 (anti-PD-[L]1) inhibitors in two prospective observational cohorts were screened. The primary outcomes were progression-free survival (PFS) and overall survival (OS). Patients were classified into two groups according to the season in which treatment started: winter (November-February) and other seasons (March-October). Kaplan-Meier analyses were performed and Cox proportional hazards models were fitted to evaluate the impact of seasonality on survival. Propensity score matching was performed for validation. A total of 484 patients with advanced non-small cell lung cancer were included. In the unmatched population, multivariable analysis showed that the winter group (n=173) had a significantly lower risk of progression or death on immunotherapy than the other-seasons group (n=311) (PFS: hazard ratio [HR], 0.77 [95% confidence interval (CI), 0.62-0.96]; p=.018; OS: HR, 0.77 [95% CI, 0.1-0.98]; p=.032). In the propensity score-matched population, the winter group (n=162) had significantly longer median PFS than the other-seasons group (n=162) (2.8 months [95% CI, 1.9-4.1 months] vs. 2.0 months [95% CI, 1.4-2.7 months]; p=.009). Median OS was also significantly longer in the winter group (13.4 months [95% CI, 10.2-18.0 months] vs. 8.0 months [95% CI, 3.6-8.7 months]; p=.012). The trend toward longer survival in the winter group persisted across subgroup analyses. Starting an anti-PD-(L)1 inhibitor in winter was associated with better treatment outcomes in patients with lung cancer than starting in other seasons.
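
For readers who want to reproduce this style of analysis, the following is a minimal sketch in Python (using lifelines and scikit-learn) of the core steps described above: grouping patients by treatment-start season, fitting a multivariable Cox proportional hazards model, and validating with 1:1 nearest-neighbor propensity score matching followed by Kaplan-Meier estimation and a log-rank test. The data here are synthetic, and the column names (start_month, pfs_months, progressed, age, ecog) and covariate set are illustrative assumptions, not the study's actual variables or matching specification.

```python
# Illustrative sketch of the analysis pipeline on synthetic data.
# Column names, covariates, and matching settings are assumptions.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter, KaplanMeierFitter
from lifelines.statistics import logrank_test
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)
n = 484
df = pd.DataFrame({
    "start_month": rng.integers(1, 13, n),   # month treatment started
    "age": rng.normal(65, 8, n),             # hypothetical covariates
    "ecog": rng.integers(0, 3, n),
    "pfs_months": rng.exponential(4, n),     # time to progression/censoring
    "progressed": rng.integers(0, 2, n),     # event indicator
})
# Winter = November-February; other seasons = March-October.
df["winter"] = df["start_month"].isin([11, 12, 1, 2]).astype(int)

# Multivariable Cox proportional hazards model for PFS.
cph = CoxPHFitter()
cph.fit(df[["winter", "age", "ecog", "pfs_months", "progressed"]],
        duration_col="pfs_months", event_col="progressed")
print(cph.summary.loc["winter", ["exp(coef)", "p"]])  # HR and p-value

# Propensity scores: model P(winter | covariates), then 1:1
# nearest-neighbor matching on the score (with replacement, for brevity).
ps_model = LogisticRegression().fit(df[["age", "ecog"]], df["winter"])
df["ps"] = ps_model.predict_proba(df[["age", "ecog"]])[:, 1]
winter, other = df[df["winter"] == 1], df[df["winter"] == 0]
nn = NearestNeighbors(n_neighbors=1).fit(other[["ps"]])
_, idx = nn.kneighbors(winter[["ps"]])
matched = pd.concat([winter, other.iloc[idx.ravel()]])

# Kaplan-Meier medians and log-rank test in the matched population.
m_w = matched[matched["winter"] == 1]
m_o = matched[matched["winter"] == 0]
km_w = KaplanMeierFitter().fit(m_w["pfs_months"], m_w["progressed"])
km_o = KaplanMeierFitter().fit(m_o["pfs_months"], m_o["progressed"])
print(km_w.median_survival_time_, km_o.median_survival_time_)
print(logrank_test(m_w["pfs_months"], m_o["pfs_months"],
                   event_observed_A=m_w["progressed"],
                   event_observed_B=m_o["progressed"]).p_value)
```

A full analysis would additionally handle censoring for OS, match without replacement within a caliper, and check covariate balance after matching; this sketch only traces the sequence of steps reported in the abstract.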