Many real-world time series exhibit significant short- and long-range temporal correlations. Such correlations inflate the errors of linear trend analysis. In this paper, we provide a general framework for trend analysis that accounts for such correlations. We propose a parsimonious model containing both a single short-range autoregressive parameter and a long-range fractional parameter. We derive analytical closed-form results for the error bars of the least-squares estimate of the trend for such time series, highlighting the distinct effects of short- and long-range correlations. We employ an ensemble method for the automated extraction of scaling regions to estimate the fractional parameter of the data model together with its error bar, and the Grünwald-Letnikov fractional derivative to identify the autoregressive parameter. We apply this framework to the study of warming trends in gridded temperature data over central Europe. We exploit the redundancy of the trend signal across adjacent grid points using spatial averaging and the first principal component of an empirical orthogonal function analysis, and find good agreement between the results of these two methods. We find a statistically significant decadal warming trend in central Europe over the past 70 years, which shows a particularly pronounced increase over the past 20 years.
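
As a rough illustration of the setting (not of the estimators derived in the paper), the following Python sketch simulates a series of ARFIMA(1, d, 0) type, which is one plausible reading of the parsimonious model described above, superimposes a linear trend, and fits the trend by ordinary least squares. The parameter values `phi`, `d`, and `slope` are arbitrary, and the printed error bar uses the naive i.i.d. formula whose failure under short- and long-range correlations motivates the closed-form results of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters: phi = short-range AR(1) parameter,
# d = long-range fractional parameter, slope = linear trend per time step.
n, phi, d, slope = 1000, 0.4, 0.2, 0.002

eps = rng.standard_normal(n)

# Fractional integration (1 - B)^(-d) via its binomial-expansion weights.
psi = np.empty(n)
psi[0] = 1.0
for k in range(1, n):
    psi[k] = psi[k - 1] * (k - 1 + d) / k
frac_noise = np.convolve(eps, psi)[:n]

# AR(1) filtering adds the short-range correlation.
y = np.empty(n)
y[0] = frac_noise[0]
for t in range(1, n):
    y[t] = phi * y[t - 1] + frac_noise[t]

t_axis = np.arange(n)
series = slope * t_axis + y  # linear trend + correlated noise

# Least-squares trend estimate.
b_hat, a_hat = np.polyfit(t_axis, series, 1)

# Naive OLS standard error of the slope (valid only for white noise);
# with short- and long-range correlations the true error bar is larger.
resid = series - (a_hat + b_hat * t_axis)
s2 = resid @ resid / (n - 2)
se_naive = np.sqrt(s2 / np.sum((t_axis - t_axis.mean()) ** 2))

print(f"estimated slope: {b_hat:.5f} +/- {se_naive:.5f} (naive i.i.d. error bar)")
```

Running the sketch with different values of `phi` and `d` shows the estimated slope scattering around the true value far more widely than the naive error bar suggests, which is the effect the analytical error bars in the paper are designed to capture.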