We propose a test to distinguish a weakly dependent time series with a trend component from a long-memory process, possibly also with a trend. The test uses a generalized likelihood ratio statistic based on wavelet-domain likelihoods. The trend is assumed to be a polynomial whose order does not exceed a known value; the test is also robust to piecewise-polynomial trends. We study the empirical size and power by means of simulations and find that both are satisfactory and do not depend on the specific choice of wavelet function or of model for the wavelet coefficients. Applied to the annual minima of the Nile River, the test confirms the presence of long-range dependence in this time series.
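To make the construction concrete, the sketch below illustrates one way a wavelet-domain GLR statistic of this kind could be computed. It is a minimal illustration under stated assumptions, not the paper's procedure: the i.i.d. Gaussian scaling model N(0, σ²·2^{2jd}) for the level-j detail coefficients, the chi-square boundary calibration, and the function name `wavelet_glr_test` are all assumptions introduced here. The trend robustness rests on the wavelet's vanishing moments: `db4` has four, so detail coefficients annihilate polynomial trends up to degree three.

```python
# A hedged sketch of a wavelet-domain generalized likelihood ratio (GLR) test
# for long memory; the scaling model and calibration below are assumptions.
import numpy as np
import pywt
from scipy.optimize import minimize_scalar
from scipy.stats import chi2

def wavelet_glr_test(x, wavelet="db4", level=None):
    """Test H0: d = 0 (short memory) vs H1: d > 0 (long memory), assuming
    detail coefficients at level j are i.i.d. N(0, sigma^2 * 2^{2jd})
    (j = 1 finest, ..., J coarsest)."""
    coeffs = pywt.wavedec(np.asarray(x, dtype=float), wavelet, level=level)
    details = coeffs[1:]          # coeffs[0] is the approximation; drop it
    J = len(details)
    levels = np.arange(J, 0, -1)  # wavedec lists the coarsest detail first
    S = np.array([np.sum(d**2) for d in details])  # per-level energies
    n = np.array([d.size for d in details])        # per-level counts
    N = n.sum()

    def neg_profile_loglik(d):
        # Profile out sigma^2: sigma_hat^2(d) = (1/N) * sum_j 2^{-2jd} S_j
        sigma2 = np.sum(2.0 ** (-2.0 * d * levels) * S) / N
        return 0.5 * N * np.log(sigma2) + np.log(2.0) * d * np.sum(n * levels)

    res = minimize_scalar(neg_profile_loglik, bounds=(0.0, 0.499),
                          method="bounded")
    glr = 2.0 * (neg_profile_loglik(0.0) - res.fun)  # 2*(loglik_H1 - loglik_H0)
    # d = 0 lies on the boundary, so the null distribution is taken here as a
    # 50:50 mixture of a point mass at 0 and chi^2_1 (an assumption, not the
    # paper's calibration).
    pvalue = 0.5 * chi2.sf(glr, df=1) if glr > 0 else 1.0
    return res.x, glr, pvalue

# Usage: white noise is short-memory, so the test should rarely reject here.
rng = np.random.default_rng(0)
d_hat, glr, p = wavelet_glr_test(rng.standard_normal(1024))
print(f"d_hat = {d_hat:.3f}, GLR = {glr:.2f}, p = {p:.3f}")
```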