To reduce computational complexity, macro-energy system models commonly rely on reduced time-series data. For energy systems that depend on seasonal storage and are characterized by intermittent renewables such as wind and solar, the adequacy of time-series reduction is in question. Using a capacity expansion model, we evaluate different methods for creating reduced time series and implementing them in the model, assessing each with respect to loss of load and system costs. The results show that adequacy depends strongly on the length of the reduced time series and on how it is implemented in the model. Implementation as a chronological sequence with re-scaled time-steps prevents loss of load best, but it imposes a positive bias on seasonal storage, resulting in an overestimation of system costs. Compared to chronological sequences, grouped periods take longer to solve for the same number of time-steps because the approach requires additional variables and constraints. Overall, the results suggest that further efforts are needed to improve time-series reduction and other methods for reducing computational complexity.
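To make the two implementation strategies concrete, the following is a minimal sketch, not the paper's actual pipeline: it contrasts a chronological sequence with re-scaled time-steps against grouped representative periods, using synthetic hourly data and illustrative function names (`chronological_downsample`, `grouped_periods`) that are assumptions of this sketch, not terms from the model.

```python
# Illustrative sketch only: two common ways to reduce an hourly time series
# for a capacity expansion model. All names and parameters are assumptions.
import numpy as np

rng = np.random.default_rng(0)
hourly = rng.random(8760)  # stand-in for one year of hourly demand or capacity factors


def chronological_downsample(series: np.ndarray, step: int):
    """Chronological sequence with re-scaled time-steps: average every
    `step` consecutive hours into one model time-step and weight it by
    `step` hours. Chronology is preserved, so inter-seasonal storage
    dynamics can be modeled directly."""
    n = len(series) // step * step
    values = series[:n].reshape(-1, step).mean(axis=1)
    weights = np.full(values.shape, float(step))  # hours represented by each step
    return values, weights


def grouped_periods(series: np.ndarray, period_len: int, n_groups: int):
    """Grouped (representative) periods: cluster candidate periods and keep
    one representative per group, weighted by group size. Chronology across
    periods is lost, which is why models need additional variables and
    constraints to link storage levels between periods."""
    n = len(series) // period_len
    periods = series[: n * period_len].reshape(n, period_len)
    # Toy 1-D k-means on the mean level of each period (illustration only).
    keys = periods.mean(axis=1)
    centers = np.quantile(keys, np.linspace(0, 1, n_groups))
    for _ in range(20):
        labels = np.argmin(np.abs(keys[:, None] - centers[None, :]), axis=1)
        centers = np.array([keys[labels == g].mean() if np.any(labels == g)
                            else centers[g] for g in range(n_groups)])
    labels = np.argmin(np.abs(keys[:, None] - centers[None, :]), axis=1)
    nonempty = [g for g in range(n_groups) if np.any(labels == g)]
    reps = np.array([periods[labels == g].mean(axis=0) for g in nonempty])
    weights = np.array([np.sum(labels == g) for g in nonempty])  # periods per group
    return reps, weights


seq, w_seq = chronological_downsample(hourly, step=4)           # 2190 steps of 4 h
reps, w_grp = grouped_periods(hourly, period_len=24, n_groups=12)
print(seq.shape, w_seq.sum())   # (2190,) 8760.0 -- full year still covered
print(reps.shape, w_grp.sum())  # up to 12 representative days, 365 total
```

The sketch also illustrates the runtime trade-off noted above: the chronological sequence feeds directly into a model with ordinary storage-balance constraints, whereas the grouped-period output requires extra linking variables to track storage across the lost chronology.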