The evolution of galaxies depends on their masses and local environments; understanding when and how environmental quenching starts to operate remains a challenge. Furthermore, studies of the high-redshift regime have been limited to massive cluster members, owing to sensitivity limits or small fields of view when the sensitivity is sufficient, intrinsically biasing the picture of cluster evolution. In this work, we use stacking to investigate the average star formation history of more than 10,000 groups and clusters drawn from the Massive and Distant Clusters of WISE Survey 2. Our analysis covers near-ultraviolet to far-infrared wavelengths, for galaxy overdensities at 0.5 ≲ z ≲ 2.54. We employ spectral energy distribution fitting to measure the specific star formation rates (sSFRs) in four annular apertures with radii between 0 and 1000 kpc. At z ≳ 1.6, the average sSFR evolves similarly to the field in both the core and the cluster outskirts. Between z̄ = 1.60 and z̄ = 1.35, the sSFR in the core drops sharply, and it continues to fall relative to the field sSFR at lower redshifts. We interpret this change as evidence that the impact of environmental quenching dramatically increases at z ∼ 1.5, with the short time span of the transition suggesting that the environmental quenching mechanism dominant at this redshift operates on a rapid timescale. We find indications that the sSFR may decrease with increasing host halo mass, but lower-scatter mass tracers than the signal-to-noise ratio are needed to confirm this relationship.