According to the general theory of magnetic resonance, the integrated intensity of a resonance line depends on the total number of spins participating in the resonance and on temperature; the temperature dependence of the line intensity should therefore track the temperature variation of the static magnetic susceptibility. Because the integrated intensity is fixed by the susceptibility, a decay of the resonance amplitude is usually caused by line broadening. In some cases, however, the line decays without visible broadening, and the temperature dependence of the line amplitude does not correlate with the temperature changes of the susceptibility [1, 2]. Such behavior can originate from a strongly nonuniform distribution of the relaxation times responsible for line broadening. When the relaxation rates of localized spins differ by orders of magnitude, owing to the variety of their local configurations, the resonances of the fastest-relaxing centers are so broad that they no longer contribute to the observed line amplitude. In this paper we analyze the decay of the spectral line. We show that, for a strongly nonuniform distribution of relaxation rates, the line intensity, scaled as the product of the line amplitude and the square of the peak-to-peak linewidth, is determined by the integral over the rate distribution, taken from the lowest rates up to the value corresponding to the native relaxation rate that originates from another relaxation mechanism. We also analyze the consequences of a nonuniform distribution of the longitudinal relaxation mechanisms, showing that mechanisms of this type manifest themselves in a characteristic dependence of the line amplitude on the microwave power.
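The scaling used above, intensity ∝ (peak-to-peak amplitude) × (peak-to-peak linewidth)², can be checked numerically for a derivative Lorentzian line, the shape recorded in a field-modulated resonance experiment. The sketch below is illustrative only (the function name, field grid, and linewidth values are arbitrary choices, not from the paper): it verifies that A_pp·ΔH_pp² divided by the true double-integrated intensity stays constant as the linewidth varies, equal to √3/π for a Lorentzian.

```python
import numpy as np

def lorentzian_derivative(H, H0, gamma):
    """Field derivative of a unit-area Lorentzian absorption line."""
    x = H - H0
    return -2.0 * gamma * x / (np.pi * (x**2 + gamma**2) ** 2)

H = np.linspace(-1000.0, 1000.0, 2_000_001)    # field grid (arbitrary units)
dx = H[1] - H[0]

ratios = {}
for gamma in (0.5, 1.0, 2.0):                  # half-width at half-maximum
    dchi = lorentzian_derivative(H, 0.0, gamma)
    i_max, i_min = np.argmax(dchi), np.argmin(dchi)
    app = dchi[i_max] - dchi[i_min]            # peak-to-peak amplitude
    dpp = abs(H[i_min] - H[i_max])             # peak-to-peak linewidth
    # True integrated intensity: double integration of the derivative signal.
    absorption = np.cumsum(dchi) * dx          # first integration -> absorption
    intensity = np.sum(absorption) * dx        # second integration -> area
    ratios[gamma] = app * dpp**2 / intensity
    print(f"gamma={gamma}: App*dHpp^2 / I = {ratios[gamma]:.4f}")

print(f"sqrt(3)/pi = {np.sqrt(3) / np.pi:.4f}")  # expected constant for a Lorentzian
```

The ratio is independent of the width, which is why the amplitude-times-squared-width product can serve as a proxy for the integrated intensity when only part of the spin system contributes to the observed line.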