Abstract

The decoupling theorem of quantum field theory is proved in Minkowski space. It states the vanishing property, in the distributional sense, of the renormalized Feynman amplitudes when any subset of the underlying masses is scaled to infinity. In the course of the proof we bound the corresponding integrals, in the ε → +0 limit, by similar Euclidean integrals. All renormalization subtractions are assumed to be carried out at the origin of momentum space, with the degree of divergence of each subtraction coinciding with the dimensionality of the corresponding subdiagram.
