Context. Explaining the magnetic fields currently observed in galaxies requires relatively strong seeding in the early Universe. One proposed scenario is that magnetic seeds of the order of μG were expelled into the interstellar medium by supernova (SN) explosions, after primordial fields of nG strength or weaker had been amplified in stellar interiors.
Aims. In this work, we take a closer look at this scenario and calculate the maximum magnetic energy that can be injected into the interstellar medium by a stellar cluster of mass M_cl, based on what is currently known about stellar magnetism.
Methods. We consider early-type stars and adopt either a Salpeter or a top-heavy initial mass function. For their magnetic fields, we adopt either a Gaussian or a bimodal distribution. The Gaussian model assumes that all massive stars are magnetized with 10³ G < ⟨B*⟩ < 10⁴ G, while the bimodal model, consistent with observations of Milky Way stars, assumes that only 5−10% of OB stars have 10³ G < ⟨B*⟩ < 10⁴ G, whereas the rest have 10 G < ⟨B*⟩ < 10² G. We neglect magnetic diffusion and assume no losses of magnetic energy.
Results. We find that the maximum magnetic energy that can be injected by a stellar population is between 10⁻¹⁰ and 10⁻⁷ times the total SN energy. The upper end of this range is about five orders of magnitude lower than what is usually employed in cosmological simulations, where about 10⁻² of the SN energy is injected as magnetic energy.
Conclusions. Pure advection of the stellar magnetic field by SN explosions is a good candidate for seeding a dynamo, but it is not sufficient on its own to magnetize galaxies. If SNe are the main mechanism for galactic magnetization, the magnetic field cannot exceed an intensity of 10⁻⁷ G in the best-case scenario for a population of 10⁵ solar masses in a superbubble of 300 pc radius, while more typical values are between 10⁻¹⁰ and 10⁻⁹ G. Therefore, other scenarios for galactic magnetization at high redshift need to be explored.
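As a rough plausibility check of the best-case bound quoted in the Conclusions, the minimal Python sketch below spreads the injected magnetic energy uniformly over the superbubble volume and converts the energy density u_B = B²/(8π) into a field strength. The cluster mass (10⁵ M_⊙), the superbubble radius (300 pc), and the magnetic-to-SN energy fraction (10⁻⁷) are taken from the abstract; the SN rate of roughly one per 100 M_⊙ formed (Salpeter-like IMF) and the canonical SN energy of 10⁵¹ erg are assumptions introduced here for illustration only.

```python
import math

# Quoted in the abstract (best-case scenario):
M_CL_MSUN = 1e5          # stellar population mass [M_sun]
F_MAG = 1e-7             # upper end of the magnetic-to-SN energy fraction
R_PC = 300.0             # superbubble radius [pc]

# Assumptions for this sketch (not specified in the abstract):
SN_PER_MSUN = 1.0 / 100  # ~1 core-collapse SN per 100 M_sun formed
E_SN_ERG = 1e51          # canonical SN explosion energy [erg]
PC_CM = 3.086e18         # parsec in cm

# Total magnetic energy injected by the population
n_sn = M_CL_MSUN * SN_PER_MSUN
e_mag = F_MAG * n_sn * E_SN_ERG                      # [erg]

# Spread uniformly over the superbubble: u_B = B^2 / (8 pi)
volume = 4.0 / 3.0 * math.pi * (R_PC * PC_CM) ** 3   # [cm^3]
b_gauss = math.sqrt(8.0 * math.pi * e_mag / volume)  # [G]

print(f"N_SN ~ {n_sn:.0f}, E_mag ~ {e_mag:.1e} erg, B ~ {b_gauss:.1e} G")
# Gives B of order a few 1e-8 G, consistent with the <~1e-7 G upper limit.
```

Under these assumed inputs the estimate lands at a few times 10⁻⁸ G, i.e. below the 10⁻⁷ G ceiling stated above; lowering the energy fraction to the more typical 10⁻¹⁰–10⁻⁹ range reproduces the 10⁻¹⁰–10⁻⁹ G fields quoted as typical values.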