Abstract

Flux vortices are easily pinned by defects in superconducting radio-frequency (SRF) cavities, so trapped magnetic flux increases the surface resistance of the cavity. It has been experimentally demonstrated that trapped magnetic flux can be expelled by cooling a superconducting cavity through its transition under a large spatial temperature gradient. In this paper, we use time-dependent Ginzburg–Landau (TDGL) theory to investigate theoretically how defects affect the flux-expulsion ability of pure niobium superconducting cavities under an applied spatial temperature gradient. The residual vortex density shows a pronounced stepwise decrease with increasing temperature gradient at small residual fields, and a stepwise increase with increasing defect size. Both a larger number of defects and a larger defect size make vortices harder to expel. Moreover, a longer cooling time aids vortex expulsion, although beyond a certain duration further increases in cooling time have no significant effect. These results provide theoretical guidelines for improving the performance of superconducting cavities in practical situations.
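For reference, the TDGL equations mentioned above are commonly written in the following standard dimensionless form, with order parameter $\psi$, vector potential $\mathbf{A}$, scalar potential $\varphi$, Ginzburg–Landau parameter $\kappa$, and phenomenological constants $u$ and $\sigma$. This is a generic textbook form given as a sketch; the paper's exact normalization, temperature dependence, and boundary conditions may differ.

```latex
% Standard dimensionless TDGL equations (generic form; the paper's
% normalization and boundary conditions may differ).
u\left(\frac{\partial \psi}{\partial t} + i\kappa\varphi\,\psi\right)
  = -\left(\frac{i}{\kappa}\nabla + \mathbf{A}\right)^{2}\psi
    + \left(1 - |\psi|^{2}\right)\psi,
\qquad
\sigma\left(\frac{\partial \mathbf{A}}{\partial t} + \nabla\varphi\right)
  = \frac{1}{2i\kappa}\left(\psi^{*}\nabla\psi - \psi\nabla\psi^{*}\right)
    - |\psi|^{2}\mathbf{A} - \nabla\times\nabla\times\mathbf{A}.
```

In simulations of flux expulsion, a spatial temperature gradient is typically introduced by making the coefficients (or the critical temperature entering them) position-dependent, and defects by locally suppressing the condensation term.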
