One of the major challenges for the design of future thermonuclear reactors is the problem of power exhaust: the removal of the heat fluxes deposited by plasma particles onto the plasma-facing components (PFCs) of the reactor wall. For the reactor to operate efficiently, the power loading of the PFCs has to stay within their material limits. A substantial part of these heat fluxes can be deposited transiently during the impact of edge localised modes (ELMs), which typically accompany the high confinement mode, a regime foreseen for the ITER tokamak and next-step devices. One possible way to mitigate the deposition of localised heat fluxes during ELMs is the injection of impurities, which, similarly to inter-ELM detachment, could dissipate part of the energy carried by plasma particles, the so-called ELM buffering effect. In this contribution, we report on experimental observations in impurity-seeded discharges in ASDEX Upgrade, where injection of argon is capable of reducing the ELM energy by up to 80% (60% without degradation of confinement). A simple model of ELM cooling is in some cases capable of providing a quantitative prediction of this effect. The ELM peak energy fluence was reduced by a factor of 8 without a degradation of the pedestal pressure. Should such mitigation be achieved in ITER, the resulting power loading would satisfy the material limits of the divertor tungsten monoblocks (Eich et al 2017 Nucl. Mater. Energy 12 84–90) and thus avoid the risk of their melting. The most favourable results in terms of confinement and divertor heat flux mitigation were achieved by using a mixture of argon and nitrogen, where the latter impurity helped to improve the confinement. The ELM frequency was identified as a scaling factor in discharges with impurity seeding, suggesting that high-frequency ELMs are favourable for future devices.
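As a rough illustration of why high ELM frequency is favourable (a generic order-of-magnitude sketch, not the model used in the paper; the symbols below are introduced here for illustration only): if ELMs carry an approximately fixed fraction $f_{\mathrm{loss}}$ of the power crossing the separatrix $P_{\mathrm{sep}}$, then the energy released per ELM, and hence the peak energy fluence on a wetted area $A_{\mathrm{wet}}$, scales inversely with the ELM frequency $f_{\mathrm{ELM}}$:
\[
\Delta W_{\mathrm{ELM}} \approx \frac{f_{\mathrm{loss}}\, P_{\mathrm{sep}}}{f_{\mathrm{ELM}}},
\qquad
\varepsilon_{\mathrm{ELM}} \approx \frac{\Delta W_{\mathrm{ELM}}}{A_{\mathrm{wet}}} \propto \frac{1}{f_{\mathrm{ELM}}},
\]
so that, at fixed exhausted power, more frequent ELMs deposit less energy per event.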