Abstract

In a conventional light water reactor loaded with a range of uranium- and plutonium-based fuel mixtures, the variation in antineutrino production over the cycle reflects both the initial core fissile inventory and its evolution. Under the assumption of constant thermal power, we calculate the rate at which antineutrinos are emitted from variously fueled cores, and the evolution of that rate as measured by a representative ton-scale antineutrino detector. We find that the antineutrino flux decreases with burnup for low-enriched uranium (LEU) cores, increases for full mixed-oxide (MOX) cores, and does not appreciably change for cores with a MOX fraction of approximately 75%. Accounting for uncertainties in the fission yields, in the emitted antineutrino spectra, and in the detector response function, we show that differences in core-wide MOX fraction at least as small as 8% can be distinguished using a hypothesis test. The test compares the evolution of the antineutrino rate relative to an initial value over part or all of the cycle. The use of relative rates reduces the sensitivity of the test to an independent thermal power measurement, making the result more robust against possible countermeasures. This rate-only approach also offers the potential advantage of reducing the cost and complexity of the antineutrino detectors used to verify the diversion, compared to methods that depend on the use of the antineutrino spectrum. A possible application is the verification of the disposition of surplus plutonium in nuclear reactors.
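The hypothesis test described above can be illustrated with a toy sketch. The model below is not the authors' calculation: the linear dependence of the relative rate on burnup, the slope magnitude, and the measurement uncertainty are all illustrative assumptions, chosen only so that the slope vanishes near a MOX fraction of 75%, as the abstract describes.

```python
import numpy as np

def relative_rate(mox_fraction, burnup):
    """Illustrative model of the antineutrino rate, normalized to its
    initial value: LEU cores (mox_fraction = 0) show a declining rate
    with burnup, full-MOX cores (mox_fraction = 1) an increasing one.
    The slope is an assumed linear interpolation vanishing at ~75% MOX."""
    slope = 0.12 * (mox_fraction / 0.75 - 1.0)  # assumed magnitude
    return 1.0 + slope * burnup

def chi2_between(frac_a, frac_b, sigma=0.01, n_points=20):
    """Chi-square separation between the relative-rate curves of two
    declared MOX fractions, assuming an uncorrelated fractional
    uncertainty `sigma` on each of `n_points` rate measurements."""
    b = np.linspace(0.0, 1.0, n_points)  # burnup, normalized to one cycle
    ra = relative_rate(frac_a, b)
    rb = relative_rate(frac_b, b)
    return float(np.sum(((ra - rb) / sigma) ** 2))

# A larger difference in MOX fraction yields a larger test statistic,
# i.e. the two core compositions are easier to distinguish.
```

Because the curves are normalized to the initial rate, an overall rescaling of the absolute rate (for example, from an imperfect thermal power estimate) cancels, which is the robustness property the rate-only approach exploits.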
