Abstract

The most fundamental method for determining the absorbed dose to water from radiation beams is to measure the resulting temperature rise of the water. Water calorimeters used for this purpose are not irradiated uniformly; the spatial dose gradients within the calorimeter therefore give rise to significant heat-transport effects that interfere with the measurement of the small temperature rise associated with radiotherapy doses (e.g. 240 μK per Gy, at a typical dose rate of 1 Gy/min). When subjected to periodic exposure to radiation, the calorimeter response (registered as the temperature at a point in the water) reaches a steady-state oscillation in which the effects of heat conduction and convection can be studied as a function of modulation frequency, so that appropriate correction factors can be derived for the desired operating conditions. Theoretical treatment of this behavior is greatly simplified if the experimental conditions can be arranged so that convection is negligible. While this is usually done by refrigerating the calorimeter to 4 °C, where the thermal expansion coefficient of water is zero (so that buoyancy forces due to density fluctuations effectively vanish), we present evidence suggesting that such conditions can be achieved at room temperature by reducing the duty cycle, i.e. the effective "on" time of the radiation at a given shutter frequency. Results are presented here in the form of system transfer functions obtained at duty cycles of 50% and 3.5%. Comparison of these measured transfer functions with the output of a three-dimensional finite-element model indicates that convection is greatly suppressed in the lower duty cycle case. The experimental uncertainty in these measurements is too large to conclude that convection is eliminated altogether, but more extended runs using this technique should enable a more definitive judgment as to whether the simpler conduction-only corrections can be applied.
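The quoted temperature rise per unit dose can be checked directly from the specific heat capacity of water, $c_w \approx 4184\ \mathrm{J\,kg^{-1}\,K^{-1}}$, together with the definition $1\ \mathrm{Gy} = 1\ \mathrm{J\,kg^{-1}}$ (a standard relation, not stated explicitly in the abstract):

$$
\Delta T = \frac{D}{c_w} = \frac{1\ \mathrm{J\,kg^{-1}}}{4184\ \mathrm{J\,kg^{-1}\,K^{-1}}} \approx 239\ \mu\mathrm{K},
$$

in agreement with the figure of 240 μK per Gy cited above; the exact value shifts slightly with water temperature through $c_w$.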
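The abstract describes measuring system transfer functions under periodic, duty-cycled irradiation but does not give the analysis procedure. The Python sketch below shows one plausible way such a transfer function could be estimated: take the Fourier components of the measured temperature record and of the square-wave dose-rate input at the shutter fundamental and its harmonics, and form their ratio. All names, parameters, and the input model here are illustrative assumptions, not the authors' code.

```python
import numpy as np

def duty_cycle_input(t, period, duty, dose_rate):
    """Square-wave dose-rate input (Gy/s): beam 'on' for duty*period of each cycle.
    Illustrative model only; the real beam profile is set by the shutter."""
    return np.where((t % period) < duty * period, dose_rate, 0.0)

def transfer_function(t, temperature, dose_rate, period, n_harmonics=5):
    """Estimate H(f) = T(f)/D(f) at the shutter fundamental and its harmonics.

    t            -- uniformly sampled time array (s)
    temperature  -- measured temperature record at a point in the water (K)
    dose_rate    -- the modulated dose-rate input over the same samples (Gy/s)
    """
    dt = t[1] - t[0]
    freqs = np.fft.rfftfreq(len(t), dt)
    T_hat = np.fft.rfft(temperature)
    D_hat = np.fft.rfft(dose_rate)
    f0 = 1.0 / period  # shutter (modulation) frequency
    H = []
    for k in range(1, n_harmonics + 1):
        idx = np.argmin(np.abs(freqs - k * f0))  # nearest FFT bin to harmonic k
        # Skip harmonics where the input has essentially no power
        # (e.g. even harmonics of a 50% duty-cycle square wave).
        if np.abs(D_hat[idx]) > 1e-9 * np.abs(D_hat).max():
            H.append((k * f0, T_hat[idx] / D_hat[idx]))
    return H
```

One relevant property of this input: at 50% duty cycle the even harmonics of the square wave vanish, whereas a short pulse train such as 3.5% duty spreads appreciable power over many harmonics, so a single run probes the thermal response at many more frequencies for comparison against a conduction-only model.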