Total body water (TBW) and total energy expenditure (TEE) are routinely measured in free-living conditions by the ²H₂¹⁸O (doubly labeled water) method. Isotope eliminations can be measured from spot urine samples by HTC-EA IRMS, but only after cumbersome cryogenic distillation to extract the water. Distillation may, however, be replaced by charcoal treatment and filtration. This study tested (1) the effect of sample treatment (filtration versus distillation) on the isotope ratios, (2) the effect of normalization schemes that do or do not respect the principle of identical treatment of sample and references, and (3) the impact on the biological outcomes. Two filters (PES membrane; 10 kDa cut-off) accepting different urine volumes (V500: 0.5 mL versus V6: 3.0 mL) were tested. In-house water standards and in-house urine standards, normalized against the international scale, were prepared to calibrate the urine samples. The δ²H and δ¹⁸O values of water in the urine were measured by HTC-EA IRMS. Filtered urine normalized with water standards showed a bias in the δ²H values that was corrected when calibration was performed with urine standards. At a δ²H value of 1101.4‰, the accuracy improved from −11.9 to −0.2 δ‰ (V500) and from −3.8 to 0.4 δ‰ (V6). The TBW errors were greatest with V500 and water calibration (1.20%) and lowest with V6 and urine calibration (0.34%; preparation-by-calibration interaction p = 0.027). For the δ¹⁸O values, the accuracy of the enrichments and of TBW was unaffected whatever preparation and normalization were used. The average TEE was not affected, but its variability increased from 0.6% to 2.7% versus cryogenic distillation. Cryogenic distillation remains the gold standard for small-sample experiments in which small changes in TEE are to be detected; filtration offers an alternative for large-scale experiments.
When body composition is derived from ²H₂O dilution, it is strongly recommended that urine standards be used to eliminate the effect of filtration.
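The dilution principle behind the TBW estimate can be illustrated with a minimal sketch of the standard IAEA plateau equation. All numeric values below are hypothetical and do not come from this study; the function names are illustrative, and the 1.041 factor is the conventional correction for the ²H dilution space exceeding TBW.

```python
def dilution_space_mol(W, A, a, d_dose_dil, d_tap, d_post, d_pre):
    """Isotope dilution space N (mol) from the IAEA plateau equation:
        N = (W * A / (18.02 * a)) * (d_dose_dil - d_tap) / (d_post - d_pre)
    W: water (g) used to dilute a dose aliquot for analysis
    A: dose administered (g); a: dose aliquot (g) diluted in W
    d_*: delta values (per mil) of the diluted dose, tap water,
         post-dose plateau urine, and pre-dose (baseline) urine.
    18.02 is the molar mass of water (g/mol)."""
    return (W * A) / (18.02 * a) * (d_dose_dil - d_tap) / (d_post - d_pre)

def tbw_kg_from_2h(N_mol, correction=1.041):
    """TBW (kg): the 2H dilution space overestimates body water,
    so it is divided by the conventional 1.041 exchange correction."""
    return N_mol * 18.02 / 1000.0 / correction

# Hypothetical example: 10 g dose, 0.1 g aliquot diluted in 500 g water
N = dilution_space_mol(W=500.0, A=10.0, a=0.1,
                       d_dose_dil=650.0, d_tap=-50.0,
                       d_post=740.0, d_pre=-60.0)
tbw = tbw_kg_from_2h(N)  # roughly 42 kg, a plausible adult TBW
```

A calibration bias in δ²H, such as the one filtration introduced here, propagates directly into `d_post - d_pre` and hence into TBW, which is why identical treatment of samples and standards matters.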