Abstract

Very little information exists on the amount of natural and artificial UV light required to cause sunburn and tanning in individuals with very pale skin, who are at the greatest risk of developing skin cancer. We investigated the minimal erythema dose (MED) and minimal melanogenic dose (MMD) in a group of 31 volunteers with Fitzpatrick skin types I and II, using an Oriel 1000 W xenon arc solar simulator and natural sunlight in Sydney, Australia. Erythemal and melanogenic responses were measured by conventional visual scoring, a chromameter and an erythema meter. The average MED measured visually with the artificial UV source was 68.7 +/- 3.3 mJ/cm2 (3.4 +/- 0.2 standard erythema doses [SED]), which differed significantly from the MED for sunlight of 93.6 +/- 5.6 mJ/cm2 (11.7 +/- 0.7 SED; P < 0.001). We also found significant correlations between the solar-simulated MED values, the melanin index (erythema meter) and the L* function (chromameter). The average MMD (obtained in 16 volunteers only) using solar-simulated light was 85.6 +/- 4.9 mJ/cm2, significantly less than that measured with natural sunlight (118.3 +/- 8.6 mJ/cm2; P < 0.05). We mathematically modeled the chromameter and erythema meter data to determine whether a more objective measure of MED, and better differentiation between skin types, could be obtained. Using this model, we were able to detect erythemal responses with the erythema index function of the erythema meter and the a* function of the chromameter at lower UV doses than with either the standard visual or COLIPA methods.
