Abstract

We present the results of an integrated laboratory and modeling investigation into the impact of stellar X-rays on cosmic dust. Carbonaceous grains were prepared in a cooled (<200 K) supersonic expansion from aromatic molecular precursors and later irradiated with 970 eV X-rays. Silicate (enstatite) grains were prepared via laser ablation, thermally annealed, and later irradiated with 500 eV X-rays. Infrared spectra of the 3.4 μm band of the carbonaceous sample prepared from benzene revealed a band-area loss of 84% ± 5% for an X-ray dose of 5.2 × 10²³ eV cm⁻². Infrared spectra of the 8–12 μm Si–O band of the silicate sample revealed a band-area loss of up to 63% ± 5% for a dose of 2.3 × 10²³ eV cm⁻². A hybrid Monte Carlo particle-trajectory approach was used to model the propagation of X-rays and of the ensuing photoelectrons, Auger electrons, and collisionally ionized electrons through the bulk. As a result of X-ray ionization and the ensuing Coulomb explosions of surface molecules, the calculated mass loss is 60% for the carbonaceous sample and 46% for the silicate sample, within a factor of 2 of the infrared band-area loss, supporting an X-ray-induced mass-loss mechanism. We apply the laboratory X-ray destruction rates to estimate the lifetimes of dust grains in protoplanetary disks surrounding 1 M⊙ G-type and 0.1 M⊙ M-type stars. In both cases, X-ray destruction timescales are short (a few million years) at the disk surface, but are found to be much longer than typical disk lifetimes (≳10 Myr) over the disk bulk.
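To illustrate how a laboratory dose-dependent band loss can be turned into an astrophysical destruction timescale, the sketch below assumes single-exponential (first-order) destruction kinetics and a placeholder surface X-ray energy flux; the kinetic form, the flux value `F_x`, and the function names are illustrative assumptions, not the model actually used in the paper.

```python
import numpy as np

# Hypothetical sketch: convert a measured band-area loss at a known X-ray dose
# into a first-order destruction cross section, then into an e-folding
# destruction timescale for an assumed local X-ray energy flux.
# The exponential-decay form and the flux value are assumptions.

def destruction_cross_section(fraction_lost, dose_eV_cm2):
    """Cross section (cm^2 per eV), assuming A(D) = A0 * exp(-sigma * D)."""
    return -np.log(1.0 - fraction_lost) / dose_eV_cm2

def destruction_timescale_yr(sigma_cm2_per_eV, flux_eV_cm2_s):
    """e-folding destruction time (yr) for a given X-ray energy flux."""
    seconds_per_year = 3.156e7
    return 1.0 / (sigma_cm2_per_eV * flux_eV_cm2_s) / seconds_per_year

# Carbonaceous (benzene-derived) sample: 84% band-area loss at 5.2e23 eV cm^-2
sigma_c = destruction_cross_section(0.84, 5.2e23)   # ~3.5e-24 cm^2/eV

# Silicate (enstatite) sample: 63% band-area loss at 2.3e23 eV cm^-2
sigma_si = destruction_cross_section(0.63, 2.3e23)  # ~4.3e-24 cm^2/eV

# Placeholder X-ray energy flux near a disk surface (hypothetical value)
F_x = 1.0e10  # eV cm^-2 s^-1

print(f"carbon:   sigma = {sigma_c:.2e} cm^2/eV, "
      f"tau = {destruction_timescale_yr(sigma_c, F_x):.2e} yr")
print(f"silicate: sigma = {sigma_si:.2e} cm^2/eV, "
      f"tau = {destruction_timescale_yr(sigma_si, F_x):.2e} yr")
```

With these assumed numbers the e-folding timescales come out near a million years, of the same order as the "few million years" quoted for the disk surface; deeper in the disk the attenuated flux would lengthen the timescale accordingly.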