Abstract

Radiofrequency catheter ablation is a mainstay in the treatment of a variety of symptomatic arrhythmias and has proved to be safe and effective in this application. Heating of tissue with radiofrequency energy in the intracardiac environment gives rise to interesting thermodynamic phenomena. Most of these are favorable, some are not, but they are frequently not recognized and sometimes misunderstood.

Article see p 838

The mechanism of tissue injury in response to radiofrequency energy ablation is thermal.1–3 During catheter ablation, radiofrequency electric current passes into tissue and produces resistive heating of that tissue, much as the filament of an incandescent light bulb is heated. This direct resistive tissue heating (also referred to as “volume heating”) is proportional to power density and therefore falls off rapidly with increasing distance from the electrode source (1/r⁴).4,5 Most tissue heating, however, is not due to direct resistive heating but rather results from heat conducted from the narrow rim of resistive heating into deeper tissue layers. If the depth of resistive heating can be increased, the depth of conductive heating increases proportionally, and lesion size and volume increase. Unfortunately, the maximum power that can be delivered to the catheter tip is limited by the risk of overheating at the electrode-tissue interface. If 100°C is exceeded, blood boils, proteins denature, and coagulum and char form. …
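The 1/r⁴ dependence follows from simple geometry: for a roughly point-like electrode, current density falls off as 1/r², and resistive power density scales with the square of the current density. The short sketch below is an illustration of that scaling only, not part of the editorial; the function name and the chosen distances are hypothetical, and real electrodes are not ideal point sources.

    # Illustrative sketch: relative resistive ("volume") heating versus distance
    # from an idealized point-like electrode.
    # Assumption: current density J falls as 1/r^2, and resistive power density
    # is proportional to J^2, giving the 1/r^4 falloff cited above.

    def relative_power_density(r_mm: float, r_ref_mm: float = 1.0) -> float:
        """Power density at r_mm relative to its value at r_ref_mm (both in mm)."""
        return (r_ref_mm / r_mm) ** 4

    for r in (1.0, 2.0, 3.0, 4.0):
        print(f"{r:.0f} mm from the electrode: "
              f"{relative_power_density(r):.3%} of the 1 mm value")

Under these assumptions, power density 2 mm from the source is already only about 6% of its value at 1 mm, which is why direct resistive heating is confined to a narrow rim and deeper lesion growth depends on conducted heat.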
