Abstract

Factors affecting the accuracy of ionization gauge measurements at low pressures are reviewed. In hot-cathode gauges these include electron-stimulated desorption at the electron collector, forward and reverse x-ray effects, the potential of the gauge envelope, outgassing, and various controller-related errors. In cold-cathode gauges they include nonlinearities below the “magnetron knee,” plasma instabilities, and background currents. Case studies are given to illustrate many of these sources of error and their elimination. The case studies were gathered in the course of long-term stability measurements on over 30 ionization gauges at pressures ranging from 10⁻⁸ to 10⁻¹¹ Torr. The investigation included Bayard–Alpert (both conventional and modulated), extractor, inverted magnetron, double inverted magnetron, and magnetron gauges. Errors caused by outgassing from virtual leaks in all-metal seals are also considered. By far the largest and most unpredictable source of error proved to be electron-stimulated desorption in hot-cathode gauges, avoidable by operating the grid at a sufficiently high temperature. It is concluded that, with proper precautions, better than 10% reproducibility in the 10⁻¹⁰ Torr range is easily achievable over periods of many months with either hot-cathode or cold-cathode gauges. Over shorter periods, reproducibility of better than 1% is obtainable with well-designed hot-cathode gauges.
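
For orientation, the standard operating relation of a hot-cathode ionization gauge (not stated in the abstract, but widely used) shows how any pressure-independent background current, such as the x-ray photocurrent or an ESD ion current, appears as an apparent pressure offset:

\[
P_{\text{ind}} = \frac{I_c}{S\, I_e} = P + \frac{I_{\text{bg}}}{S\, I_e},
\]

where \(I_c\) is the ion-collector current, \(I_e\) the electron emission current, \(S\) the gauge sensitivity (in Torr\(^{-1}\)), and \(I_{\text{bg}}\) the pressure-independent part of the collector current. The term \(I_{\text{bg}}/(S I_e)\) sets the apparent low-pressure limit of the gauge, which is why the background-current errors discussed above dominate in the 10⁻¹⁰–10⁻¹¹ Torr range.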
