Abstract

Methods of calibrating circuits for measuring partial discharges and radio-interference (r.i.) voltages are discussed. It is shown that discharge-measuring circuits, whether using a peak-measuring discharge detector or an r.i. meter, are best calibrated by applying a charge-quantity pulse calibrator of known repetition frequency in parallel with the test sample. The calibration of circuits for measuring r.i. voltages is more complex, and certain anomalies arise when a sine-wave signal generator is used to calibrate the circuit in terms of voltage input. It is established that, for a pulse-repetition rate of 100 pulse/s, the results obtained from a conventional peak-reading discharge detector can be related to those from a quasi-peak r.i. meter; e.g. for a meter having a 60 Ω input impedance and a bandwidth of 9 kHz, the relationship is: 1 μV is equivalent to 2.6 pC. At repetition rates other than 100 pulse/s, it is shown that the theoretical maximum error incurred by using the above relationship is 6 dB over the range 25 to 2000 pulse/s. Practical measurements confirm that the error does not exceed ±6 dB over a wide range of pulse sizes and repetition frequencies. Comparative measurements have shown that the conventional r.i. meter is approximately an order of magnitude less sensitive than a tuned-circuit discharge detector, and methods of improving its sensitivity are described. It is concluded that, for many applications, either a discharge detector or an r.i. meter will enable both discharge magnitude and r.i. voltage to be determined with sufficient accuracy, given the erratic behaviour of discharge pulses in practice.
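As a numerical illustration of the equivalence quoted above, the following sketch (hypothetical, not taken from the paper) converts a quasi-peak r.i. meter reading into an approximate discharge magnitude. It applies the stated 2.6 pC/μV factor for a 60 Ω, 9 kHz meter at 100 pulse/s, and treats the quoted ±6 dB figure as a worst-case band for repetition rates between 25 and 2000 pulse/s; since both quantities are amplitude-like, 6 dB is taken as a factor of about 2.

```python
# Illustrative conversion of a quasi-peak r.i. meter reading (microvolts)
# to an approximate discharge magnitude (picocoulombs), using the abstract's
# stated equivalence of 1 uV = 2.6 pC (60-ohm input, 9 kHz bandwidth,
# 100 pulse/s) and its +/-6 dB error bound for 25-2000 pulse/s.

PC_PER_UV = 2.6   # pC per uV at 100 pulse/s (60 ohm, 9 kHz bandwidth)
ERROR_DB = 6.0    # worst-case error away from 100 pulse/s

def ri_to_discharge(ri_microvolts: float) -> tuple[float, float, float]:
    """Return (nominal, low, high) discharge estimates in pC."""
    nominal = ri_microvolts * PC_PER_UV
    factor = 10 ** (ERROR_DB / 20)  # 6 dB on an amplitude ratio is roughly x2
    return nominal, nominal / factor, nominal * factor

nominal, low, high = ri_to_discharge(10.0)  # a 10 uV meter reading
print(f"{nominal:.1f} pC nominal ({low:.1f}-{high:.1f} pC within +/-6 dB)")
```

For a 10 μV reading this gives a nominal 26 pC, bracketed between roughly 13 pC and 52 pC, which is the sense in which the abstract claims the two instruments agree "with sufficient accuracy" given the erratic behaviour of real discharge pulses.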
