Abstract

In this paper, a matching condition is given that ensures the maximum linear interval of the quadrant detector output signal: a ratio of spot size to photosurface radius corresponding to the optimum linear characteristic of the output signal. For this optimum output signal, a subsection linearization method is proposed to fit the characteristic parameters of the signal. Simulation results show that, when the condition and the method are used together, the maximum linear interval of the quadrant detector is obtained: compared with multi-point calibration and least-squares fitting over the entire linear interval, the linear interval of the detector expands by 20 to 30 percent and the average nonlinearity error decreases to one tenth.
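The following is a minimal sketch, not the authors' implementation, of the two ideas in the abstract: the quadrant-detector x-channel response, whose shape is set by the spot-size-to-photosurface-radius ratio, and subsection (piecewise) linearization compared against a single least-squares line over the whole interval. The Gaussian-spot model sigma(x) = erf(sqrt(2) x / w), the w/R ratio of 1.0, and the choice of eight subsections are illustrative assumptions, not values taken from the paper.

```python
# Sketch of quadrant-detector response and subsection linearization.
# Assumptions (not from the paper): Gaussian spot, erf response model,
# normalized radii, 8 subsections, +/-0.8R calibration range.
import numpy as np
from scipy.special import erf

R = 1.0          # photosurface radius (normalized, assumed)
w = 1.0          # Gaussian spot radius at 1/e^2 (assumed ratio w/R = 1)
x = np.linspace(-0.8 * R, 0.8 * R, 401)    # spot displacements to calibrate
sigma = erf(np.sqrt(2.0) * x / w)          # ideal difference-over-sum signal

def piecewise_fit(x, y, n_sections):
    """Fit one least-squares line per subsection; return the fitted values."""
    y_fit = np.empty_like(y)
    edges = np.linspace(x[0], x[-1], n_sections + 1)
    for lo, hi in zip(edges[:-1], edges[1:]):
        m = (x >= lo) & (x <= hi)
        k, b = np.polyfit(x[m], y[m], 1)   # slope, intercept on this section
        y_fit[m] = k * x[m] + b
    return y_fit

# Single global least-squares line vs. piecewise lines over the same interval.
k, b = np.polyfit(x, sigma, 1)
err_global = np.abs(sigma - (k * x + b)).mean()
err_piecewise = np.abs(sigma - piecewise_fit(x, sigma, 8)).mean()
print(f"mean nonlinearity, global fit:    {err_global:.4f}")
print(f"mean nonlinearity, 8-section fit: {err_piecewise:.4f}")
```

Under these assumptions the per-section fits track the erf-shaped response far more closely than one global line, which is the qualitative effect the abstract reports; the paper's quantitative gains depend on its specific matching condition and fitting procedure.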
