The refractive index of seawater is an essential parameter in ocean observation, and its high-precision measurement is therefore necessary. Such measurements can be achieved effectively with a position-sensitive detector-based measurement system; however, in practice the measurement error introduced by the jitter signal cannot be ignored. In this study, we theoretically analysed the causes of large jitter signals during seawater refractive index measurements and quantified the influencing factors. This analysis showed that the angle between the two windows of the seawater refractive index measurement region introduced a large error in the results, which could be effectively reduced by keeping the angle within 2.06°. The factors affecting the position-sensitive detector's measurement accuracy were also analysed: background light, the size of the photosensitive surface, and variations in the temperature of the working environment all degrade the accuracy. To address these factors, we first placed a narrow-band filter with a 0.9 nm bandwidth in front of the detector's photosensitive surface during system construction to block all light other than that from the signal light source. To preserve the measuring range of the seawater refractive index, a position-sensitive detector with a 4 mm × 4 mm photosensitive surface was selected, and to reduce temperature variations in the working environment, the measurement system was partitioned. To validate the error range of the optimised system, standard seawater samples were measured under identical conditions; the measurement system's jitter signal decreased from 0.0022 mm before optimisation to 0.0011 mm after optimisation, and the deviation of the measured refractive index decreased accordingly. The experimental results show that the measurement error of the seawater refractive index was effectively reduced by adjusting the optical path and structure of the measurement system.
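As a rough illustrative sketch (not part of the paper), the code below shows how a beam-spot position is commonly recovered from the four electrode currents of a duo-lateral position-sensitive detector with the 4 mm × 4 mm photosensitive surface mentioned above, and how a given spot jitter would translate into an apparent refractive-index deviation; the detector type and the instrument sensitivity used here are assumptions, not values reported in the study.

```python
# Minimal sketch under stated assumptions, not the authors' implementation.
# It illustrates (1) the standard spot-position readout of a duo-lateral 2-D
# PSD and (2) how a beam-spot jitter maps to an apparent refractive-index
# error for an assumed (hypothetical) instrument sensitivity.

PSD_SIDE_MM = 4.0                  # 4 mm x 4 mm photosensitive surface (from the abstract)
SENSITIVITY_MM_PER_RIU = 1.0e3     # hypothetical spot shift per refractive-index unit


def psd_position(i_x1: float, i_x2: float, i_y1: float, i_y2: float) -> tuple[float, float]:
    """Spot position (mm) on a duo-lateral PSD from its four electrode currents."""
    x = 0.5 * PSD_SIDE_MM * (i_x2 - i_x1) / (i_x1 + i_x2)
    y = 0.5 * PSD_SIDE_MM * (i_y2 - i_y1) / (i_y1 + i_y2)
    return x, y


def apparent_index_error(jitter_mm: float) -> float:
    """Apparent refractive-index deviation caused by a given spot jitter."""
    return jitter_mm / SENSITIVITY_MM_PER_RIU


if __name__ == "__main__":
    # Jitter amplitudes reported in the abstract, before and after optimisation.
    for label, jitter in (("before", 0.0022), ("after", 0.0011)):
        print(f"{label} optimisation: jitter {jitter} mm -> "
              f"apparent dn ~ {apparent_index_error(jitter):.1e} RIU")
```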