The laser heterodyne radiometer (LHR) offers miniaturization, low cost, and high spectral resolution as a ground-based validation instrument for satellite observations of atmospheric trace-gas concentrations. To verify the accuracy of LHR measurements, a new performance evaluation method is presented here, based on an amplified spontaneous emission (ASE) source and a CO2 absorption cell in the laboratory. A preliminary simulation analysis based on the system parameters of the LHR is carried out for the performance analysis and data processing of this combined test system. According to the simulation results, for a laser frequency deviation of less than 30 MHz, the retrieval error, which increases with filter bandwidth, stays within 1 ppm over the full photodetector bandwidth (1.2 GHz) when the instrument line shape (ILS) is calibrated. Meanwhile, when the filter bandwidth is less than 200 MHz, the maximum error without ILS correction does not exceed 0.07 ppm. Moreover, with an ideal 60 MHz bandpass filter and no ILS correction, the LHR's signal-to-noise ratio (SNR) must exceed 20 to keep the retrieval error below 1 ppm. At an SNR of 100, the retrieval error is 0.206 ppm and 0.265 ppm without and with the system uncertainties (temperature and pressure) considered, respectively. Combining all the error terms geometrically (root sum of squares), the total retrieval error is 0.528 ppm at a spectral resolution of 0.004 cm−1, which meets the 1 ppm measurement-accuracy requirement. In the experiment, heterodyne signals are retrieved and analyzed for different XCO2 values in the range of 400–420 ppm in the absorption cell. The experimental results agree well with the simulation and confirm that the LHR achieves an accuracy better than 1 ppm at an SNR of 100. The LHR will be used to measure the atmospheric CO2 column concentration in the future and could serve as an effective ground-based validation instrument for spaceborne CO2-sounding sensors.
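The "geometrically added" total error refers to root-sum-square combination of independent error terms. A minimal sketch of that combination follows; the component values used here are hypothetical placeholders, not the paper's actual error budget:

```python
import math

def rss(errors):
    """Root-sum-square ("geometric") combination of independent error terms."""
    return math.sqrt(sum(e * e for e in errors))

# Hypothetical component errors in ppm -- illustrative placeholders only;
# the abstract reports a combined retrieval error of 0.528 ppm but does
# not enumerate every individual term.
components = [0.1, 0.2, 0.25]
print(f"combined error: {rss(components):.3f} ppm")
```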