Space-based ground-imaging lidar has become increasingly feasible with recent technological advances. Compact fiber lasers and single-photon-sensitive Geiger-mode detector arrays push designs toward low pulse energies and high pulse rates. A challenge in implementing such a system is imperfect pointing knowledge caused by angular jitter, a problem exacerbated by the long distance between the satellite and the ground. Without mitigation, angular jitter would significantly blur the 3-D data products. Reducing the pointing-knowledge error enough to avoid such blurring might require extreme mechanical isolation, advanced inertial measurement units (IMUs), star trackers, or auxiliary passive optical sensors, all of which can add considerable cost, size, weight, and power. An alternative approach is demonstrated in which the two-axis jitter time series is estimated from the lidar data alone, while a single-surface model of the ground is estimated simultaneously as a set of nuisance parameters. Expectation–maximization is used to separate signal detections from background detections while maximizing the joint posterior probability density of the jitter and surface states. When the resulting jitter estimates are used in coincidence processing or image reconstruction, the blurring caused by jitter can be reduced to a level comparable to the optical diffraction limit.
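To make the estimation idea concrete, the following is a minimal one-dimensional sketch, not the algorithm used in this work. It assumes a simplified data model in which each pulse yields photon detections carrying an apparent cross-track coordinate and a height; signal returns come from an unknown ground profile shifted by an unknown per-pulse jitter offset, background detections are uniform in height, and the noise level and signal fraction are known. All names and parameter values are hypothetical. The EM loop soft-classifies detections as signal versus background (E-step), then updates the surface grid and the jitter series under a Gaussian jitter prior (M-step).

```python
# Toy 1-D EM sketch: joint jitter + surface estimation from photon detections.
# Hypothetical model and parameters; a simplification of the 2-axis problem.
import numpy as np

rng = np.random.default_rng(0)

# ---- synthetic scene (toy stand-in for real lidar data) --------------------
n_pulses, n_det = 200, 40                 # pulses, detections per pulse
x_grid = np.linspace(0.0, 10.0, 101)      # surface grid (nuisance parameters)
dx = x_grid[1] - x_grid[0]
h_true = np.sin(x_grid)                   # unknown ground profile
d_true = 0.3 * np.sin(np.arange(n_pulses) / 15.0)   # unknown jitter series
sigma = 0.05                              # height noise std (assumed known)
z_lo, z_hi = -2.0, 2.0                    # background height window
w_sig = 0.7                               # signal fraction (assumed known)

x = rng.uniform(0.5, 9.5, (n_pulses, n_det))
is_sig = rng.random((n_pulses, n_det)) < w_sig
z_clean = np.interp(x - d_true[:, None], x_grid, h_true)
z = np.where(is_sig, z_clean + sigma * rng.normal(size=x.shape),
             rng.uniform(z_lo, z_hi, x.shape))

def binned_mean(xs, ws, zs, fallback):
    """Responsibility-weighted surface update on the fixed grid."""
    idx = np.clip(np.round((xs - x_grid[0]) / dx).astype(int), 0, len(x_grid) - 1)
    num = np.bincount(idx, weights=ws * zs, minlength=len(x_grid))
    den = np.bincount(idx, weights=ws, minlength=len(x_grid))
    return np.where(den > 1e-9, num / np.maximum(den, 1e-9), fallback)

# ---- EM: jointly estimate jitter d and surface h ---------------------------
d_est = np.zeros(n_pulses)
h_est = binned_mean(x.ravel(), np.ones(x.size), z.ravel(), 0.0)  # crude init
d_prior_var = 0.3 ** 2                    # Gaussian prior on jitter (MAP term)
d_cand = np.linspace(-0.6, 0.6, 121)      # candidate offsets for the search

for _ in range(30):
    # E-step: probability each detection is signal, given current d, h
    mu = np.interp(x - d_est[:, None], x_grid, h_est)
    p_sig = (w_sig * np.exp(-0.5 * ((z - mu) / sigma) ** 2)
             / (sigma * np.sqrt(2.0 * np.pi)))
    p_bg = (1.0 - w_sig) / (z_hi - z_lo)
    r = p_sig / (p_sig + p_bg)

    # M-step, surface: weighted average of z over de-jittered coordinates
    h_est = binned_mean((x - d_est[:, None]).ravel(), r.ravel(), z.ravel(), h_est)

    # M-step, jitter: per-pulse 1-D search maximizing the weighted posterior
    for t in range(n_pulses):
        mu_c = np.interp(x[t][None, :] - d_cand[:, None], x_grid, h_est)
        ll = -0.5 * np.sum(r[t] * ((z[t] - mu_c) / sigma) ** 2, axis=1)
        ll -= 0.5 * d_cand ** 2 / d_prior_var        # Gaussian jitter prior
        d_est[t] = d_cand[np.argmax(ll)]

print("jitter RMS error:", np.sqrt(np.mean((d_est - d_true) ** 2)))
```

In the full problem the jitter is two-axis and the posterior could also encode temporal correlation of the jitter; the toy keeps the geometry one-dimensional and replaces continuous optimization with a discrete grid search for clarity.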