Quasars are bright, unobscured active galactic nuclei (AGN) thought to be powered by the accretion of matter onto supermassive black holes at the centers of galaxies. The temporal variability of a quasar's brightness contains valuable information about its physical properties. The UV/optical variability is thought to be a stochastic process, often represented as a damped random walk described by a stochastic differential equation (SDE). Upcoming wide-field telescopes such as the Rubin Observatory Legacy Survey of Space and Time (LSST) are expected to observe tens of millions of AGN in multiple filters over a ten-year period, so there is a need for efficient and automated modeling techniques that can handle the large volume of data. Latent SDEs are machine learning models well suited to modeling quasar variability, as they explicitly capture the underlying stochastic dynamics. In this work, we adapt latent SDEs to jointly reconstruct multivariate quasar light curves and infer their physical properties, such as the black hole mass, inclination angle, and temperature slope. Our model is trained on realistic simulations of ten-year LSST quasar light curves, and we demonstrate its ability to reconstruct quasar light curves even in the presence of long seasonal gaps and irregular sampling across different bands, outperforming a multi-output Gaussian process regression baseline. Our method has the potential to provide a deeper understanding of the physical properties of quasars and is applicable to a wide range of other multivariate time series with missing data and irregular sampling.
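
To make the damped-random-walk (DRW) model mentioned above concrete, the sketch below simulates a single-band DRW light curve at irregular observation times with a seasonal gap. The DRW is an Ornstein-Uhlenbeck process, so its exact conditional update between arbitrary time stamps is available in closed form; the function name `simulate_drw` and the parameter values (a 100-day damping timescale, 0.2 mag amplitude) are illustrative assumptions, not the paper's simulation pipeline.

```python
import numpy as np

def simulate_drw(times, tau=100.0, sigma=0.2, mu=0.0, rng=None):
    """Simulate a damped random walk (Ornstein-Uhlenbeck process)
    at arbitrary, possibly irregular, observation times.

    times : 1-D array of observation times (days), strictly increasing
    tau   : damping timescale (days)
    sigma : stationary standard deviation of the process (mag)
    mu    : mean magnitude
    """
    rng = np.random.default_rng(rng)
    times = np.asarray(times, dtype=float)
    x = np.empty(len(times))
    # Start from the stationary distribution N(mu, sigma^2).
    x[0] = mu + sigma * rng.standard_normal()
    for i in range(1, len(times)):
        dt = times[i] - times[i - 1]
        a = np.exp(-dt / tau)  # exact OU decay factor across the gap dt
        # Conditional mean relaxes toward mu; conditional variance is
        # sigma^2 * (1 - a^2), valid for any gap size (seasonal gaps included).
        x[i] = mu + a * (x[i - 1] - mu) \
            + sigma * np.sqrt(1.0 - a**2) * rng.standard_normal()
    return x

# Irregularly sampled seasons separated by a long gap, mimicking a
# ground-based cadence (hypothetical numbers for illustration).
times = np.concatenate([np.arange(0.0, 180.0, 3.0),
                        np.arange(365.0, 545.0, 3.0)])
lightcurve = simulate_drw(times, tau=100.0, sigma=0.2, mu=19.0, rng=0)
```

Because the update is exact rather than an Euler discretization, large gaps cost nothing in accuracy: the process simply decorrelates toward its mean, which is the same behavior a fitted model must extrapolate through LSST's seasonal gaps.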