Many terminal lakes in agricultural basins are prone to eutrophication because of restricted inflows and excess nutrient loading from their basins. Combining satellite data with machine learning models is a low-cost way to monitor the root-cause water quality variables (WQVs) of eutrophication. This study investigates the potential of remote sensing-based machine learning algorithms to model chlorophyll-a (Chl-a), total phosphorus (TP), Secchi disk depth (SD), and the Carlson trophic state index (CTSI) in the north part of Lake Urmia (LU). Multiple linear regression (MLR) and artificial neural network (ANN) models were developed using Landsat-8 (L8) and Sentinel-2 (S2) data together with nearly concurrent in-situ WQVs from the north part of LU, collected from February 2016 to January 2017. Results showed that models based on L8 were superior to those based on S2. Moreover, the L8-based ANN models for Chl-a, SD, and TP (NSE = 0.75, 0.98, and 0.96, respectively) outperformed the corresponding MLRs (NSE = 0.74, 0.81, and 0.58). Applying atmospheric correction (i.e., ACOLITE, C2RCC, and C2RCCX) further enhanced the models. The resulting Chl-a and SD maps showed an inverse spatiotemporal pattern that agrees with the variation of abiotic conditions in the lake (e.g., surface temperature and total suspended sediments). According to the CTSI maps, the north part of LU was mesotrophic in February and March and eutrophic between June and October 2016. Our study demonstrates the promising application of remote sensing-based machine learning algorithms for modeling the spatiotemporal variation of eutrophication in LU, providing valuable insights into cost-effective lake monitoring.
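For context, the CTSI classification and the NSE skill score reported above follow standard definitions: Carlson's index averages three component indices computed from SD, Chl-a, and TP, and NSE compares model error against the variance of the observations. A minimal sketch of both, assuming the usual Carlson (1977) coefficients and common class breaks (function names and example values are illustrative, not taken from the study):

```python
import math

def carlson_tsi(sd_m, chl_ugl, tp_ugl):
    """Carlson trophic state index, averaged over the three component
    indices (the usual CTSI formulation)."""
    tsi_sd = 60.0 - 14.41 * math.log(sd_m)       # Secchi depth in m
    tsi_chl = 9.81 * math.log(chl_ugl) + 30.6    # chlorophyll-a in ug/L
    tsi_tp = 14.42 * math.log(tp_ugl) + 4.15     # total phosphorus in ug/L
    return (tsi_sd + tsi_chl + tsi_tp) / 3.0

def trophic_class(ctsi):
    """Common class breaks: <40 oligotrophic, 40-50 mesotrophic, >50 eutrophic."""
    if ctsi < 40.0:
        return "oligotrophic"
    if ctsi <= 50.0:
        return "mesotrophic"
    return "eutrophic"

def nse(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 - (sum of squared errors) /
    (variance of the observations about their mean)."""
    mean_obs = sum(observed) / len(observed)
    sse = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    var = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - sse / var

# Illustrative values only: SD = 2 m, Chl-a = 5 ug/L, TP = 20 ug/L
ctsi = carlson_tsi(2.0, 5.0, 20.0)
print(round(ctsi, 1), trophic_class(ctsi))
```

An NSE of 1 indicates a perfect match, 0 means the model is no better than the observed mean, and negative values mean it is worse; this is the scale on which the L8-based ANN scores above (0.75-0.98) should be read.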