Abstract

Terrestrial hydrology is altered by fires, particularly in snow-dominated catchments, yet fire impacts on catchment hydrology are often neglected in land surface model (LSM) simulations. Western U.S. wildfire activity has been increasing in recent decades and is projected to continue increasing over at least the next three decades, so it is important to evaluate whether neglecting fire impacts in operational LSMs introduces an error with a noticeable signal among other sources of uncertainty. We evaluate a widely used state-of-the-art LSM (Noah-MP) in simulating runoff and snowpack at two representative fire-affected, snow-dominated catchments in the Pacific Northwest: Andrew's Creek in Washington and Johnson Creek in Idaho. These two catchments are selected from among all western U.S. fire-affected catchments because they are snow-dominated and experienced more than 50% burning in a single fire event, with minimal burning outside of that event, which allows analyses of distinct pre- and post-fire periods. There are statistically significant shifts in model skill from pre- to post-fire years in simulating runoff and snowpack. At both study catchments, the simulations miss enhancements in early-spring runoff and annual runoff efficiency during post-fire years, resulting in persistent underestimates of annual runoff anomalies throughout the 12-year post-fire analysis periods. Enhanced post-fire snow accumulation and melt contribute to the observed but unmodeled increases in spring runoff and annual runoff efficiency at these catchments. Informing the simulations with satellite-observed land cover classifications, leaf area index, and green fraction does not consistently improve the model's ability to simulate hydrologic responses to fire disturbances.