Abstract

The Gauss-Newton method is a simple iterative method used to modify a mathematical model by minimising the least-squares residuals between the modelled response and some observed behaviour. A common issue for parameter identification methods that optimise least-squares residuals is the sporadic occurrence of outliers in the observation data set. This research proposes an amendment to the Gauss-Newton parameter identification approach that limits the influence of outlying data by dissipating their contribution to the objective function that drives the iterations. The modified method was tested in two- and three-dimensional parameter identification exercises using virtual data from the dynamic insulin sensitivity and secretion test (DISST). The data incorporated normally distributed random noise (CV = 3%), either alone or in concert with an outlying data point. The proposed method performed similarly to the original method when no outlying data were included, and identified the model that accurately fit the majority of data points when an outlying data point was present. The proposed approach provides a valuable, operator-independent tool for rejecting outlier data that requires neither multiple stages of analysis nor manual removal of data.
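The abstract does not specify the exact dissipation scheme, so the following is only a minimal sketch of the general idea: a weighted Gauss-Newton update in which each residual's contribution is down-weighted as it grows, here with an assumed Gaussian weight scaled by a robust (MAD-based) estimate of spread. The function names, the weighting function, and the example model are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def gauss_newton_weighted(f, jac, x, y, theta0, n_iter=20):
    """Illustrative Gauss-Newton iteration with residual-based down-weighting.

    f(x, theta)   -> model prediction (array, len(x))
    jac(x, theta) -> Jacobian of f with respect to theta (len(x) x len(theta))
    Large residuals receive exponentially small weights, so an outlying
    point contributes little to each parameter update (assumed scheme).
    """
    theta = np.asarray(theta0, dtype=float)
    for _ in range(n_iter):
        r = y - f(x, theta)                                    # residuals
        s = 1.4826 * np.median(np.abs(r - np.median(r)))       # robust scale (MAD)
        w = np.exp(-0.5 * (r / max(s, 1e-12)) ** 2)            # dissipating weights
        J = jac(x, theta)
        W = np.diag(w)
        # Weighted normal equations: (J^T W J) dtheta = J^T W r
        dtheta = np.linalg.solve(J.T @ W @ J, J.T @ W @ r)
        theta = theta + dtheta
    return theta

# Toy usage: fit y = a * exp(-b * x) to noisy data containing one outlier.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 5.0, 30)
a_true, b_true = 10.0, 0.8
y = a_true * np.exp(-b_true * x) * (1.0 + 0.03 * rng.standard_normal(x.size))
y[12] *= 2.5                                                   # injected outlier

f = lambda x, th: th[0] * np.exp(-th[1] * x)
jac = lambda x, th: np.column_stack([np.exp(-th[1] * x),
                                     -th[0] * x * np.exp(-th[1] * x)])

print(gauss_newton_weighted(f, jac, x, y, theta0=[5.0, 0.5]))
```

With this kind of weighting, the update reduces to the ordinary Gauss-Newton step when all residuals are comparable, and the outlying point is effectively ignored once its residual grows relative to the robust scale estimate.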