Mammen and Yu (MY) give an excellent account of various statistical estimation problems in terms of noisy integral equations of the second kind. In particular, the authors shed new light on the smooth backfitting method for fitting additive models, which offers another way of understanding the existing theory as well as a useful tool for developing new theory for other statistical methods. The smooth backfitting method was first introduced by Mammen, Linton, and Nielsen (1999) as a way of fitting the additive regression model. The method is known to possess very nice theoretical properties, as discussed in Section 4 of MY. Its theoretical successes have recently been extended to several other nonparametric estimation problems, including those mentioned in Sections 6 and 7 of MY.

There are two other main kernel methods for fitting additive models: the marginal integration technique of Linton and Nielsen (1995) and the ordinary backfitting procedure of Buja, Hastie, and Tibshirani (1989). It is generally accepted that, in the case of the additive regression model, the smooth backfitting method is better than the other two, not only in theoretical performance but also in numerical stability; see Nielsen and Sperlich (2005). There are, however, few reported results comparing the performance of the three methods for other regression models.

A particular case that occurs to me, but was not discussed in MY, is the varying coefficient additive model, originally introduced by Hastie and Tibshirani (1993). This model has received much attention recently. It inherits the simplicity and easy interpretation of the classical linear model, yet is intrinsically nonparametric, so it is flexible enough to accommodate various complicated relationships between the response and predictor variables. In this model, the mean regression function is expressed as