In this paper, we present a micromagnetic modeling study of the recording signal-to-noise performance of the heat-assisted magnetic recording (HAMR) process, focusing on the effect of possible grain-to-grain heating variation. The temperature variation is modeled by assuming a small random deviation of the steady-state thermal profile for each grain. We find that grain-to-grain temperature variation can yield significant signal-to-noise ratio degradation, enhancing transition jitter and limiting performance at high linear densities. The degradation is characteristically the same as that caused by grain-to-grain Curie temperature (Tc) variation. Although this makes heating variation hard to distinguish from Tc variation in recording performance testing, a quantitative understanding of grain-to-grain heating variation under practical conditions is important for advancing HAMR as a viable future recording technology.
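As a rough illustration of the mechanism described above (this is not the paper's micromagnetic model; all parameter values here are hypothetical), a small random per-grain scaling of a Gaussian steady-state thermal profile translates directly into down-track jitter in the position where each grain cools through its Curie temperature:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters (illustrative only, not taken from the paper)
T_peak = 800.0      # K, peak temperature of the steady-state thermal profile
T_ambient = 300.0   # K, ambient temperature
fwhm = 60.0         # nm, full width at half maximum of the thermal spot
Tc_mean = 700.0     # K, mean Curie temperature of the grains
sigma_T = 0.01      # 1% fractional grain-to-grain heating variation
n_grains = 10000

# Gaussian width from the FWHM
sigma_x = fwhm / (2.0 * np.sqrt(2.0 * np.log(2.0)))

# Each grain sees the thermal profile scaled by a small random factor,
# modeling grain-to-grain heating variation
heating_factor = 1.0 + sigma_T * rng.standard_normal(n_grains)

# Down-track position where each grain's temperature falls back through Tc:
# solve T_ambient + f*(T_peak - T_ambient)*exp(-x^2 / (2*sigma_x^2)) = Tc_mean
arg = (Tc_mean - T_ambient) / (heating_factor * (T_peak - T_ambient))
x_write = sigma_x * np.sqrt(-2.0 * np.log(arg))

# Spread of the effective write position across grains -> transition jitter
jitter = x_write.std()
print(f"transition jitter from {100 * sigma_T:.0f}% heating variation: "
      f"{jitter:.2f} nm")
```

Even a 1% heating variation produces sub-nanometer jitter in this toy picture, comparable in character to what an equivalent spread in Tc would produce, which is why the two effects are hard to separate in recording tests.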