We consider the problem $\ve^2 \Delta u + (u-a(x))(1-u^2)=0$ in $\Omega$, $\frac{\partial u}{\partial \nu}=0$ on $\partial \Omega$, where $\Omega$ is a smooth, bounded domain in $\R^2$ and $-1 < a(x) < 1$. Assume that $\Gamma = \{ x \in \Omega : a(x)=0 \}$ is a closed, smooth curve contained in $\Omega$ in such a way that $\Omega = \Omega_{+} \cup \Gamma \cup \Omega_{-}$ and $\frac{\partial a}{\partial n} > 0$ on $\Gamma$, where $n$ is the outer normal to $\Omega_{+}$. Fife and Greenlee [Russian Math. Surveys, 29 (1974), pp. 103–131] proved the existence of an interior transition layer solution $u_\ve$ which approaches $-1$ in $\Omega_{-}$ and $+1$ in $\Omega_{+}$, for all $\ve$ sufficiently small. A question open for many years has been whether an interior transition layer solution approaching $+1$ in $\Omega_{-}$ and $-1$ in $\Omega_{+}$ exists. In this paper, we answer this question affirmatively in this two-dimensional setting, provided that $\ve$ is small and stays away from certain critical values. A main difficulty is a resonance phenomenon induced by a large number of small critical eigenvalues of the linearized operator.
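For orientation only (an illustrative remark, standard background rather than part of the statement above): near $\Gamma$, where $a$ vanishes, the transition layer is modeled at leading order, along the normal direction, on the heteroclinic solution of the one-dimensional Allen--Cahn equation,
% illustrative sketch, assumed background (not asserted in the abstract):
% the profile w is the classical heteroclinic connecting the two stable states -1 and +1
\begin{equation*}
  w'' + w(1-w^2) = 0, \qquad w(\pm\infty) = \pm 1, \qquad w(t) = \tanh\!\Big(\tfrac{t}{\sqrt{2}}\Big),
\end{equation*}
so that, heuristically, $u_\ve(x) \approx \pm\, w\big(\operatorname{dist}(x,\Gamma)/\ve\big)$ near $\Gamma$, the choice of sign corresponding to the two possible layer orientations discussed above.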