No matter how well a program is designed, its effectiveness is largely dependent upon how it is implemented (Dusenbury, Brannigan, Hansen, Walsh, & Falco, 2005). Yet evaluations rarely examine how a program is implemented, focusing instead on demonstrating outcome achievement. Program implementation consists of four factors: fidelity (adherence to the curriculum), quality (how a program is delivered by staff), adaptation (how staff might change a program), and participant responsiveness (engagement and fit with the program; Carroll et al., 2007). In this paper, we use a municipal youth recreation program that implemented a mentoring curriculum to illustrate how one recreation program conducted an implementation evaluation. The program used a structured journal as its primary implementation evaluation tool. Recreation program staff used the journal to track adherence to the mentoring curriculum (fidelity), identify staff facilitation strategies (quality), describe changes made to the mentoring curriculum (adaptation), and record participant attendance and engagement (responsiveness). The mentoring sessions involved 29 youth, and three staff members completed 232 structured journals over the 8-week program. Responses were analyzed quantitatively and qualitatively. Fidelity scores indicated a high level of curricular adherence (98%). Quality, defined by the type of facilitation strategy used, varied significantly among staff. Staff made several adaptations to the curriculum, which were important in highlighting areas for improvement. The program was well attended, and participants reported moderate levels of satisfaction with the program. Collectively, these data illustrate the importance of a broader approach to evaluation, one that goes beyond simply identifying whether outcomes were achieved to examining how a program was implemented in order to understand its effectiveness.