Background
Providing learners with standards, which represent correct solutions to a task, is a promising means to support self-assessment. To date, however, the evidence in support of standards stems mainly from studies that used tasks for which correct solutions are very similar to each other in terms of both structural and surface features. By contrast, for tasks for which correct solutions are similar to each other only in terms of structural features, empirical evidence is scarce.

Aims
The goal of the present study was to investigate the effects of standards in tasks for which correct solutions are similar to each other only in terms of structural features.

Samples
Participants were university students (Experiment 1: N = 139; Experiment 2: N = 170).

Methods
Using the task of generating examples that illustrate previously encountered declarative concepts, we varied whether learners, in self-assessing the quality of their examples, received standards (with vs. without) and structural comparison support (with vs. without), which helped them process the critical structural features of the standards and of their own examples.

Results
Standards increased self-assessment accuracy and performance. The effects on self-assessment accuracy were further enhanced by additionally providing structural comparison support.

Conclusions
We conclude that standards are a promising means to support self-assessment accuracy in tasks for which correct solutions are similar to each other only in terms of structural features. However, their effectiveness depends on the degree to which learners compare their products with the standards in terms of structural features, which needs to be supported.