Since last October, we have published three papers describing the outcomes of the Delphi exercise to decide the content of the curriculum for each of five advanced training modules in pharmaceutical medicine [1-3]. The final paper [4], dealing with the sixth module and with course delivery and assessment, appears in this issue (see page 105). At the outset, we explained the Delphi exercise [5], and now we can judge what it achieved.

A panel of correspondents for each module was drawn from Fellows, Members and Associates of the Faculty and others. The six panels were more or less the same size (range 27-35), and the number who replied, some after several reminders, was acceptable after phase 1 (range 16-27) but less so after phase 2 (range 10-21). The panel for healthcare marketplace was the most responsive and the one for clinical development the least so.

Each panel was asked, in phase 1, to accept or reject a number of knowledge and skill statements and, if they so wished, to suggest other statements; a small percentage of additional statements was proposed. In phase 2, panellists were requested to put each remaining statement into one of five categories: definitely necessary, important, worth considering, not important or definitely unnecessary. Weighting factors (10, 8, 6, 4 and 2), corresponding to the rank order of the five categories, were applied, and the average weighting from all panellists was calculated for each statement. As a result, a large number of statements (n = 571) fell outside the top two categories. The number of statements accepted at the conclusion of the two phases (n = 364) was just over one third.

Over the six modules, the range for the number of accepted statements was broad (29-87). Clinical pharmacology had the highest number and medicines regulation the lowest. The percentage loss of statements over phases 1 and 2 was greatest for statistics and data management (77%) and lowest for clinical development (45%).
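The phase 2 scoring described above can be illustrated with a minimal sketch. The category names and weights (10, 8, 6, 4, 2) are taken from the exercise itself; the retention cutoff of 8 (the weight of "important", i.e. the top two categories) is an assumption for illustration, since the exact decision rule applied by the Faculty is not spelled out here.

```python
# Weights for the five Delphi response categories, as used in phase 2.
WEIGHTS = {
    "definitely necessary": 10,
    "important": 8,
    "worth considering": 6,
    "not important": 4,
    "definitely unnecessary": 2,
}

def average_weighting(responses):
    """Mean weighted score for one statement across all panellists."""
    return sum(WEIGHTS[r] for r in responses) / len(responses)

def retain(responses, cutoff=8.0):
    """Hypothetical retention rule: keep a statement if its average
    weighting reaches the cutoff (here, the 'important' weight)."""
    return average_weighting(responses) >= cutoff

# Example: three panellists rating one competency statement.
ratings = ["definitely necessary", "important", "important"]
print(average_weighting(ratings))  # 8.666...
print(retain(ratings))             # True
```

Under this rule, a statement rated "worth considering" or lower by most panellists would fall outside the top two categories and be dropped, which is how the 571 statements were lost.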
These figures may appear worse than they really are, because each statement was presented at three levels of competency: being able to perform the task alone and unaided, doing so as part of a team, or simply having an understanding of the principles. Even so, the Delphi exercise did demonstrate differences in how the iterative process worked for one module compared with another.

The draft guidelines, issued in 1997 by the Faculty, provided a skeleton curriculum for each module [6]. For three modules (clinical pharmacology, drug safety surveillance, statistics and data management) there were simply six or seven headings; for the other three, each heading was expanded by several short statements. The Delphi exercise has now provided a varying number of competency statements for each module: healthcare marketplace (83), clinical pharmacology (87), medicines regulation (29), drug safety surveillance (61), statistics and data management (39) and clinical development (65). Thus, 364 statements are now assembled, but there is slight duplication from one module to another and, in a few instances, the competency is one that should be dealt with in basic training and assessed by the Diploma examination, so some editing will be done.

Once these six curricula are approved, hopefully in the next few weeks, the proposed training, the potential providers of training and the methods of assessment can be finalised. The paper in this issue also surveys course delivery and assessment: 190 correspondents were invited to identify what they felt was appropriate, 67% replied, and 18 of the 30 statements were accepted. These two aspects will form the focus of the Faculty's deliberations in coming weeks. The Delphi exercise took time, roughly a year, and required money and people to run it.
Among the latter, the staff at Keele University applied themselves diligently and thoroughly, the advisers and the correspondents offered their help willingly, and the Faculty officers, staff and various working party members have committed many hours to it. Few medical specialities have been so original in deciding the curricula for training of their future specialists. It must rank as one of the biggest and most ambitious consultation exercises ever mounted in defining a higher education programme.