Following a Europe-wide survey, a model RDA was defined by Halkier and Danson (1997, p.245) as a body which, first, organisationally is in a semi-autonomous position vis-à-vis its sponsoring political authority; second, supports mainly indigenous firms by means of ‘soft’ policy instruments; and, third, is a multifunctional and integrated agency, the level of which may be determined by the range of policy instruments it uses. The breakdown of this model in the period since then (Danson and Helinska-Hughes, 2003, 2004) raises important issues for the dissemination of good practice. By contrast, leadership qualities, good practice, efficiency and effectiveness may have come to be associated with the smaller, focused EDAs (economic development agencies) rather than with large traditional institutions such as Scottish Enterprise (SE) and the Irish agencies. Regional development agencies (RDAs) are at the forefront of those who place strong weight on the role of knowledge, knowledge transfer and the knowledge economy in economic development. To promote such an approach, strategies have evolved which seek to recognise and build upon the idea of the ‘learning region’ (Cooke and Morgan, 2000). Attention is paid, therefore, to ensuring there is appropriate institutional capacity and thickness to support this corporate learning (Wong, 1999). These concepts are themselves believed to be underpinned by trust and cooperation between economic actors; in turn, these are established and nurtured through networking and partnership working (Danson and Whittam, 1999; Danson and Cameron, 2000). This article reports on two surveys conducted to determine which agencies are considered to be ‘model’ and leading players: the organisations from which others learn. The article concludes that EDAs and experts tend to benchmark against a set of similar specialist agencies rather than against the traditional leading model agency.
In this context:

• many EDAs lack benchmarking experience;
• there is often little awareness of what other agencies are doing;
• as a result, there is no industry standard against which to gauge the performance of SE and other EDAs;
• there is uncertainty over which performance indicators are appropriate; and
• the varying social, economic and political contexts and environments within which different agencies operate make it difficult for agencies and commentators to benchmark at all.