Abstract

I would like to make a few comments on Professor Rutter's review of The National Evaluation of Sure Start that appeared in Child and Adolescent Mental Health, Vol. 11, No. 3.

First, he makes a fundamental mistake when he claims that ‘Sure Start Local Programmes (SSLPs) do not have a prescribed curriculum, it being left up to each area to decide for itself what it wished to do’. This is simply untrue. There was a very clear ‘curriculum’ based on clear (and numerous) targets. The rationale, and I would say the strength, of SSLPs was that they were allowed to take locally appropriate action, but against a fairly rigorous set of Public Service Agreements (PSAs) and Service Delivery Agreements (SDAs). The PSAs and SDAs were quite specific and covered a wide range of outcomes, from the more global ‘One hundred per cent of families with newborn babies living in a Sure Start area to have been visited by a local programme within the first two months of their child's life’ to the quite specific ‘Achieve by 2005–6 a six percentage point reduction in the proportion of mothers who continue to smoke during pregnancy’. How each programme met these PSAs and SDAs was up to its board and management. Rutter is right to describe this as a way of avoiding ‘mechanised rigidity’, but only in the process; the government's targets were exacting.

Second, the SSLPs were not required to provide ‘five core services’. These were simply descriptors for areas of activity that could be used as general guidance in overall delivery plans. Most services crossed several, and often all, of the five descriptors.

I do share Rutter's bewilderment at the evaluation methods, and indeed his disquiet at the vast costs involved. More fundamentally, the idea that the effect of such programmes could be measured so quickly is ridiculous. Breaking cycles of deprivation or changing attitudes to education, for example, will take decades, as the Head Start programme in America has demonstrated.
I have seen beneficial change created by my SSLP, but any truly robust data on the outcomes will take many years to emerge. Rutter feels that the ‘3 years since the implementation of Sure Start ought to have been long enough for the benefits to be evident’, a statement I find surprising from a social scientist and one that does not take account of reality. This was not a three-year period mid-term in a continuing service. Sure Starts were notoriously slow to get up and running, particularly in the first three waves, when people were still learning the ropes and creating new models of shared working. My Sure Start was in wave 6, so I was able to learn from the previous five waves, and yet the first serious service delivery only began a year after the project was given the go-ahead; even so, mine was seen as an exceptionally quick starter. One telling example: the appointment of a mental health worker for our programme by our local mental health trust took three years.

Not mentioned in the paper was the vast amount of local evaluation that helped inform practice and was shared nationally. This has created a body of evidence that could be a vast resource, and I hope it will not be lost behind the headlines of the national evaluation.

I would point to three great successes of my Sure Start programme that Rutter does not sufficiently acknowledge. First, the degree of user involvement built into the model; for example, a third of the board was reserved for parents. SSLPs often had great local involvement and ownership. Second, the extraordinary success of shared working, where librarians would work with dieticians, midwives with social workers, and so on, to create services and activities that offered a more holistic approach to the consumer. Third, the creation of a non-establishment model that encouraged access to services through a range of gateway activities at family-friendly locations.
This meant that we were able to reach families normally unwilling to use universal or targeted services; in this SSLP almost 90% of local families are registered with us. Those of us working in Sure Start Local Programmes were disheartened by the early results of the National Evaluation. Fortunately, our day-to-day experiences tend to offer reassurance that what we have done has been worthwhile. Professor Rutter would be welcome to visit my programme and see for himself.
