This article originally appeared as a blogpost on the Adam Smith Institute website and is reproduced with their kind permission. All views herein are those of the writer, Tim Ambler, and do not necessarily reflect those of IMPOWER.
Obviously, all those wonderful people who provide Adult Social Care, and their clients, care about it; but as one moves up the management hierarchy, the question becomes more difficult. For a start, no one in Whitehall has much responsibility for Adult Social Care. According to the King’s Fund, “Spending on adult social care services by [English] local authorities fell from £18.4 billion in 2009/10 to just under £17 billion in 2015/16, a real-terms cut of 8 per cent”, although that is being topped up with some funding from the NHS. The NHS overlaps with Adult Social Care, notably for the elderly and those with mental health problems, the two fastest-growing problems of our time. Better management of this overlap would make the NHS more cost-effective and better focused. Yet Cabinet Ministers rarely discuss Adult Social Care. The Chancellor, in his hour-long autumn 2017 budget statement, made no mention of it.
Local authorities carry the can, but they are handicapped by knowing neither what funding will be available from year to year nor the value for money they are providing. The Care Quality Commission (CQC), remarkably, evaluates individual homes but not the service provided by any local authority as a whole. A local authority cannot improve its offering if it does not know its position in the league table, what it is doing wrong, or how other authorities are doing things better. Still less can it see the total national picture.
If we could agree the outcomes expected from each LAASC (Local Authority Adult Social Care department), and measure and publish that performance from year to year, the less successful could learn from those getting better results and/or better value for money. School league tables may be controversial, but at least they identify where best practice may be found. Dividing outcomes by the annual costs of each LAASC, both as a whole and per client, would give indications of productivity. Such measures will inevitably be crude, but even so the apparently most productive should provide insights for those with less attractive indications.
IMPOWER, a consultancy working exclusively within the public sector, has taken some bold steps in this direction. For 150 LAASCs they measured costs and performance in three “domains”: older people’s services, all-age disability, and the health and social care interface. The second domain’s metrics are muddied by children’s social care and the third by some NHS costs, but they have had to make the best of the metrics available. If this broad approach were adopted nationally, then more precise outcomes and measures of those outcomes could be agreed and synchronised, replacing the current data reporting so that no net additional bureaucracy would be needed.
For each domain, IMPOWER selected six to ten outcome measures from the Adult Social Care Outcomes Framework, and then set those against published expenditure to calculate productivity scores (using the most recent nationally available data, from 2016/17 budgets and outcomes). This provided an LAASC productivity ranking for each domain. Each overall LAASC rank was the mean of its three domain rank positions.
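The mechanics of that ranking can be sketched in a few lines of code. This is only an illustration of the method as described above, not IMPOWER’s actual calculation: the combination of outcome measures, the weighting, and all names and figures below are assumptions made for the example.

```python
# Illustrative sketch of the ranking method described above: outcomes set
# against expenditure per domain, ranked within each domain, then an
# overall rank taken as the mean of the three domain rank positions.
# All data and the scoring formula are hypothetical.

def productivity_score(outcome_scores, expenditure):
    """One simple reading of 'dividing outcomes by costs': sum the
    outcome measures and divide by spend. The real weighting is unknown."""
    return sum(outcome_scores) / expenditure

def rank_laascs(domain_data):
    """domain_data maps LAASC name -> domain -> (outcome_scores, expenditure).
    Returns LAASC names ordered by overall rank (best first)."""
    laascs = list(domain_data)
    domains = {d for per_laasc in domain_data.values() for d in per_laasc}
    domain_ranks = {name: {} for name in laascs}
    for d in domains:
        # Within each domain, the highest productivity score gets rank 1.
        ordered = sorted(
            laascs,
            key=lambda name: productivity_score(*domain_data[name][d]),
            reverse=True,
        )
        for position, name in enumerate(ordered, start=1):
            domain_ranks[name][d] = position
    # Overall rank position: mean of the domain rank positions.
    overall = {name: sum(r.values()) / len(r) for name, r in domain_ranks.items()}
    return sorted(laascs, key=lambda name: overall[name])

# Toy data: two fictional LAASCs across the three domains.
data = {
    "Anytown": {"older_people": ([0.8, 0.7], 10.0),
                "disability":   ([0.6], 5.0),
                "interface":    ([0.9], 8.0)},
    "Otherby": {"older_people": ([0.5, 0.6], 10.0),
                "disability":   ([0.8], 5.0),
                "interface":    ([0.7], 8.0)},
}
print(rank_laascs(data))  # Anytown leads in two domains, so ranks first
```

Averaging rank positions, rather than the underlying scores, means a narrow win in one domain counts the same as a runaway win; that is one of the methodological choices open to debate.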
Clearly there is plenty of room for debating the outcomes, their measurement, and the methodology for integrating them into useful performance and productivity indices. IMPOWER should be congratulated on getting the show on the road and encouraging LAASCs to improve what they have done. Motivation is key, and therefore IMPOWER have elected to publish only the names of the ten most highly ranked LAASCs. Every LAASC will be told, if it asks, what its ranking was and how that was calculated, but not how the other 139 fared.
To be effective, this approach will have to be totally transparent. Every LAASC should want to know how comparable LAASCs scored, but the experience of introducing school league tables shows that such systems need to be introduced slowly and gently.
Reducing outcomes to single productivity indices is likely to be an over-simplification too far. An LAASC may be great for older people but hopeless on mental health. Therein lies much of the benefit from this type of analysis: highlighting the particular areas for improvement. The usual convention is to have a “dashboard” of the key indicators. These are already used by children’s services departments, and an agreed national standard set for LAASCs, monitored by the CQC, would serve this sector far better, and at much less cost, than a management hierarchy such as that in the NHS and the Department of Health.