I’m always fascinated to see how we humans like to embrace shiny new things. Historically, the UK has always been an early adopter of new inventions, technologies and ideas. World trade, the industrial revolution, the Internet and a national health service are all good examples of our country’s place at the forefront of innovation and development.
The latest buzz is around Artificial Intelligence (AI). The Government’s industrial strategy is encouraging significant investment in this area, and our fascination with data analysis is greater than ever. As far as public services are concerned, the opportunity for making use of new technology is huge, particularly for improving productivity (not just efficiency, but quality of outcomes). According to the innovation foundation Nesta, “UK local government is currently the Petri dish in which some of the most talked about new innovations are being tested, refined and scaled.”
However, before we ‘boldly go where no person has gone before’ in the search of enlightenment, there is one small thing that we mustn’t overlook: Human Intelligence (HI).
Over recent years, in response to austerity, there has been a huge reduction in the resources available within local public services to analyse and interpret trends and developments, and to devise new solutions.
As a result, it has become easier for councils to be hoodwinked by flashy presentations about the stories that the data seems to be telling us, and perhaps to be persuaded to devise strategies on the back of smartly-presented calculations about savings potential. This is not a new phenomenon – think of the Enterprise Resource Planning systems which councils invested in some 15 years ago. Many authorities were led to believe that ERPs would provide the mechanism for advanced and sophisticated planning that would enhance performance and productivity. But in most cases, the reality is that ERPs have turned out to be little more than a very expensive accounting platform.
There is also a tendency for the sector not to stop and ask why this is happening and why the system is behaving as it does. When there is insufficient examination of cause and effect, and an inclination towards treatment over diagnosis, the result can be waste, lost productivity and substandard services.
Delayed Transfers of Care (DTOC) – where people remain in hospital despite no longer requiring any clinical care – provide a good example. Targets for improvement have been set by central Government, and there is plenty of analysis of who is stuck where, how discharges could be sped up, how reablement could be made more efficient, what additional capacity is needed, and so on.
But the issues are not being investigated properly. If DTOC were a police drama, you might say that no one is looking for the motive.
Our recent research with health and care professionals shows that at least 50% of older adults who end up in A&E don’t need to be there. The question we are therefore helping our clients to answer is what they need to do differently in order to prevent that demand arising in the first place.
We are able to do this because we are working with individuals who want to ask the right questions, who are using their experience to explain what the data tells them, and who are able to reframe the problem in order to find a sustainable solution.
It is true that this approach probably requires more thinking up front, and a deeper understanding of the behaviour of the system – but in the longer term it is the best way to devise solutions that increase productivity and reduce costs.
It is not clear where in the public sector the investment in people is coming from – where analysts and practitioners will develop their experience, and where they will learn the intuition to ask the right questions.
New technology will no doubt support the development of public services, and in fact it is needed quickly. But local authorities should not let this change be technology-led; the sector should let HI determine how we use AI, and not the other way around.
We already know some of the potential negative uses of AI – for example, curating the news we read based on our past preferences, or suggesting which people to follow on Twitter in ways that reinforce our world view, rather than providing us with a balanced picture and challenging our assumptions.
So a plea to everyone investing in AI – don’t forget that HI is equally important.