
The King's Fund - Implementation and scaling of AI in health and social care

Introduction 

The NHS 10 Year Health Plan includes an aspiration for AI to be seamlessly integrated into most clinical pathways and for generative AI to be widely adopted. For this to be achieved, the NHS needs to be able to implement and scale AI tools that address system priorities and patient needs. 

AI has the potential to transform services. However, there is a risk that only the most digitally advanced providers will be able to use AI and so not all patients and members of the public will benefit. It’s important therefore that providers can scale and spread the adoption of AI technologies. 

Most health care AI tools being developed and piloted are not at the technical frontier of AI development – programmers are rarely pushing technical boundaries and software engineers are unlikely to be designing models that have never been seen before. However, the application and implementation of generative AI is new, which brings fresh challenges and different constraints when it comes to workflow, integration and information governance – for example, safety and privacy considerations, biases, cyber security, etc. 

AI requires the correct technical and social infrastructure to be in place. In our long read Infrastructure for innovation: getting the NHS and social care ready for AI, we set out what this infrastructure entails. 

In this long read we explore how different parts of the health and social care system are approaching implementation, scale and translation – from research and innovation to clinical practice. It’s based on conversations with a broad range of people, including AI specialists, GPs, dentists, researchers and innovators. Our discussion focused on ‘How are staff and innovators implementing and scaling AI?’. We’ve included some of the responses throughout this long read.  

Start with building momentum for change 

To move towards the use of more sophisticated tools such as AI, it’s important to have a culture that is willing to explore and embrace digital transformation. Health and care has made ongoing investments in digitalisation, including online triage in GP practices, electronic patient records in hospitals and social care, digitalisation of pathology, and imaging networks. However, underinvestment in training, education and changing workflows and processes has left many staff feeling disenchanted with technology. Both the 10 Year Health Plan and the Darzi review acknowledge the frustration staff have regarding technology in the NHS. 

The Darzi review was clear – many staff find technology more of a burden than a solution. Staff can be reluctant to engage with AI tools or feel concerned that they will add complexity and burden to already busy workloads. Digital leaders and champions can help shift this perspective by focusing on building skills and training to increase digital confidence. Leaders need to recognise and understand staff frustrations and develop a culture of enthusiasm for technology in the workforce. But for this to happen, staff need the time and space to engage with technology, and this must be prioritised by leadership.

The trust recently went through their EPR (electronic patient records) digitisation, moving from paper about a year ago. So there's been a lot of energy and investment in overall digital transformation for the trust, which has been led well and created a positive culture for change.
Specialty registrar

Staff can struggle to find time to learn about and engage with innovation when the priority is providing care to patients and reducing waiting lists. National policy often makes both delivery of services and innovation a high priority, but there’s an unacknowledged tension between the two and staff time for innovation needs to be protected. 

If all your clinicians are doing patient care, you have to accept that there’s other things that then aren’t going to happen. Your staff haven’t got time to do all of the service improvement and digital transformation projects as well as all the patient care you’re asking them to do. 
AHP digital lead

If staff are not involved in the development and implementation of AI, they will have limited knowledge, understanding and skills on how to use it and how tools should be managed. This results in a reluctance to use AI and lower than expected impact and benefits. 

We learned from previous projects that if you just hire IT people to implement a solution it doesn’t work. So you need to have clinicians and IT people, and in this case, patient safety team members, who are doing the day-to-day work involved right from the get-go to understand the specific aims, goals, outcomes and processes they need. 
Specialty registrar 

Recommendations 

  • Dedicate and protect staff time so that staff can engage in transformation and training, for example as clinical champions or through peer networks. 

  • Take a continuous improvement approach with staff, listening to and engaging with them to understand their concerns, worries and fears, and act on these to improve technical functionality and evolve workplace culture. 

Scaling in partnership and clusters

Across national, regional and integrated care board areas there is significant variation between providers, including different levels of digital maturity, digital confidence and staff skills, as well as differing levels of risk appetite and technical capabilities. Despite this, shared challenges and aims are helping build cross-provider collaborations. Groups of organisations – for example, a group of ambulance trusts in Yorkshire and a group of hospital trusts in Northamptonshire – are working in partnership to sift AI opportunities, develop tools and scale AI. These collaborations share funding, learning and resources to filter and select AI opportunities that can resolve shared problems. 

These partnerships can also share governance structures and peer support networks. They have established innovation boards, which help to sift AI suppliers by matching them to shared challenges. This helps to make the increasing number of suppliers manageable, but it’s a fine balance between a robust decision process and adding further delays to implementation, as the NHS is often criticised for being slow to adopt innovation. Staff therefore need to ensure that checks happen once, at the most appropriate point in the development and implementation cycle. 

Those organisations that are more digitally advanced and have larger technical workforces can take the lead in developing and/or testing solutions that the partnership has identified as important. They then share their learnings to help other organisations navigate AI implementation. This approach enables providers to share costs and reduces duplication. The innovation board monitors innovations that have been piloted and tested within the cluster. If the pilot is successful, the board then recommends these solutions to others in the group. 

You need a set of organisations with people who are ready to just try things. The next set of organisations are interested when we start to do validation. They’re asking, ‘Did it work before?’ But almost every organisation we speak to after that says, ‘If you've done it, we'll do it too and we were going to spend £100K on doing something this year so we might as well, like, wait and see what you do.’ 
Digital transformation director 

Partnerships enable the development of peer support networks to help staff navigate unfamiliar situations. They bring together networks of staff responsible for clinical safety, information governance, data protection and cyber security, helping them to develop a shared understanding on safety, data protection and governance, which in turn helps providers to scale tools more quickly. 

By working together in clusters, organisations can share their resources and data and improve the quality of information available. This partnership approach also helps to improve AI tools as they are tested on more varied data, increasing the likelihood that the solutions will be more robust and adaptable to different health care settings and patient needs. 

We’re now building these networks to try and get video data for lots of centres and over much longer periods of time, so I think we’ll soon reach a point where there will be more videos to train a model than any person could have done in a career. 
Neurosurgeon 

Recommendations 

  • The new region model should enable provider leaders to form networks with similar organisations to lead or support AI development and implementation. 

  • Regions should encourage and facilitate the development of peer support networks to aid staff development and scaling of tools. Peer groups for clinical safety, information governance and cyber security are critical enablers to scale AI innovations. 

  • Parts of the system with little innovation infrastructure, such as general practice and social care, need investment and capacity building to support this way of working. 

Scaling in light of system variability 

Across the NHS and social care, digital technologies, workforce digital skills and confidence can vary greatly from organisation to organisation. This variability affects the effectiveness of AI tools and makes evaluations difficult to compare or transfer across providers. 

All these different levels of digital maturity impact on capability, for example in London you’ve got some very technically capable trusts and if you go to other parts of the country like Cornwall, where we’re forever told that they’ve got relatively immature systems, the impact is not the same. It’s not a level playing field at all. 
Healthcare technology assessor 

Differences across organisations can mean a transformative AI tool for one organisation is unsuitable for another – for example, one interviewee explained how they don’t have a problem with a shortage of radiologists so are not prioritising AI solutions around radiologist workforce shortages. Another interviewee shared how pathology services demonstrate the variability through the different processes used to stain and prepare tissue, which changes image quality and affects AI efficacy: 

There’s a lot of manual process [in pathology] for samples, such as how you put them into a fixation, how you stain them, but also how you scan them. All of this creates a change in the data and there’s no standardisation on the scanners. So that means needing to take into account changing colours, morphing, flipping, switching and gathering data from multiple sites, different places. 
Pathology AI supplier 
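The kind of variation the supplier describes is often handled during model training with image augmentation. The snippet below is a minimal sketch, assuming a generic torchvision pipeline with placeholder parameters rather than any supplier’s actual approach: colour jitter stands in for stain and scanner differences, while flips and rotations cover the geometric variation mentioned above.

```python
# Illustrative sketch only: a generic augmentation pipeline of the kind used to
# make models robust to stain and scanner variation between sites. The specific
# transforms and parameters are assumptions, not a real supplier's pipeline.
from PIL import Image
from torchvision import transforms

augment = transforms.Compose([
    # Simulate stain/colour differences between labs and scanners
    transforms.ColorJitter(brightness=0.2, contrast=0.2, saturation=0.2, hue=0.05),
    # Simulate flipping and small geometric variation in how tissue is scanned
    transforms.RandomHorizontalFlip(p=0.5),
    transforms.RandomVerticalFlip(p=0.5),
    transforms.RandomRotation(degrees=15),
    transforms.ToTensor(),
])

# A placeholder tile standing in for a scanned pathology slide patch.
tile = Image.new("RGB", (256, 256), color=(200, 150, 180))
augmented = augment(tile)
print(augmented.shape)  # torch.Size([3, 256, 256])
```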

This variation across organisations and regions can make it difficult to determine which tools are useful and whether it’s possible to scale them. However, some factors are outside a provider’s control. For example, some emerging AI tools may require changes to staff roles or a different mix of roles. If there is insufficient workforce, funding or ability to recruit, the AI tool may not be usable in that service. 

When seeking to scale solutions and learn from other providers, staff need to be able to understand and test for the technical, workflow and staffing dependencies needed to use the tools. The tools may also be customisable, which can help to meet the needs of the organisation. 

Recommendations 

  • As part of the improvement capability of the region model, there should be investment in quality improvement capabilities to assess how variability across providers can be minimised to enable scaling of AI. 

  • There needs to be investment in staff time, capacity and funding so that clusters of organisations can streamline scaling AI. 

Local decisions to adapt solutions 

As with any diagnostic tool, AI has limits to its sensitivity and specificity (ie, how accurate the tool is), meaning that a proportion of results will be incorrect: false positives and false negatives. Some sites tune the decision threshold of their AI tools to match the capacity and workflow of a department. For example, a department may intentionally lower the threshold so that fewer cases are missed, accepting that there will be more false positives for staff to check manually. This relies on there being sufficient workforce capacity to check those additional false positives, and it builds in tolerance and risk management. 

Site 1 made the decision to set the thresholds of false positive predictions based on the capacity of the radiologists to report those images. Whereas Site 2 has made the decision that we are not going to adjust the thresholds to reduce false positives and instead accept that radiologists will have a relatively high number of false positives that they will be able to dismiss when they look at the image. But this is a decision on capacity and the tolerance of technology being deployed in impacting workflow. 
Clinical director of innovation 
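The trade-off behind these local decisions can be illustrated in a few lines of code. The sketch below is purely hypothetical (made-up scores, outcomes and thresholds, not drawn from any real tool or deployment): lowering the threshold catches more true positives but flags more false positives for staff to dismiss, so the right setting depends on reporting capacity.

```python
# Illustrative sketch only: hypothetical model scores and thresholds,
# not taken from any specific NHS deployment or supplier.

def flag_cases(scores, threshold):
    """Return the indices of cases the tool would flag for human review."""
    return [i for i, s in enumerate(scores) if s >= threshold]

def review_workload(scores, labels, threshold):
    """Summarise how a threshold trades missed cases against false positives."""
    flagged = set(flag_cases(scores, threshold))
    positives = {i for i, y in enumerate(labels) if y == 1}
    return {
        "threshold": threshold,
        "flagged_for_review": len(flagged),            # staff time needed to check
        "true_positives": len(flagged & positives),
        "false_positives": len(flagged - positives),   # extra checks the department absorbs
        "missed_cases": len(positives - flagged),      # false negatives
    }

if __name__ == "__main__":
    # Hypothetical scores from an image-analysis tool and the true outcomes.
    scores = [0.95, 0.80, 0.65, 0.55, 0.40, 0.30, 0.20, 0.10]
    labels = [1,    1,    0,    1,    0,    0,    1,    0]

    # A lower threshold catches more true cases but sends more
    # false positives to clinicians to dismiss.
    for threshold in (0.7, 0.5, 0.3):
        print(review_workload(scores, labels, threshold))
```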

Adjusting the thresholds and performance parameters of an AI tool can affect staff workloads. That’s why it’s important for leaders to understand the implications and take a cross-organisational approach to agreeing how thresholds are set, managed and accounted for. For example, an AI tool that audits the consistency of the ambulance call process may enable a move from auditing a sample of calls to auditing all calls. But if the AI incorrectly flags a high number of deviations from the process, more staff time will be needed to check these manually than when only a sample of calls was audited – increasing the staff resources required. 

If AI implementation focuses on a single task, it is less likely to deliver its potential benefits and can have unintended consequences. For example, an AI tool that carries out chest image analysis can rapidly process a large number of scans, creating a sudden increase in the workload for clinical oversight, or in demand for the next contact point with a GP or consultant. Instead of single task-based implementation, AI needs to be considered as part of a pathway, recognising where staff time can be reassigned elsewhere to mitigate pressures. This also enables greater benefits – for example, by linking AI systems that can analyse images to those that can schedule appointments and tests, releasing more staff time. 

Recommendations 

  • Providers need to develop processes to decide on thresholds and specificity of tools to match organisational needs and workforce capacity. 

  • Regions, integrated care systems and providers need to work together to develop regional approaches for cross-organisation transformation to ensure the use of AI improves patient care across the entire interaction not just at one touch point. 

Summary 

Based on conversations with dozens of AI and digital experts – including clinicians, service leaders, researchers and innovators – it is clear that health and care services need to put as much energy and priority into helping staff become more confident with AI as they do into the technical infrastructure. 

Patients are often frustrated by their experience of digital services and notice the fragmentation, as different services are provided by different providers. Slow implementation of AI and the limited spread of AI tools could exacerbate these differences, especially where AI tools are transforming services and releasing staff time. 

To ensure widescale benefits, the NHS needs to work in partnership across providers to share priorities and develop AI tools that work across organisations. There also needs to be the capacity and capability to optimise AI tools, and the processes and workforce skills to tailor their use to local service requirements. Leaders, and the emerging Integrated Care Boards and DHSC Regions, have a responsibility to work collaboratively to identify shared challenges where AI can improve care outcomes. This requires good cross-organisation working, supporting staff and enabling scalability.

Original article link: https://www.kingsfund.org.uk/insight-and-analysis/long-reads/implementation-scaling-ai-health-care
