You could have the most accurate AI in the world, but it’s useless if people can’t use it without wanting to break something.
A team of South Australian researchers has developed a new framework to assess whether AI tools used in clinical settings are understandable, adaptable and actually helpful for doctors and their patients.
In recent years, emergency departments throughout South Australian hospitals have had access to RAPIDx AI, a support tool that helps doctors and nurses quickly and accurately diagnose a range of cardiac conditions through rapid analysis of clinical and biochemical data.
And while the technical potential of the AI tool has been proven, questions still remain over its usability in dynamic healthcare environments such as the ED.
Now, a team of Flinders University researchers, led by Dr Maria Alejandra Pinero de Plaza (PhD), a scientist whose work focuses on facilitating healthy living and better public health services, has put RAPIDx AI under the microscope to determine just how helpful it is in diagnosing heart issues.
“AI is becoming more common in healthcare, but it doesn’t always fit in smoothly with the vital work of our doctors and nurses,” said Dr Pinero de Plaza.
“We need to confirm these systems are trustworthy and work consistently for everyone, ensuring they are able to support medical teams rather than slowing them down.
“In order to understand if the AI systems are viable, we look at how easy they are to use, how well doctors and nurses adopt them, and how they impact patient care.”
To do this, Dr Pinero de Plaza and her team developed PROLIFERATE_AI, a human-centred evaluation tool that combines AI with researcher analysis to assess just how well tools such as RAPIDx AI work in hospitals.
The framework was developed to assess five areas regarding the integration of RAPIDx AI into clinical workflows. These areas included comprehension (how well different hospital staff understood and engaged with RAPIDx AI), motivations (what made staff start and continue to use RAPIDx AI), barriers, and optimisation strategies.
After collecting data from 20 clinical staff working in the ED, a mixture of ED consultants and registrars, residents and interns, and registered nurses, the PROLIFERATE_AI framework revealed that while the more experienced clinicians showed high comprehension and engagement with RAPIDx AI, staff with less clinical experience found it more challenging to use.
Consequently, the framework also highlighted the need for targeted training and better workflow-aligned interfaces to improve the tool's uptake and usability among residents, interns and other new users.
It recommended hospitals consider using gamified training modules and other on-demand support tools for less experienced users, as well as simplifying the user interface and pairing inexperienced users with “super-users” for further hands-on guidance and confidence building.
“What sets PROLIFERATE_AI apart is its ability to provide actionable insights,” said Dr Pinero de Plaza.
“We want to set a new standard for AI implementation, fundamental care and evaluation standards… Our goal is to create AI solutions that empower doctors and nurses, not replace them.”
The Flinders University team recently received $5000 in funding from the CSIRO to assist with the ongoing refinement of the predictive modelling and implementation of the PROLIFERATE_AI tool.