The COVID-19 pandemic accelerated the adoption of digital solutions across NHS trusts and GP practices.
From video and telephone consultations to remote vital signs monitoring, innovative technologies are helping to revolutionise the way healthcare services are delivered.
In this article we look at one of the lesser-known technologies, but one which could significantly improve the doctor/patient relationship, free up clinicians’ time, and deliver widespread efficiencies: speech recognition.
Dr Simon Wallace, chief clinical information officer at Nuance, creator of the Dragon Medical One speech recognition solution, explains: “Even before the COVID-19 pandemic there were real challenges with clinical documentation.
“A survey we did with HIMSS showed that 85% of respondents within the medical community felt that administration tasks and documentation added to the burden of stress and burnout.
“Among NHS staff, welfare has not been great, and that has been further exacerbated by the pandemic. So, it now needs to be about looking at how technology can help against this background of an overwhelmed workforce and the challenge of capturing clinical documentation.”
Capturing information
The COVID-19 pandemic highlighted the importance of capturing accurate and timely information to provide the best-possible care.
But traditional methods of capturing and collating this information often take up valuable time, with clinicians having to manually type clinical information into the electronic patient record.
Dr Wallace said: “We speak three times faster than we type, and the modern speech engines of today are a very different beast compared with those of 10-15 years ago.
“Systems like Dragon Medical One are driven by AI and deep learning algorithms. Combined with a comprehensive medical dictionary, the recognition accuracy of complex clinical terms is excellent, allowing clinical notes to be created in real time.
“And the clinician does not need to correct words as they speak, or go through the notes later to make any corrections; a process which takes a lot of unnecessary time. Instead, they say the words and an accurate interpretation automatically appears in the patient record.
“Speech engines are very good at that, and they can handle different accents and filter out pauses and ‘ums’ and ‘aahs’ so they are only recording the required information.”
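As a rough illustration of that filtering step, the sketch below strips filler words from a recognised transcript in post-processing. It is a minimal, hypothetical example: real speech engines such as Dragon Medical One handle disfluencies within their acoustic and language models, and the filler list and function name here are invented for illustration.

    import re

    # Illustrative filler list; a production engine models disfluencies directly
    # rather than filtering them out after recognition.
    FILLERS = {"um", "umm", "uh", "er", "erm", "aah", "ah"}

    def strip_disfluencies(transcript: str) -> str:
        """Remove filler words and rejoin the remaining words."""
        words = transcript.split()
        kept = [w for w in words if re.sub(r"[^\w]", "", w).lower() not in FILLERS]
        return " ".join(kept)

    print(strip_disfluencies("The patient, um, reports, er, chest pain on exertion."))
    # -> "The patient, reports, chest pain on exertion."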
Improved intelligence
In the past two to three years clinicians have more readily embraced speech recognition solutions and, as adoption has grown, the systems have become even more intelligent.
Dr Wallace said: “One key to its attraction among NHS organisations is that it is cloud-based, so people are not tied to a particular device or computer. Instead, it offers clinicians the flexibility to be able to go to different parts of the hospital, whether it’s an operating theatre, ward, or clinic, or even a different hospital. It also allows working from home, which was a key requirement for patient consultations during lockdown.
“And, because it is all in the cloud, the burden on IT teams is reduced as the technical set-up requirements are minimal.
“Another attraction for both clinicians and the healthcare organisation is that you do not have to train the technology; it can be used straight out of the box. However, it is always learning the subtleties of how an individual talks and thus improving the user experience.
“These benefits mean it is being extremely well received by clinicians as they can create notes so much more quickly and naturally.”
A knowledge database
And, once a speech recognition system is available in an organisation, it provides an AI platform to support other speech-based solutions that use natural language processing (NLP).
Dr Wallace explains: “If the clinician speaks into the record stating that a patient is diabetic or has heart failure, the speech engine can prompt a question, for example, is it systolic heart failure or type 2 diabetes?
“This tool supports the clinician to ensure the clinical content of the record is as accurate and detailed as possible.
“Such NLP tools can also help the clinician to code their clinical terms in SNOMED CT as a seamless by-product of creating their note.
“It can also link to knowledge databases, so if the system hears a clinical term, it can take the clinician to the relevant information or latest evidence about that particular disease, or drug, or treatment option.”
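A minimal sketch of how such prompting and coding might fit together is shown below, assuming a tiny in-memory term lookup. The concept IDs and prompt wording are placeholders for illustration; a real system would resolve terms against a full SNOMED CT terminology server and use trained NLP models rather than keyword matching.

    from dataclasses import dataclass

    # Illustrative lookup only: concept IDs should be verified against a real
    # SNOMED CT terminology server before any clinical use.
    SNOMED_LOOKUP = {
        "type 2 diabetes": ("44054006", "Type 2 diabetes mellitus"),
        "systolic heart failure": ("417996009", "Systolic heart failure"),
    }

    # Vague terms the engine might ask the clinician to refine.
    SPECIFICITY_PROMPTS = {
        "diabetes": "Is this type 1 or type 2 diabetes?",
        "heart failure": "Is this systolic or diastolic heart failure?",
    }

    @dataclass
    class CodedTerm:
        text: str
        concept_id: str
        preferred_term: str

    def process_phrase(phrase: str):
        """Return (coded_terms, prompts) for a dictated phrase."""
        lowered = phrase.lower()
        coded, prompts = [], []
        for term, (concept_id, preferred) in SNOMED_LOOKUP.items():
            if term in lowered:
                coded.append(CodedTerm(term, concept_id, preferred))
        for vague, question in SPECIFICITY_PROMPTS.items():
            # Only prompt if no more specific coded term already covers this concept.
            if vague in lowered and not any(vague in c.text for c in coded):
                prompts.append(question)
        return coded, prompts

    codes, questions = process_phrase("Patient has heart failure and type 2 diabetes")
    # codes  -> Type 2 diabetes mellitus; questions -> ask whether the heart failure is systolic or diastolic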
As well as these additional functions, speech recognition has been found to increase adoption of the electronic patient record (EPR).
Dr Wallace said: “As well as creating the content of a note, speech recognition can use voice commands to navigate around the electronic patient record.
“This speeds up the workflow by removing the clicks, which can often overwhelm the user experience.”
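As a rough idea of the voice-navigation side, a recognised utterance can be routed to a navigation action before falling back to dictation. The command phrases and EPR methods below are invented for illustration and are not Dragon Medical One’s actual command set.

    # Hypothetical mapping of spoken commands to EPR navigation actions;
    # real products ship their own command grammars and EPR integrations.
    COMMANDS = {
        "open medications": lambda epr: epr.open_section("medications"),
        "show latest bloods": lambda epr: epr.open_results("haematology", latest=True),
        "go to discharge summary": lambda epr: epr.open_document("discharge_summary"),
    }

    def handle_utterance(utterance: str, epr) -> bool:
        """Run a recognised utterance as a command; return True if it was handled."""
        action = COMMANDS.get(utterance.strip().lower())
        if action:
            action(epr)
            return True
        return False  # not a known command, so treat it as dictation instead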
And he predicts that in the future this technology will dramatically improve the experience for patients and clinicians, and change the way we design hospitals.
A balancing act
He said: “With COVID there was a shift to remote consultations and that was important to keep services going.
“We are now swinging back and there is going to be a balance between face-to-face and remote consultations as human interaction is still really important.
“In surgeries and hospitals, we are still going to need to have a bed for examinations, and there will still be chairs and medical equipment, but that image of a doctor hunched over a computer typing away to make sure everything being said is recorded will disappear.
“Moving forward, AI-driven speech recognition will instead allow for more-natural interactions.
“Clinical rooms of the future are likely to feature conversational AI, with computers being replaced by smart devices on the wall which have a multi-microphone array and sensors.
“The clinician will be able to use voice commands to call up the latest MRI or CT scan results, while voice biometrics will allow the whole conversation between the doctor and the patient to be diarised.
“Behind the scenes, natural language processing pulls out the relevant parts of the consultation and creates a summary in the form of a structured SNOMED CT codified note.”
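A highly simplified sketch of that behind-the-scenes pipeline is given below, assuming diarised segments already labelled by speaker. The keyword rules, codes, and note structure are placeholders: ambient clinical documentation systems of this kind rely on trained NLP models and verified SNOMED CT codes rather than the toy rules shown here.

    from dataclasses import dataclass

    @dataclass
    class Segment:
        speaker: str   # "clinician" or "patient", assigned by voice biometrics
        text: str

    # Placeholder keyword-to-code rules standing in for a trained NLP model;
    # concept IDs should be verified against a SNOMED CT terminology server.
    FINDINGS = {
        "chest pain": ("29857009", "Chest pain"),
        "shortness of breath": ("267036007", "Dyspnoea"),
    }

    def summarise(transcript: list[Segment]) -> dict:
        """Build a minimal structured note from a diarised consultation."""
        note = {"presenting_complaints": [], "codes": []}
        for seg in transcript:
            lowered = seg.text.lower()
            for phrase, (concept_id, preferred) in FINDINGS.items():
                if phrase in lowered and phrase not in note["presenting_complaints"]:
                    note["presenting_complaints"].append(phrase)
                    note["codes"].append({"concept_id": concept_id, "term": preferred})
        return note

    consultation = [
        Segment("patient", "I've had chest pain when I walk up the stairs."),
        Segment("clinician", "Any shortness of breath with that?"),
        Segment("patient", "Yes, a little."),
    ]
    print(summarise(consultation))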
He adds: “As you can imagine, it is a really complex process to train the AI system to achieve this outcome without the clinician ever touching the keyboard.
“The key is to build up tens of thousands of examples of doctor/patient interactions so the system knows how to ‘behave’ when pulling the final notes together.
“When we first started, a human did a quality check before the final note was sent to the doctor for signature. But, as the AI learns and improves due to increased amounts of data being fed into the engine, the need for such human involvement will reduce and ultimately disappear. That said, the doctor will always need to check the note before finally signing it off.”
And he concludes: “Moving forward, speech recognition has a major role to play in addressing the burden on healthcare staff.
“There is a real opportunity, with the right funding, to accelerate the digital journey and give time back to clinicians to allow them more quality time with patients and to achieve a better work/life balance.
“The service is burnt out and technology like this will play a really important role moving forwards.
“And a by-product of implementing it will be better interactions between patient and doctor and less time spent creating notes and inputting them into the EPR.
“This will ultimately create a much richer and much better doctor/patient relationship.”