Interview: How artificial intelligence is transforming the doctor-patient relationship

Published: 23-Apr-2019

How AI-enabled digital speech recognition is poised to deliver efficiencies, improve outcomes, and bring doctors and patients face to face once more

Artificial intelligence (AI) is continuing to make its mark on the healthcare sector. In this article we look at how it is helping to take speech recognition a step forward, putting clinical staff back in touch with their patients.

The NHS is charged with becoming completely paperless in less than 12 months, so the race is on to replace time-consuming manual processes with state-of-the-art technological solutions.

Many healthcare organisations have already implemented a number of digital solutions, primarily around the electronic patient record and image storage.

The potential for cloud-based speech recognition and other AI speech-embedded solutions is really exciting. It’s all about making the doctor-patient experience better and clawing back some of that wasted time on documentation and administrative tasks

But there is much more that can be done to improve efficiency in documentation functions and put healthcare professionals back on the frontline.

Nuance Communications has been at the forefront of digitising clinical documentation for several years. And the company has recently taken this another step forward, utilising the power of AI in its speech recognition solutions.

Under pressure

Speaking to BBH, its chief clinical information officer, Simon Wallace, explains: “There has been a huge amount of change for healthcare professionals, which has led to increased pressure, particularly on doctors.

“A recent survey by the General Medical Council (GMC) suggests that a quarter of medics feel ‘burned out’, with around half feeling routinely exhausted.

“As well as enormous work pressures and fears over patient safety and complaints, the increased amount of administration has challenged the actual time spent with patients.

“And, because healthcare has become so complex, the amount of time spent on clinical documentation has increased significantly.”

A survey by Nuance in 2015 found that clinicians were spending, on average, 11 hours a week creating clinical documentation. This figure was slightly more for doctors compared to nurses.

And, if you factor in lost and repeated documentation, staff can be spending up to half their time on paper-based tasks.

Our survey found that doctors feel overwhelmed by the amount of documentation they have to deal with

“Our survey found that doctors feel overwhelmed by the amount of documentation they have to deal with,” said Wallace.

“This inefficiency equates to three outpatient appointments per doctor, per week.”

So how can the very-latest technology help?

“Today, the majority of clinical letters are created using digital audio transcription. The doctor will dictate the content of the letter, with the resultant digital recording either typed up by an in-house secretary or outsourced to a transcription service, often overseas and at a high cost.

“Either way, the resultant letter needs to be checked by the doctor, any changes made by the secretary, and then it is finally signed off by the doctor before being sent to the GP with a copy to the patient.”

To address this and other documentation issues, whether it be the admission note, ward round note, procedure note, or discharge summary, Nuance has launched Dragon Medical One, a cloud-based speech recognition solution.

Underpinned by smart AI algorithms, the solution is 99% accurate due to its deep learning and neural network models.

The algorithms allow the software to continuously learn from the user over time, creating a clinically-rich system with the potential to do so much more than just writing up what a doctor has said.

As well as enabling better communication, cloud-based, AI-embedded speech recognition has the power to support the clinician in a range of other ways

Wallace explains: “Because it’s in the cloud, Dragon Medical One gives clinicians much-greater freedom to access the solution in different parts of the hospital, whether they are in theatre, in their office, or in clinics.

“And it’s so sophisticated it can be used straight out of the box with no individual voice training required.”

Doctors and nurses spend a significant amount of time generating clinical documentation

Straight out of the box

A major selling point for AI-powered speech recognition is the deep learning it provides.

This, combined with a comprehensive medical dictionary, enables complex clinical terms to be correctly recognised and spelt.

But, much more than this, it is continually learning the subtleties of the individual’s voice, how they speak and how they create their sentences. This allows the recognition to become even more accurate over time, with those deep learning and neural network models providing a true quality user experience.

“As well as enabling better communication, cloud-based, AI-embedded speech recognition has the power to support the clinician in a range of other ways,” said Wallace.

“Natural language processing (NLP) or clinical language understanding (CLU) can provide a number of tools to improve the quality of the clinical document.

“To say a patient has heart failure is fine, but it is more clinically accurate to record it as left, right, diastolic etc. NLP can support the clinician in this way.

“Scanning a number of documents during a hospital admission, NLP can flag up possible missing diagnoses that clinicians may not have considered.

“For example, a clinician may dictate that a patient has a high white cell count in one note, and on a previous note from the day before he may have recorded that the patient had been vomiting with possible aspiration. However, there has not been a diagnosis recorded that could link these two clinical terms.

I am now a total advocate for technology and how it can be used to positively impact on the healthcare team and our patients

“NLP supports the clinician by highlighting this information and allowing them to update the electronic patient record if a diagnosis, in this case aspiration pneumonia, is considered clinically relevant.”
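
To make that idea concrete, the sketch below shows, in very simplified form, how findings scattered across separate notes might be linked to a candidate diagnosis that has not yet been recorded. It is a hypothetical, rule-based illustration only, not Nuance's CLU engine, which uses far richer clinical language models; the note texts, the linking rules, and the function names here are invented for the example.

```python
# Toy illustration of cross-note linking: scan the free text of several
# notes for known findings and, where a combination of findings maps to a
# diagnosis that has not been recorded, flag it for the clinician.
# Hypothetical sketch only -- not Nuance's implementation.

from typing import Iterable

# Hypothetical rule: if ALL of these findings appear across the notes and
# the candidate diagnosis is absent from the record, suggest it.
LINK_RULES = [
    ({"high white cell count", "vomiting", "possible aspiration"},
     "aspiration pneumonia"),
]

def extract_findings(notes: Iterable[str]) -> set[str]:
    """Very crude 'NLP': substring matching against known finding phrases."""
    known = {f for findings, _ in LINK_RULES for f in findings}
    text = " ".join(notes).lower()
    return {finding for finding in known if finding in text}

def suggest_missing_diagnoses(notes: list[str], recorded: set[str]) -> list[str]:
    """Return candidate diagnoses supported by the notes but not yet recorded."""
    findings = extract_findings(notes)
    return [dx for required, dx in LINK_RULES
            if required <= findings and dx not in recorded]

notes = [
    "Day 1: patient vomiting overnight, possible aspiration witnessed.",
    "Day 2: bloods show a high white cell count, chest sounds crackly.",
]
print(suggest_missing_diagnoses(notes, recorded={"dehydration"}))
# -> ['aspiration pneumonia']  (flagged for the clinician to review)
```

The point is the workflow rather than the matching: the software surfaces a possible gap in the record, and the decision to add the diagnosis stays with the clinician.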

One of the big challenges of the electronic patient record is the time spent entering data. One estimate suggests doctors spend an extra 30-40 minutes a day meeting this need.

“Doctors have been turning away from their patients and becoming consumed by the computer as a result of the need to search for and enter clinical information,” said Wallace.

He describes the clinic room of the future and the concept of ambient clinical intelligence, which allows the doctor to turn their chair back around and engage better with the patient while still being able to retrieve and record up-to-date clinical information about them.

“Using a purpose-built multi-microphone healthcare device, a doctor and patient can be having a true face-to-face consultation while the AI speech-enabled technology is creating and coding a clinical note in the background in the EPR,” he said.

And he revealed how removing the frustration of the click mentality ‘bedevilling the adoption of EPRs’ paves the way for a more-productive and engaging consultation.

“The clinic room of the future requires a transition from a typing approach to a speaking approach for clinical documentation,” he said.

Recent trials of the Dragon Medical One speech recognition solution have been carried out in two outpatient clinics, an accident and emergency department, and in community services.

Personal care

At Homerton University Hospital NHS Trust, ballooning administration costs and a 17-day turnaround on outpatient letters – the NHS target is 10 days – were causing major bottlenecks in the system.

Deploying Dragon Medical One meant clinicians could create their outpatient letter in the electronic patient record at the point of care.

The turnaround time for GPs receiving the letter reduced to two days, with the patient often leaving the clinic with a copy of the letter itself.

Patients then benefitted from faster and more-personalised communication and there were fewer lost or missed appointments.

The improved process has also freed up secretaries to concentrate on patient-focused tasks rather than having to try to get on top of typing a backlog of digitally-dictated letters.

The results from the implementation of AI-powered technology showed that the time spent documenting care was reduced, the quality of the patient record was boosted, and the speed of communication improved

The cloud-based software-as-a-service solution integrates directly into Homerton’s existing Cerner Millennium electronic patient record (EPR) software and plays a key part in the hospital’s drive towards paperless working.

Paul Adams, head of clinical information system at the trust, said: “We’ve seen considerable month-on-month cost savings as we replace our transcription services with front-end speech recognition and we’ve also reduced expenditure by not having to invest in additional hardware or recruit scarce and expensive technical resources to run the software day-to-day.”

It is estimated that the trust has made a saving of £150,000 a year on outsourcing costs and the time taken to produce clinical letters has reduced from 17 to just two days. Spending on medical secretaries has also been cut by a third.

There were similar results at Oxford University Hospitals NHS Foundation Trust, where clinical notes were taking, on average, 12 days to reach the GP.

A Global Digital Exemplar trust, it was keen to address the problem using a cloud-based AI solution.

A pilot in the renal department over a three-month period resulted in the letter turnaround being reduced from 12 to three days, and even the same day if no further tests were required. The trust also found that outsourcing costs were reduced by 77% and it did not need to fill vacant posts for medical secretaries.

On the record

Both Homerton and Oxford are now in the middle of an enterprise rollout of the speech recognition solution across their respective outpatient departments.

The solution is also transforming A&E, mental health, and community services.

At South Tees Hospitals NHS Foundation Trust, Dragon Medical was implemented in the emergency department to help speed up the treatment and discharge pathway.

Wallace said: “The results from the implementation of AI-powered technology showed that the time spent documenting care was reduced, the quality of the patient record was boosted, and the speed of communication improved.

“The average time saving using speech recognition versus typing was about three-and-a-half minutes per patient. Extrapolating this average time saving per patient by the corresponding volume of patients attending the department every year and the total number of clinicians, the overall time saving would be 389 days, the equivalent of gaining almost two full-time ED clinicians.”
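
The arithmetic behind that extrapolation is simple to reproduce. In the sketch below, the 3.5-minute saving is the figure quoted above, but the annual attendance volume, the length of a clinical day, and the number of working days per clinician per year are hypothetical assumptions chosen only to show the shape of the calculation; the 389-day figure in the quote rests on South Tees’ actual numbers, which are not given in the article.

```python
# Illustrative extrapolation of per-patient time savings to whole-department
# figures. Only the 3.5-minute saving comes from the article; the other
# inputs are ASSUMPTIONS for illustration.

MINUTES_SAVED_PER_PATIENT = 3.5        # the pilot's figure, quoted above
ANNUAL_ED_ATTENDANCES = 53_000         # ASSUMPTION: illustrative volume only
MINUTES_PER_CLINICAL_DAY = 8 * 60      # ASSUMPTION: an eight-hour clinical day
WORKING_DAYS_PER_CLINICIAN = 210       # ASSUMPTION: one clinician's working year

total_minutes_saved = MINUTES_SAVED_PER_PATIENT * ANNUAL_ED_ATTENDANCES
clinical_days_saved = total_minutes_saved / MINUTES_PER_CLINICAL_DAY
fte_equivalent = clinical_days_saved / WORKING_DAYS_PER_CLINICIAN

print(f"{total_minutes_saved:,.0f} minutes "
      f"= {clinical_days_saved:,.0f} clinical days "
      f"= roughly {fte_equivalent:.1f} full-time clinicians")
# With these illustrative inputs: 185,500 minutes = 386 clinical days
# = roughly 1.8 full-time clinicians -- the same order of magnitude as the
# 389 days and "almost two" clinicians reported by the trust.
```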

Using a purpose-built multi-microphone healthcare device, a doctor and patient can be having a true face-to-face consultation while the AI speech-enabled technology is creating and coding a clinical note in the background in the EPR

The quality of the patient record was also boosted with 86% of clinicians involved in the pilot agreeing that speech recognition enabled more-complete patient notes.

Nine out of 10 also felt that using speech recognition compared to handwriting and typing saved time, improved the quality of the notes, and increased the speed of communication with others.

And, at Worcestershire Health and Care NHS Trust, the main provider of community and mental health services across Worcestershire, staff caught up with a two-year backlog of administration in just three weeks using the system.

“Dragon Medical speech recognition transformed the way I work, and my life. My notes are more complete and accurate and completed on time,” said Karen Edwards, an occupational therapist at the trust.

“I am now a total advocate for technology and how it can be used to positively impact on the healthcare team and our patients.”

Wallace concludes: “The potential for cloud-based speech recognition and other AI speech-embedded solutions is really exciting. It’s all about making the doctor-patient experience better and clawing back some of that wasted time on documentation and administrative tasks.”
