Data Strategies

What Do HGTV Open Concept, Healthcare and BI Have in Common? | ThoughtSpot

Who would have thought that Home and Garden Television (HGTV) could change medicine? That TV network where people search for beachfront bargains and tiny houses just might.

Let me tell you how.
According to a study published in 2016, “For every hour physicians provide direct clinical face-time to patients, nearly 2 additional hours is spent on the Electronic Health Record (EHR) and desk work within the clinic day. Outside office hours, physicians spend another one to two hours of personal time each night doing additional computer and other clerical work.” 

That was a relatively small study with only 57 physicians, but let’s assume it is directionally correct. The trend doctors have been experiencing is clear: physicians spend an extraordinary amount of time on “systems work” and paperwork, not on direct patient care. 

Being married to a physician, I have personally witnessed this change in my wife’s practice over the past 25 years.
So, what does that have to do with HGTV?
If you’ve ever watched HGTV’s house-hunting shows, you know the buyers are encouraged to “speak out loud” while touring the properties:

“I really like this open concept.”

“I love this kitchen island.”

“These colors would have to go.”

“This backyard is just too small for our kids to play in.”

“I was hoping for something closer to our old neighborhood.”

And my personal favorite: “This tiny house is just too small.”
When companies perform usability studies of their software, the participants are also encouraged to “speak out loud” while navigating the website, describing what they are trying to accomplish and what they see on the screen. That usability dialog might go something like this: “I’m on the contacts page and trying to look up the phone number for John Doe. I’m pretty sure he lives in the Chicago area. OK, I see a contacts link and a search icon. I’m going to try the search icon. I typed in John Doe. Damn. I got back 85 results and there is no obvious way to narrow the list down based on city. Let me go back to the contacts link and see if that gives me better search options.”
In both of those examples, people are speaking out loud while they are trying to accomplish a task – in the case of HGTV, the goal is to buy a new house. In the case of usability testing, the goal is to provide valuable feedback to the product development team.
The connection to HGTV occurred to me when I overheard my wife talking out loud as she prepped for her upcoming week of patient care. 

“This patient looks really sick. What are his vitals?” 

“Who was on-service last night?” 

“Did they order the lab-work that I requested on Friday?” 

“Where are the lab results? I don’t see the lab results anywhere.” 

“I have to call the Fellow and get the story.” 

“When was this kid last seen in the clinic?” 

For that specific patient, all that dialog happened in about 5 minutes. She then went on to prep for her other 19 patients. Easy enough to do the math: 20 patients x 5 minutes = 100 minutes spent on the EHR.
Imagine how medicine would change if we could capture all of that “out loud” dialog and present the answers in a super-simple, meaningful way to the physician.
The solution would involve a combination of natural language speech recognition, artificial intelligence, push-based analytics, and a UI that could be easily customized based on the needs of the doctor.
The good news is that many of the building blocks to create a solution like this are available today.
Natural language (NL): Alexa, Google Home and Siri have popularized speech recognition, even if it is still not perfect. Interactive Voice Response (IVR) systems for airlines and banks and utility companies have improved significantly. The ability to capture and act on spoken language has improved to the point where this concept is doable.
Artificial Intelligence: Even if all the words were captured correctly by the NL capability, interpretation and learning still need to come into play. When the doctor says “Who was on service last night?” or “I need to speak with the Fellow” or “What tests did she run last night,” each of those statements has ambiguity that needs to be worked out and clarified. If the system makes the wrong assumption about “who” is being referred to, the doctor needs to be able to help the system learn and self-correct.
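To make that learn-and-self-correct loop concrete, here is a minimal sketch in Python. Everything in it is hypothetical — the roster, the `resolve` and `correct` functions, and the names are illustrative placeholders, not a real clinical AI system.

```python
# Hypothetical sketch: resolving an ambiguous role ("the Fellow") to a person,
# with a correction mechanism so the doctor can teach the system.
roster = {"fellow": "Dr. A. Smith", "attending": "Dr. B. Jones"}  # assumed on-call roster
corrections: dict[str, str] = {}  # overrides learned from doctor feedback

def _normalize(role: str) -> str:
    """Strip a leading 'the ' and lowercase, so 'the Fellow' -> 'fellow'."""
    return role.lower().removeprefix("the ").strip()

def resolve(role: str) -> str:
    """Guess who a spoken role refers to, preferring learned corrections."""
    key = _normalize(role)
    return corrections.get(key, roster.get(key, "unknown"))

def correct(role: str, actual_person: str) -> None:
    """The doctor tells the system its guess was wrong; remember it next time."""
    corrections[_normalize(role)] = actual_person

print(resolve("the Fellow"))         # -> Dr. A. Smith (initial roster guess)
correct("the Fellow", "Dr. C. Lee")  # doctor corrects the wrong assumption
print(resolve("the Fellow"))         # -> Dr. C. Lee (learned answer)
```

The point of the sketch is the feedback path: a wrong guess is not fatal as long as the doctor can correct it once and the system remembers.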
Push-based analytics: The capabilities of push-based analytics are getting much more mature now. Take a few of these statements as examples.

“This patient looks really sick. What are his vitals?”

Behind the scenes, the system understands who the patient is, and which vitals are important. A query could run for “patient vitals last night” and might display them in a narrow ticker window on the screen.

“Who was on-service last night?”

Behind the scenes: “provider on-service last night” – answer presented in the ticker window.

“Did they order the lab-work that I requested on Friday?”

Behind the scenes: “labs ordered Friday” and displayed on the ticker window.
Of course, doctors aren’t always sitting at their desk, so this could easily be extended to the doctor firing off a bunch of questions into their phone while they walk from one patient room to another. Behind the scenes, the system could capture the questions and queue them up … ready to be displayed when the doctor is ready for the information.
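That capture-and-queue idea can be sketched in a few lines. The class and method names below are hypothetical placeholders, and a real system would feed each utterance through speech recognition first.

```python
from collections import deque

class QuestionQueue:
    """Hypothetical queue holding a doctor's spoken questions until a screen is available."""

    def __init__(self) -> None:
        self.pending: deque[str] = deque()

    def capture(self, utterance: str) -> None:
        # Questions fired off while walking between rooms are stored in order.
        self.pending.append(utterance)

    def flush(self) -> list[str]:
        # Called when the doctor is ready for the information: return all
        # queued questions, oldest first, and clear the queue.
        answered = list(self.pending)
        self.pending.clear()
        return answered

q = QuestionQueue()
q.capture("Who was on-service last night?")
q.capture("Did they order the lab-work I requested on Friday?")
print(q.flush())
# -> ['Who was on-service last night?', 'Did they order the lab-work I requested on Friday?']
```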
Creating technology for healthcare is no simple feat. Doctors are already over-burdened by the demands of the EHR and the paperwork that has come to dominate their jobs. Hospital IT staffs are overwhelmed just keeping up with the care and feeding of the core IT infrastructure, much less implementing new capabilities that can co-exist within their IT systems. The approach I have described here would require significant work to bring together natural language, AI and analytics – but it’s already being done with next-generation tools that are available today.

So, the next time you hear someone say “I love this open concept” – think how great it would be if your doctor could say, “Were John Doe’s test results ordered yesterday, and were they normal?”