How AI Can Actually Help Physicians

Some thoughts from the trenches

Imagine you’re a primary care doctor walking into clinic on a Monday morning. You have a long list of patients to see: some are your own, others you don’t know as well because they’re managed by other providers in your practice. You have limited time to pre-chart before seeing them (you didn’t do it Sunday night, whoops), and you want to prioritize the topics you should be focusing on. You open your EMR to find a concise summary prepared for each patient - what medical conditions they have, what happened since the last time they came to your clinic, whether they’ve been taking their meds, and which medications the cardiologist changed since you last saw them.

You go into the room to see the patient and have a conversation about what’s on their mind and how they’ve been doing. You discuss their diabetes and recent A1c, clarify their questions about GLP-1 ads they’ve been seeing in Times Square, and order that colonoscopy they’ve been hesitant about before but are now open to getting.

Later that day, you open the EMR and find a note pended for your review - created by AI from a transcription of your visit. A prior authorization letter for prescribing Ozempic is also ready, courtesy of your AI agent. You check your messages and see a thank-you reply from your patient. The AI had sent them an explainer on GLP-1s and instructions on how to prepare for their colonoscopy. You click a few buttons, sign a few things, and log off by 6 PM, ready to do it all over again the next day.

If this sounds too good to be true, you’re partly right. This may be the future of AI integration in EMRs and medical tools, but we’re not there yet. A slew of companies are eager to enter this space, and though there are still open questions about the technical and legal limitations of using AI in medical tools, it’s worth discussing how AI can help with daily provider tasks.

A little caveat - this list will discuss how AI, particularly large language models (LLMs), can enhance physicians’ lives (inpatient and outpatient) specifically in the context of EMRs. There are, of course, use cases in back office and admin work outside of physicians, and many other use cases for different healthcare professions (pharmacy, nursing, and so on). However, since I’m a physician, I thought it would be most helpful to consider this through the lens of a doctor and how I would want to use AI and LLMs to help me take care of my patients.

The other thing to think about is the role of AI in a physician’s workflow. Many companies are focused on the generative side of things - creating prior auth forms, answering patient messages, writing referral letters. This may be useful, but there are already templates in place or teams of people who work on specific problems (e.g., prior auths). This doesn’t mean there isn’t a place for generative AI in healthcare, but I think a bigger help would be retrieval.

Retrieval is essentially talking to an AI/LLM as if it were another provider who knows the patient’s chart. A chart is a big box of structured and unstructured data, and it often contradicts itself depending on who typed what and which provider copied forward something that was already in the chart but may not be accurate. An AI that can access all of this data could enhance retrieval and ultimately provide me, the physician, with clearer insight into a patient’s condition. Here are some examples where an AI within the EMR could help with retrieval.

Medical Timeline

I alluded to the first use case already in the example above - pre-charting, or as I like to think of it, a patient’s medical timeline. Whenever I see a patient in clinic, I look at the last time they were seen by me and then everything that has happened since. Who did they see, why did they go to the ED, what calls or messages did they send and were they answered, and what open questions should I discuss with this patient today?

This involves sifting through unstructured notes and text to construct a patient’s story, a task well-suited for AI (if you can get over the technical limitations of token limits and converting notes into vector databases and whatnot - I’m not an AI researcher plz don’t ask for details).

The output could look like a summary that pops up when you open the EMR, but a better version is probably what you’re used to seeing with AI: a chatbot that lets you talk to the chart and get questions answered without digging through 14 different telephone calls one by one.
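For the curious, here’s a toy sketch of what “talking to the chart” could look like under the hood. This is not how any vendor actually builds it - the word-overlap scoring, the ChartNote structure, and the call_llm stub are all made up for illustration - but it shows the basic retrieval pattern: pull the most relevant pieces of the chart, then hand them to the model along with the question.

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class ChartNote:
    date: str    # when the note was written
    author: str  # e.g., "Cardiology", "ED", "PCP"
    text: str    # unstructured note body

def relevance(question: str, note: ChartNote) -> int:
    """Crude stand-in for embedding similarity: count overlapping words."""
    question_words = set(question.lower().split())
    note_words = Counter(note.text.lower().split())
    return sum(note_words[w] for w in question_words)

def retrieve(question: str, notes: list[ChartNote], k: int = 3) -> list[ChartNote]:
    """Pull the k notes most relevant to the question."""
    return sorted(notes, key=lambda n: relevance(question, n), reverse=True)[:k]

def call_llm(prompt: str) -> str:
    # Placeholder for whatever HIPAA-compliant model endpoint would actually be used.
    return "(model answer would go here)"

def ask_the_chart(question: str, notes: list[ChartNote]) -> str:
    """Build a prompt from the retrieved notes and hand it to the model."""
    excerpts = "\n\n".join(
        f"[{n.date}, {n.author}] {n.text}" for n in retrieve(question, notes)
    )
    prompt = (
        "Answer the clinician's question using only the chart excerpts below, "
        "and cite the note date you relied on.\n\n"
        f"{excerpts}\n\nQuestion: {question}"
    )
    return call_llm(prompt)
```

In a real product the word counting would be replaced by embeddings in a vector store and call_llm would be an actual model sitting behind the hospital’s compliance wall, but the retrieve-then-read shape of the problem is the same.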

Patient Sign Out and Follow-Up Questions

At the beginning and end of every shift, I sign out to the next person coming on. When I admit someone to the hospital from the ED, I get sign out (sometimes) from the ED team or piece together what they have done in a pretty standardized way by going into different parts of the patient’s chart (vitals, labs, imaging, meds, notes, etc.).

The chart may not have every single detail I need, but it contains most, and being able to talk to the chart to quickly get up to speed on a patient at sign out would be huge.

This is particularly helpful for lengthy hospital courses where so much has happened that your note looks like the next Lord of the Rings book and no one is actually reading it. It’s also helpful when you start on a new service and have a list of 10 patients you are meeting for the first time and there is an open question that wasn’t addressed in the sign out.

Moreover, if you’ve ever worked in a hospital, you know that people always want to go deeper into a certain medical issue. You can’t predict when, but eventually, the attending will ask what provoked a patient’s pulmonary embolism or when they first went into Afib and what may have caused it.

I think of these as deep-dive questions into a specific medical problem. They usually involve me digging into the chart, looking at previous admissions and outpatient visits, to sometimes learn the answer but many times discover that it was a charting issue - that even though the allergies section says cefepime causes hives, the patient was given cefepime in the ICU without issue. This is yet again a classic example of a retrieval problem that our AI friends should be able to solve for us.

Medication History

You might think determining which hypertension medications a patient has received and why certain ones were started and stopped would be easy. It rarely is. Sometimes a patient has an allergic reaction, other times insurance doesn’t cover the medication, and occasionally a cardiologist saw what you did, thought you were dumb, and changed every single medication you ordered.

Countless examples of these types of questions arise in the hospital. From antibiotics for specific infections to different types of anticoagulation and why specific changes were made, the details of a patient’s medications aren’t as straightforward to dig through as you would expect - a perfect retrieval challenge for AI.

Imagine typing a prompt like, “What anticoagulation has this patient been on for their pulmonary embolism?” and getting back something like: “This patient was started on a heparin drip during their hospitalization, transitioned to Eliquis, and ultimately switched to Coumadin.”

A follow-up question might be, “Why was it switched to Coumadin?” and the AI would reply, “They switched to Coumadin because their insurance didn’t cover Eliquis, and they couldn’t afford to pay out of pocket for it. You can read more in the PCP note from 11/10/23. Here’s the link.”
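If you want to picture what sits behind an exchange like that, here’s a rough sketch of the kind of payload the chatbot could return. The field names, the link format, and the render function are entirely hypothetical - the point is just that every answer should carry pointers back to the notes it came from.

```python
from dataclasses import dataclass

@dataclass
class SourceNote:
    date: str       # date of the cited note
    note_type: str  # e.g., "PCP office visit"
    link: str       # deep link back into the EMR (format is made up)

@dataclass
class ChartAnswer:
    text: str
    sources: list[SourceNote]

def render(answer: ChartAnswer) -> str:
    """Format the answer plus its citations for display in the EMR."""
    lines = [answer.text]
    lines += [f"Source: {s.note_type}, {s.date} -> {s.link}" for s in answer.sources]
    return "\n".join(lines)

example = ChartAnswer(
    text=("Switched to Coumadin because insurance didn't cover Eliquis and the "
          "patient couldn't afford to pay out of pocket."),
    sources=[SourceNote("11/10/23", "PCP office visit", "emr://notes/123456")],
)
print(render(example))
```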

That would be wonderful. And for those of you who think that this should be easy to search in the chart on your own, you clearly have not had enough time to interact with the Epic search bar.

Will they build it?

The obvious question is whether features like this will be built, by whom, and when we might see them implemented in daily practice. Though I want to be optimistic, I believe we are further from this reality than we think. The first hurdle is the technical challenge of taking in each patient’s chart as an input for the LLM. This seems straightforward, and we’ve seen some demos of it, but large-scale implementation with extensive chart data has yet to be demonstrated.

The other limitation involves the legal issues around giving models access to patient data (relatively easy to solve) and the challenge of integrating new software into your existing EMR system. Current EMR vendors are likely better positioned to make these tools available. It’s much easier for Epic to include an AI chatbot in an upgrade that your institution can purchase than for a new company to sell a software solution that integrates with your EMR. Ecosystem, baby.

The last thing to consider is adoption. This is more pertinent to hospitals and practices than to builders. Sure, there are sales cycles and the education needed to get people up to speed with these tools. But the reality is that it is really hard to quantify the ROI of tools like this because they’re used on the fly. Admin tools have more tangible metrics, such as the number of patient messages answered each day or how response times change with AI. Quantifying pre-charting and chart digging is tougher, and institutions may not prioritize investment in it. Independent practices, where doctors make the financial decisions, might find it more valuable, but larger academic institutions reliant on residents may not be as eager to improve workflows that aren’t directly linked to bringing in more hospital dollars.

Some Parting Thoughts

The AI hype train has been choo-chooing for a while, and even if you don’t drink all of the Kool-Aid it’s hard to deny that AI is going to slowly make its way into more areas of medicine.

For now, I think we are going to mostly see solutions from integrated tools in large academic institutions and possibly see individual companies go after the long tail of independent practices. There’s a significant focus on generative solutions that are making the headlines, but I really do think that solving the retrieval problem is something that could truly impact the day-to-day work of providers in a positive way and ultimately lead to improved patient care.