AI in healthcare — where we stand, what to know

In the halcyon days before ChatGPT, there was a lot of chatter about what artificial intelligence (AI) would actually do for different businesses and industries. We saw impressive displays like AlphaGo and AlphaGo Zero trouncing the world’s best Go players (which we were writing about all the way back in 2017). But when would the rubber meet the road? When would we actually start seeing AI in our everyday lives? Well, ChatGPT answered that question a hundred-fold this year. But even before ChatGPT, we were already starting to see applications of AI in healthcare. In our minds, it’s still one of the most promising frontiers in machine learning, with real-world applications we’ll see in the very near future.

So, we thought it would be worth a look into the current state of AI in healthcare — where we stand now, what might happen in the near future, and what you need to know about AI in healthcare at this moment.

The 5 biggest areas for AI in healthcare

From the research we’ve been doing (which includes scouring medical journals, if you can believe it 🤣), we see five primary areas of application and development for AI in healthcare.

1) Electronic Health Records

Electronic Health Records (EHRs) are the backbone of most hospital and doctors’ operations. Their adoption was driven by federal mandates under the HITECH Act — for better or worse — and medical care revolves around those records. What procedures were done, what needs to be done, drugs taken, medical images associated with your care… it’s all housed in an EHR.

Most EHRs are “systems of record and storage” with limited capabilities. Machine learning can transform EHRs from “systems of record” into “systems of intelligence.”

What does that look like in practice?

Leverage big data insights across millions of records to better assess diagnoses, prognoses, recommended courses of treatment, likelihood of recurrence, and more.

One medical journal describes what it could look like:

In healthcare, there is a massive amount of unstructured textual data in the forms of doctors’ notes, test results, lab reports, medication orders, and discharge instructions. Natural language processing tools can be used to extract critical information about patients from such rich descriptive data, helping improve diagnoses and treatment recommendations. The capacity for machines to digest huge amounts of imagery and textual data quickly through ML and NLP will enable physicians to make timely diagnoses and treatment decisions, which can have profound impact on health service delivery, particularly on the ways that patients are treated.

This probably won’t impact how you individually interact with an EHR, but it could provide massive improvements in care and diagnostics if EHR companies could actualize the massive amounts of data inside their digital black boxes.
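As a toy illustration of the extraction idea, here’s a minimal sketch of pulling structured facts out of free-text clinical notes using nothing but regular expressions. (Real clinical NLP relies on trained models and medical ontologies; the note text, drug names, and pattern below are all made up for the example.)

```python
import re

# Toy discharge note -- illustrative text only, not real patient data
note = (
    "Discharge instructions: continue Metformin 500 mg twice daily. "
    "Start Lisinopril 10 mg once daily. Follow-up CT scan in 6 weeks."
)

# Naive pattern: a capitalized drug name followed by a dose in milligrams
MED_PATTERN = re.compile(r"([A-Z][a-z]+)\s+(\d+)\s*mg")

def extract_medications(text):
    """Return (drug, dose_mg) pairs found in a free-text note."""
    return [(name, int(dose)) for name, dose in MED_PATTERN.findall(text)]

print(extract_medications(note))  # [('Metformin', 500), ('Lisinopril', 10)]
```

Even this crude version turns prose into structured rows a system could aggregate across millions of records — which is exactly the leap from “system of record” to “system of intelligence.”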

One of the main drawbacks of EHRs? They change the clinical relationship between doctor and patient

How often have you sat in a doctor’s office, speaking to your doctor while they type away furiously at a computer terminal?

It’s really not their fault — it’s what they have to do. Everything has to be captured in the EHR, and the doctor (or their staff) has to enter that while they’re in clinic with you (or shortly thereafter).

Natural language processing could revolutionize this interaction in a way that benefits doctors, their staff, and patients alike. Doctors could use speech-to-text dictation to input notes directly into the system without having to moonlight as highly paid stenographers.

No matter what, EHRs stand to get huge upgrades thanks to AI in the very near future.

2) Personalized treatments

The rise of genomic testing — specifically the ability to do it quickly and relatively inexpensively — has opened up huge pathways for more personalized care (you see this with cancer immunotherapy treatments already).

The addition of AI to this equation could completely change the game.

AI could crunch massive amounts of data to determine not only optimal treatment courses, but tailor that to the information gleaned from your genomic testing. So instead of a boilerplate treatment plan determined at the population level (you have Tumor “X”, prescribe treatment “Y”), your care could truly be personalized based on your genes + insight from AI datasets (you have Tumor X.1432782, prescribe treatment Y.286414164).

This could also incorporate things like diagnostic imaging (X-Rays, CT scans, MRIs, etc.) and AI’s improved ability to recognize patterns in images to further personalize treatments for yet higher efficacy.

3) Medical imaging

This one has been in the works for quite a while. Radiologists have some of the hardest jobs in medicine: they need to look at scores of images, all the time, and make diagnoses based on their findings from those images.

As AI image recognition has improved, it has made its way into medicine.

Most experts don’t think AI will replace radiologists, but rather extend their capabilities by doing first-glance analysis for them. The AI can highlight areas likely to be tumors (or what have you) so radiologists can focus more closely on them. It can also act essentially as a copy editor, double-checking radiologists’ reads for anything they might have missed.
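To make the “first glance” idea concrete, here’s a minimal sketch of the post-processing step: given a model’s per-pixel suspicion scores for a scan, flag the region a radiologist should look at first. (The heatmap values, threshold, and function name are all hypothetical — production imaging pipelines are far more involved.)

```python
import numpy as np

# Hypothetical per-pixel tumor probabilities from a trained model (0 to 1)
heatmap = np.zeros((8, 8))
heatmap[2:4, 5:7] = 0.9  # a region the model is confident about

def flag_region(heatmap, threshold=0.5):
    """Return the bounding box (row0, row1, col0, col1) of pixels above
    threshold, or None if nothing crosses it -- the area to inspect first."""
    rows, cols = np.where(heatmap > threshold)
    if rows.size == 0:
        return None
    return int(rows.min()), int(rows.max()), int(cols.min()), int(cols.max())

print(flag_region(heatmap))  # (2, 3, 5, 6)
```

The point is the division of labor: the model narrows the search, and the human makes the call.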

4) Patient engagement

One of the great advancements in EHRs is patients being able to access their own health data. AI can take this a step further by suggesting courses of action to improve outcomes based on intel from every other patient who has undergone a similar treatment plan (and in the near future, with similar genomic markers). This could mean rehab, scheduling check-ins, sending reminders, integrating with your calendar, etc.

AI also promises to provide more (and more valuable) data to your medical providers through AI applications integrating with mobile devices and wearables. If you need to achieve certain exercise thresholds for full recovery, your Apple Watch could record and log that data, immediately integrating with your EHR to let your medical team know you’re doing what you’re supposed to be doing (or likewise, prompting you to get with the program so you’re giving yourself the best chance to heal fully).
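A stripped-down sketch of what that adherence check might look like on the backend, once wearable data reaches the care team (the step counts, target, and 70% on-track rule here are invented for illustration):

```python
# Hypothetical daily step counts synced from a wearable over one week
weekly_steps = [8200, 4100, 9000, 7600, 3000, 8800, 9500]
TARGET = 7000  # steps/day prescribed in the recovery plan (illustrative)

def adherence_report(steps, target):
    """Summarize how many days met the prescribed activity target,
    flagging the patient as on track if at least 70% of days qualify."""
    days_met = sum(1 for s in steps if s >= target)
    return {
        "days_met": days_met,
        "days_total": len(steps),
        "on_track": days_met >= 0.7 * len(steps),
    }

print(adherence_report(weekly_steps, TARGET))
```

A report like this could land in the EHR automatically, prompting either a “keep it up” or a nudge to get with the program.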

You’ll also almost assuredly start to see chatbots within EHRs so that patients can ask general questions about their treatment plan and get immediate answers, powered by robust data.

There are some hurdles here — you need patients to opt in to these sorts of things — but the potential is huge if executed well.

5) Exploratory applications

The final area we see a ton of potential for AI in healthcare is the cutting edge stuff. The real bleeding edge stuff that most of us can barely fathom.

Take Google’s AlphaFold breakthrough — in all of recorded human history, we had experimentally determined the precise 3D structure of only a small fraction of known proteins. It’s massively expensive, incredibly difficult to observe the microscopic in true 3D, etc. etc.

Then AlphaFold came along and predicted structures for essentially all 200 MILLION known proteins using AI.

That’s the kind of cutting-edge, exploratory stuff we’re talking about. AlphaFold could be a Nobel Prize-winning development some day, given what it will likely empower researchers to build and develop (experimental drugs, etc.).

We obviously don’t have a ton of insight into what these exploratory applications will be, because they’re by definition cutting edge… But we felt it was still worth including, given it’s the fifth main pillar of what we see coming out of AI in healthcare.

Wrapping up

There are a ton of ways AI will change how we live and work in the years to come. Healthcare is one of the areas where we’re actually seeing these applications start to take hold, which is truly exciting to behold.

If you’re in this space — or any industry, really — and are contemplating AI integrations to take your business to the next level, give us a shout. We’d love to chat about how we can help you leverage the power of AI to take your business a step into the future.




Jeff Francis

Jeff Francis is a veteran entrepreneur and founder of Dallas-based digital product studio ENO8. Jeff founded ENO8 to empower companies of all sizes to design, develop and deliver innovative, impactful digital products. With more than 18 years working with early-stage startups, Jeff has a passion for creating and growing new businesses from the ground up, and has honed a unique ability to assist companies with aligning their technology product initiatives with real business outcomes.
