Setting the Bar High for AI in Patient Experience


04.26.2017 • Tad Turpen

This post by NarrativeDx Chief Data Scientist Tad Turpen explains how the NarrativeDx AI platform has set the bar high for reliability. He makes the case for better explanatory power and accuracy in commercial natural language processing and AI products.

At NarrativeDx, our goal is to understand exactly how patients perceive their own care.

The more actionable insights hospitals and healthcare providers have into their patients’ perceptions of their own care, the more likely they are to make real improvements to the care they provide.

Through our AI platform for patient experience, we use technology to discover valuable insights from patient feedback and help our customers extract trends and gain visibility into the patient experience. Our platform automatically processes large volumes of patient comments from HCAHPS surveys, post-discharge surveys, social review sites, and beyond to pinpoint specific areas for improvement and identify key staff, doctors, units and facilities.

As our AI platform processes this feedback with natural language processing and machine learning, our analytical model is hard at work balancing predictive power (consistently high, reliable accuracy) with explanatory power (the ability to explain why a prediction was made).

At NarrativeDx, we emphasize explanatory power because our platform is designed to help hospitals and providers make informed improvements in key areas based on patient feedback. Sparking change in a hospital’s or a provider’s behavior requires that they trust our analysis. Trusting an analytical model isn’t always something that comes naturally to providers. They have understandable questions around accuracy, transparency and accountability. That's why we support our analysis with individual data points - a key aspect of our explanatory power.

How NarrativeDx prioritizes explanatory power
We set both predictive goals and explanatory goals, and we enforce the latter by tracking coincidental predictions: predictions that are technically correct but for which the model cannot generate a satisfying explanation.

A predictive goal sounds like “we want 96% precision on this type of data,” while an explanatory goal sounds like “we want fewer than 5% coincidentally correct predictions.” We internally track the reason behind each correct prediction - a measure we can’t expose, because that would be like Coca-Cola printing its recipe on every can of soda ;-). When we review datasets, we track whether our predictions are correct or incorrect for the “right reason.” That is, we aren’t satisfied with a prediction that is merely correct. We require our analysis to be accurate for the correct reasons.
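To make those two kinds of goals concrete, here is a minimal sketch of how a review pass might track them side by side. This isn’t our production tooling - the record fields, the evaluate helper, and the idea of a reviewer marking whether an explanation is satisfying are all illustrative - but the 96% and 5% thresholds mirror the goals above.

```python
# A hypothetical review-pass sketch, not NarrativeDx production code.
from dataclasses import dataclass
from typing import List


@dataclass
class Prediction:
    predicted_positive: bool      # did the model flag this comment?
    actually_positive: bool       # ground-truth label from human review
    has_valid_explanation: bool   # did a reviewer accept the model's stated reason?


def evaluate(predictions: List[Prediction],
             precision_goal: float = 0.96,
             coincidence_limit: float = 0.05) -> dict:
    """Score a predictive goal (precision) alongside an explanatory goal
    (share of correct predictions that are only coincidentally correct)."""
    flagged = [p for p in predictions if p.predicted_positive]
    true_positives = [p for p in flagged if p.actually_positive]
    precision = len(true_positives) / len(flagged) if flagged else 0.0

    # "Coincidental" = correct (true positive or true negative) but without
    # a satisfying explanation from the model.
    correct = [p for p in predictions if p.predicted_positive == p.actually_positive]
    coincidental = [p for p in correct if not p.has_valid_explanation]
    coincidence_rate = len(coincidental) / len(correct) if correct else 0.0

    return {
        "precision": precision,
        "meets_predictive_goal": precision >= precision_goal,
        "coincidence_rate": coincidence_rate,
        "meets_explanatory_goal": coincidence_rate <= coincidence_limit,
    }
```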

Our attention to accuracy for the correct reasons recently led us to change the way we pre-process and segment our data. That change yielded a 5% improvement in prediction accuracy for a model that was already scoring above 90%. This is just another example of how our emphasis on explanatory power guides the continuous improvement of our analytical modeling.
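We won’t spell out our pipeline here, but to give a flavor of what segmenting patient feedback can mean, here is a simplified, hypothetical sketch that splits a multi-topic comment into sentence-level segments so each prediction can point back to the specific text behind it. The tokenizer below is deliberately naive; a production system would be far more careful.

```python
# Purely illustrative segmentation; not how the NarrativeDx pipeline works.
import re


def segment_comment(comment: str) -> list:
    """Split a multi-topic patient comment into sentence-level segments,
    using a deliberately naive end-of-sentence regex."""
    segments = re.split(r"(?<=[.!?])\s+", comment.strip())
    return [s for s in segments if s]


comment = ("The nurses were wonderful and checked on me constantly. "
           "However, the discharge instructions were confusing.")
for segment in segment_comment(comment):
    print(segment)
# -> "The nurses were wonderful and checked on me constantly."
# -> "However, the discharge instructions were confusing."
```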

Making sense of it all
We look at it this way: analytical modeling is like trying to pick the best fruit from a tree while standing on a short footstool. Tweaking opaque analytical models while ignoring explanatory power is like setting an unsteady ladder on top of that footstool while blindfolded. You might be able to reach higher, and it might be easier to pick some low-hanging fruit - but the top of the tree will ultimately be beyond your reach.

We made the right choices early (by making sure we had the right reasons behind those decisions) when we built the NarrativeDx AI platform for patient experience. By balancing our predictive power with our explanatory power, we built a tall and steady structure from the start.

How do you get the tallest ladder possible? Pick a strong baseline, maintain the highest explanatory power possible and track coincidental true positives and true negatives.

Trust me, you’ll be happy that you made your design choices with the significance of explanatory power in mind. You will have the most robust analytical engine in your industry - like NarrativeDx in patient experience - and your customers, even when they are professionally trained skeptics like healthcare providers, will be delighted with your accurate and reliable data analytics.

Stay tuned for the next blog post in this Data Science blog series by our Linguist, Zach Childers, PhD. Next week, he'll do a deeper dive into how we tap into explanatory power to help our customers make sense of the sentiment behind patients' perceptions of their own care.

Tad Turpen is the Chief Data Scientist at NarrativeDx.

