Some thoughts on Revenue Cycle Management, predictive analytics, and competing algorithms

After some reflection, this is clearly Part 5 of the “What medicine can learn from Wall Street” series.

It occurred to me, while thinking about the staid subject of revenue cycle management (RCM), that it is a likely hotspot for analytics.  First, there is data – tons of it.  Second, it is relevant – folks tend to care about payment.

RCM is the process by which healthcare providers get paid: it begins with patient contact, proceeds through evaluation and treatment, and ends with submitting charges for which we are ultimately paid under contractual obligation.  Modern RCM goes beyond billing to include marketing, pre-authorization, completeness of the medical record to decrease denials, and ‘working’ the claims until payment is made.

Providers get paid by making claims.  Insurers ‘keep the providers honest’ by denying claims that are not properly 1) pre-authorized, 2) documented, or 3) medically indicated (etc.).  There is a tug of war between the two entities, which usually results in a relationship ranging somewhere between grudging wariness and outright war (with contracts terminated and legal challenges fired off).  Providers profit by extracting the maximum payment they are contractually allowed; insurers profit by denying payment so that they can earn investment returns on their pool of reserves.  Typically, the larger the reserve pool, the larger the profit.
Insurers silently fume at ‘creative coding’, where a change of coding rules causes a procedure/illness that was previously paid at a lower level to be paid at a much higher level.  Providers seethe at ‘capricious’ denials, which require staff work to provide whatever documentation is requested (perhaps relevant, perhaps not), and at ‘gotcha’ downcoding due to a single missing piece of information.  In any case, there is plenty of work for the billing & IT folks on either side.

Computerized revenue cycle management seems like a solution until you realize that neither entity’s business model has changed – the same techniques on either side can now simply be automated.  Unfortunately, if the other guy does it, you probably need to as well – here’s why.

We could get into this scenario: a payor (insurer), when evaluating claims, decides that there is more spend ($) than expected on a particular ICD-9 diagnosis (or ICD-10 if you prefer) and targets it for claim denials.  A provider would submit claims for this group, be denied on many of them, re-submit, be denied again, and then either start ‘working’ the claims to gain value from them or – with a sloppy, lazy, or limited billing department – simply let them go (with resultant loss of the claim).  That would be a 3-12 month process.  However, a provider using descriptive analytics (see part 1) on, say, a weekly or daily basis would be able to see something was wrong more quickly – probably within three months – and gear up for quicker recovery.  A determined (and aggressive) payor could shift their denial strategy to a different ICD-9, and something similar would occur.  After a few cycles of this, a really astute provider might data mine the denials to identify which codes were being denied and set up a predictive algorithm comparing new denials against their old book of business.  This would identify statistical anomalies in new claims and could alert the provider to the algorithm the payor was using to target claims for denial.  By anticipating these denials, and either re-coding them or providing superior documentation to force the payor to pay (negating the beneficial effects of the payor’s claim-denial algo), claims are paid in a timely and expected manner.  I haven’t checked the larger vendors’ RCM offerings, but I suspect this is not far off.
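To make the denial-mining step concrete, here is a minimal sketch in Python of the kind of comparison described above: recent denial rates per diagnosis code tested against the provider’s historical book of business.  The column names (icd_code, denied, service_date), window sizes, and thresholds are all hypothetical – an actual claims feed and vendor algorithm would look different.

```python
# Sketch only: flag ICD codes whose recent denial rate has jumped relative to
# the provider's historical baseline, using a two-proportion z-test per code.
# Assumes a DataFrame with hypothetical columns: icd_code, denied (0/1),
# service_date (datetime).
import pandas as pd
from scipy.stats import norm

def flag_denial_spikes(claims: pd.DataFrame,
                       recent_days: int = 90,
                       alpha: float = 0.01) -> pd.DataFrame:
    cutoff = claims["service_date"].max() - pd.Timedelta(days=recent_days)
    recent = claims[claims["service_date"] > cutoff]
    baseline = claims[claims["service_date"] <= cutoff]

    rows = []
    for code, recent_grp in recent.groupby("icd_code"):
        base_grp = baseline[baseline["icd_code"] == code]
        if len(base_grp) < 30 or len(recent_grp) < 30:
            continue  # too little history on this code to judge
        p_recent = recent_grp["denied"].mean()
        p_base = base_grp["denied"].mean()
        # pooled two-proportion z-test, one-sided: is the recent rate higher?
        n1, n2 = len(recent_grp), len(base_grp)
        p_pool = (recent_grp["denied"].sum() + base_grp["denied"].sum()) / (n1 + n2)
        se = (p_pool * (1 - p_pool) * (1 / n1 + 1 / n2)) ** 0.5
        if se == 0:
            continue
        z = (p_recent - p_base) / se
        if 1 - norm.cdf(z) < alpha:
            rows.append({"icd_code": code,
                         "baseline_rate": round(p_base, 3),
                         "recent_rate": round(p_recent, 3),
                         "z": round(z, 2)})
    if not rows:
        return pd.DataFrame(columns=["icd_code", "baseline_rate", "recent_rate", "z"])
    return pd.DataFrame(rows).sort_values("z", ascending=False)
```

Codes flagged this way could then be routed for re-coding review or beefed-up documentation before the next batch of claims goes out.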

I could see a time when a very aggressive payor (perhaps under financial strain) strikes back with an algorithm designed to deny some, but not all, claims on a semi-random basis – ‘flying under the radar’ to escape the provider’s simpler detection algorithms.  A more sophisticated algorithm based upon anomaly detection techniques could then be used to identify these denials…  This seems like a nightmare to me.  Once things get to this point, it’s probably only a matter of time until these games are addressed by the legislature.
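For what it’s worth, here is one way (again a sketch, reusing the hypothetical claims table above) that a thinly spread denial pattern could still be caught: a CUSUM-style chart on the overall weekly denial rate, which accumulates many small excesses that a per-code test would dismiss individually.  The baseline rate, slack, and alert threshold are illustrative numbers, not anyone’s production settings.

```python
# Sketch only: CUSUM-style monitoring of the overall weekly denial rate.
# Accumulates each week's excess over (baseline + slack) and raises an alert
# once the running sum crosses the threshold.
import pandas as pd

def cusum_denial_alert(claims: pd.DataFrame,
                       baseline_rate: float,
                       slack: float = 0.01,
                       threshold: float = 0.05) -> pd.DataFrame:
    weekly = (claims
              .set_index("service_date")["denied"]
              .resample("W")
              .mean()
              .dropna()
              .to_frame("denial_rate"))
    cusum, out = 0.0, []
    for week, row in weekly.iterrows():
        # add this week's excess over the tolerated band; never drop below zero
        cusum = max(0.0, cusum + (row["denial_rate"] - baseline_rate - slack))
        out.append({"week": week,
                    "denial_rate": round(row["denial_rate"], 3),
                    "cusum": round(cusum, 3),
                    "alert": cusum > threshold})
    return pd.DataFrame(out)
```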

Welcome to the battle of the competing algorithms.  This is what happens in high-frequency trading: the best algorithm wins, and the loser gets poorer.

One thing is sure: in negotiations, the party who holds and evaluates the data holds the advantage.  The other party will forever be negotiating from behind.

P.S.  As an aside, with the ultra-low short-term interest rates after the 2008 financial crisis, the time value of money is near all-time lows.  Delayed payments are an annoyance, but apart from the cash-flow impact there is no real advantage to be gained from delaying payment.  Senior management who lived through, or studied, the higher short-term interest rates of the 1970s-1980s will recall the importance of managing the ‘float’ and of good treasury/receivables operations.  Changing economic conditions could make this a hot topic once again.
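A back-of-the-envelope illustration of that point, with hypothetical figures: the simple-interest value of sitting on $1,000,000 of receivables for an extra 90 days at a post-2008 short-term rate versus a late-1970s one.

```python
# Hypothetical figures: simple-interest value of holding cash for the delay period.
def float_value(principal: float, annual_rate: float, days_delayed: int) -> float:
    return principal * annual_rate * days_delayed / 365

print(f"at 0.25%: ${float_value(1_000_000, 0.0025, 90):,.0f}")  # ~ $616
print(f"at 15%:   ${float_value(1_000_000, 0.15, 90):,.0f}")    # ~ $36,986
```

At a quarter of a percent the float is pocket change; at fifteen percent it starts to pay for the billing department.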