Monday, January 27, 2014

All-Knowing Data Gods

Big data, data standards, blah, blah, blah-blah, blah-blah, listen to the rhythm of the failing data-driven anything.  Why does leveraging big data or developing industry standards matter when internal processes have no methodology to propagate those data standards throughout the enterprise?  Don't get me wrong.  I am all for developing industry standards.  And applications of big data, like those mentioned in my previous posts, are approaches we need in healthcare.

“… precise and consistent data collection over time points to trends, helps maximize outputs, and empowers people from the basement to the C-Suite.”  Those are my words.  No need to convert the converted.   
This blog aims the blah, blah, blah-blah, blah-blah directly at the attitude that some all-data-knowing external entity will solve what are, inescapably, internal process problems with data standardization.

Look at the problem through this demonstration of maximizing efficiency and best practices.  Gather four employees and one contractor in a room.  Put them shoulder to shoulder, with the contractor fourth from the left as viewed when facing the group.  Stand about three meters in front of this group.  These people represent five "enterprise" software applications that can house key asset management data: Biomedical Equipment, Plant Systems, IT Services, the Contractor's Database, and the Asset Ledger.

Next, let the contractor choose how to participate in this process.  Then, go whisper in the ear of the first application on the left that this information must be passed to the next application in line.  The information is: "We bought a CT Scanner, model number 1234, serial number 45678.  It cost us $1,700,000 and has another $300,000 in software and other requirements associated with it.  The infrastructure requirements are $25,000.  Levels of service and service costs for hardware and software are…"  And so on.
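To make the hand-off concrete, here is a minimal sketch of that whispered message as a structured record any of the five applications could accept verbatim.  The field names are hypothetical illustrations, not any vendor's schema; the figures come straight from the scenario above.

```python
import json

# Hypothetical field names; the values are the ones whispered above.
asset_record = {
    "description": "CT Scanner",
    "model_number": "1234",
    "serial_number": "45678",
    "purchase_cost": 1700000,
    "software_and_other_costs": 300000,
    "infrastructure_cost": 25000,
    # Levels of service and service costs are left elided ("..."),
    # just as they are in the scenario.
    "service_levels": {"hardware": "...", "software": "..."},
}

# Any application that can parse JSON can receive this message directly,
# with no paper and no re-keying.
print(json.dumps(asset_record, indent=2))
```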

The first application implements the current best practice to ensure key data is propagated throughout the enterprise.  Meaning, the application:
1. Steps out of line
2. Leaves the room
3. Proceeds to an office down the hall
4. Spits out four sheets of paper for an employee sitting at a desk
5. Returns to the proper place in line.

The employee at the desk:
1. Looks at the paper
2. Deciphers it
3. Determines what actions should be taken
4. Puts it aside for weeks
5. Weeks later, writes new instructions on a sheet of paper
6. Delivers this different sheet of paper to the second application waiting in line.

The process is repeated until it gets to the contractor.  Only, the contractor chose not to remain in line and doesn't receive key information about IT support.  The process skips the contractor and continues with the third application stepping out of line to find the appropriate employee who will notify the fifth application.

As the leader, at what point would you stop this best practice?  Farcical, isn't it?  It is not hard to see that the applications should simply pass the information to one another without manual inputs.  Yet, this is what's going on in hospitals.
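For contrast, here is a hedged sketch of the same hand-off done application-to-application.  The ingest function is a hypothetical stand-in for whatever interface each system actually exposes (an API, an interface engine feed, a shared database); the point is only that the record flows down the line with no one stepping out of it.

```python
# Minimal record for the demonstration; in practice this would be the
# full asset_record sketched earlier.
record = {"model_number": "1234", "serial_number": "45678"}

def ingest(system_name, record):
    # Hypothetical stand-in for each application's real import interface.
    print(f"{system_name} stored asset {record['serial_number']}")

systems = [
    "Biomedical Equipment",
    "Plant Systems",
    "IT Services",
    "Contractor's Database",  # may opt out, as the contractor did
    "Asset Ledger",
]
opted_out = {"Contractor's Database"}

for system in systems:
    if system in opted_out:
        continue  # skipped, yet the record still reaches the fifth system
    ingest(system, record)
```

No paper changes hands, nothing waits on a desk for weeks, and the opt-out costs the contractor alone.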

The image below shows what a hospital may have in the way of "enterprise" applications that store key capital asset information.  In many places, each of these has a different process that simply passes information to a user in the form of paper.  At best, someone runs a query.  The recipient takes that information and inputs the same data into another application, a ridiculous waste of time that increases the chance of error.

So, why do industry standards matter when internal processes have no methodology to propagate those data standards throughout the enterprise?  If capital equipment, or any item, shows up on the receiving dock with a standardized label or RFID tag, what are the chances that information will be propagated throughout the enterprise?  If it isn't propagated throughout the enterprise, what good is it?  Good only to the point of knowing what was ordered, so staff can run it down if there is a recall.  Every step that requires an unnecessary, redundant input increases the chance for error and omission.  That is fleeing from data standardization.
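Here is what propagation from the dock could look like: a hedged sketch in which the scan itself triggers the fan-out.  The pipe-delimited label payload is a simplified, hypothetical stand-in for a real standard such as a GS1 barcode, and the system names repeat the scenario above.

```python
def on_dock_scan(label):
    # Parse the simplified, hypothetical label payload.
    model_number, serial_number = label.split("|")
    record = {"model_number": model_number, "serial_number": serial_number}
    # The scan drives the propagation; no one re-keys the data downstream.
    for system in ["Biomedical Equipment", "Plant Systems",
                   "IT Services", "Asset Ledger"]:
        print(f"propagated {record} to {system}")

on_dock_scan("1234|45678")
```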

Throw in a healthcare information system in which equipment is ordered for patient treatment.  Does the standard label matter there?  How about a high-level decision support system where return on investment is important but one has absolutely no handle on expense?  There are consulting agencies that will tell you what should be spent.  Leadership can lay down the law to make it so without understanding that the application of the consultant's big data may not even be relevant to the hospital system.  Why does leveraging big data matter when internal processes have no methodology to propagate those data standards throughout the enterprise?

Here is the real catch.  What else can a leader do but make the best decision based on his or her own experience, staff experience, and the information available at that moment?  No one has faith in the internal data, and even less faith that data quality can be achieved in a timely manner and in a form meaningful enough for leadership to use.

So, what good is leveraging big data or developing industry standards when no methodology exists to propagate data standards throughout the enterprise?  Very little.  The all-knowing data gods help those who help themselves.

2 comments:

  1. It looks like the method that will be used to analyze the data is very important. Even if the employees are very good at analysis, they must be supported by excellent company infrastructure. It is the company's responsibility to provide the best software.
    Harry - Assetpoint.com

    Replies
    1. Thanks Harry. I appreciate your comments. I see a great application as part of the answer.
