The Truven Health Blog

The latest healthcare topics from a trusted, proven, and unbiased source.


Elements of a Big Data Strategy

By Truven Staff
A recent article in Healthcare Informatics discussed how the University of California Irvine Medical Center (UCI Medical Center) is applying big data technologies to reduce avoidable readmissions, enable new research projects, and track patient vital statistics in real time. At Truven Health, we see these technologies as enablers that help our customers realize their goals for improving care. The technology itself, however, is only one element of a multi-faceted program for realizing value from your data assets. The other elements include data governance, data operations, infrastructure, data management and applications, and analytics that engage your community.

A well-defined data governance program will allow your organization to realize value from the myriad, and potentially disconnected, data sets you are collecting. Master data management, defined business rules, and metadata standards all enable downstream analytics. Data quality is paramount in any sophisticated data intelligence program: it is critical to establish a foundation of trust in the data.
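To make the idea of defined business rules concrete, here is a minimal sketch of a data-quality gate of the kind a governance program might codify. The field names and rules are hypothetical illustrations, not part of any Truven Health product:

```python
# Illustrative only: a minimal data-quality gate expressing governance
# "business rules" as per-field validation checks. Field names are hypothetical.

RULES = {
    "member_id": lambda v: bool(v),            # identifier must be present
    "birth_date": lambda v: len(v) == 10,      # expects YYYY-MM-DD
    "gender": lambda v: v in {"M", "F", "U"},  # controlled vocabulary
}

def validate(record):
    """Return the list of rule violations for one record."""
    return [field for field, rule in RULES.items()
            if not rule(record.get(field, ""))]

records = [
    {"member_id": "A100", "birth_date": "1962-04-17", "gender": "F"},
    {"member_id": "", "birth_date": "1980-02-03", "gender": "X"},
]

violations = {r["member_id"] or "<missing>": validate(r) for r in records}
```

Records that pass such a gate earn the "foundation of trust" described above; records that fail are routed back to data stewards rather than silently flowing into analytics.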

The nuts and bolts of data operations may seem mundane, but when you’re responsible for petabytes of data, especially HIPAA-regulated healthcare data, you can’t afford to ignore the basics. Security and privacy requirements, systems management, data access, government regulations, encryption, and, where necessary, data obfuscation and de-identification are all daily operational necessities that need attention.
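As an illustration of the de-identification step, the sketch below drops direct identifiers and replaces the patient ID with a salted one-way hash so records remain linkable without exposing the original value. This is a toy example, not a complete HIPAA Safe Harbor implementation, and all field names are hypothetical:

```python
import hashlib

# Hypothetical site secret; in practice this would be managed outside source code.
SALT = b"site-specific-secret"
DIRECT_IDENTIFIERS = {"name", "ssn", "street_address", "phone"}

def pseudonym(patient_id: str) -> str:
    """Replace an identifier with a salted, truncated one-way hash."""
    return hashlib.sha256(SALT + patient_id.encode()).hexdigest()[:16]

def deidentify(record: dict) -> dict:
    """Drop direct identifiers and pseudonymize the linking key."""
    out = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    out["patient_id"] = pseudonym(record["patient_id"])
    return out

row = {"patient_id": "P123", "name": "Jane Doe",
       "ssn": "000-00-0000", "dx_code": "E11.9"}
clean = deidentify(row)
```

Because the hash is deterministic, the same patient maps to the same pseudonym across files, preserving the ability to join records downstream.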

A variety of infrastructure options are available in the market today. When choosing one, take into consideration the level of control you need, where your strengths lie, and where your focus needs to be. Options include public cloud offerings, “build your own” environments from the ground up, and hybrid models, such as managed service environments that provide full featured platform capabilities.

Data management is at the heart of any big data strategy. Hadoop has become the de facto “big data” software technology: an enterprise-grade solution for data processing that enables advanced analytics to be applied to even the largest data assets. There are a number of available distributions of this open-source software platform; Truven Health has implemented the distribution from Hortonworks. Truven Health Unify™, built on Hadoop, brings clinical and administrative data together, makes it uniform, and then applies advanced analytical methodologies.
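The unification step can be pictured with a toy example: records from a clinical feed and a claims feed are normalized to a common schema and grouped under one patient key before analytics run. This is an illustration of the general pattern only, not Truven Health Unify’s actual methodology, and all record shapes are invented:

```python
from collections import defaultdict

# Two hypothetical source feeds with different native schemas.
clinical = [{"mrn": "P1", "code": "I10", "source": "emr"}]
claims = [{"member": "P1", "dx": "I10", "paid": 120.00},
          {"member": "P2", "dx": "E11.9", "paid": 80.00}]

def normalize_clinical(r):
    """Map an EMR record onto the common schema."""
    return {"patient": r["mrn"], "dx": r["code"], "origin": "clinical"}

def normalize_claim(r):
    """Map an administrative claim onto the common schema."""
    return {"patient": r["member"], "dx": r["dx"],
            "origin": "claim", "paid": r["paid"]}

# Group normalized records under a single patient key.
unified = defaultdict(list)
for rec in map(normalize_clinical, clinical):
    unified[rec["patient"]].append(rec)
for rec in map(normalize_claim, claims):
    unified[rec["patient"]].append(rec)
```

At Hadoop scale the same normalize-then-group shape runs as a distributed job, but the logical structure is unchanged.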

Software solutions then enable your organization to realize value from this integrated and analytically sound data asset. User-centric solutions, such as data visualization and workflow applications, engage your community of practitioners, providers, payers, and patients to change behavior and adopt new practices. By establishing a data asset as a trustworthy foundation, objective views of performance and practices can be developed and a data-driven agenda can be pursued.

Brian O’Sullivan
VP, Technology Strategy and Enterprise Architecture

What Data Is Needed to Run an At-Risk Organization?

By Truven Staff
The recent article, “Seven Changes the Affordable Care Act Will Likely Encourage in the Medical System,” discusses several new approaches to healthcare that rely on effective management of new data sources and data streams. The ripple effects of the Affordable Care Act will take on many dimensions, ranging from the operational workflow of a health network, to the revenue cycle of those entities going at risk, to the relationship between patients and providers, to the way providers prioritize and spend their time with patients. Given this level of dramatic change, Truven Health Analytics is forming development partnerships with select customers to focus on the flow of information required to run an at-risk organization, and on the types of analytics and decision support that various roles throughout a health network will need.

At the center of these activities is the fundamental requirement to establish a single patient record that accumulates knowledge of the patient through each and every encounter. Furthermore, the data collected needs to be organized and acted upon within specific temporal requirements. There is data used for an initial encounter, data for diagnosing, data for monitoring treatment effectiveness, and data for determining overall quality and effectiveness over time. Each requirement has specific conditions and, potentially, limitations, based upon how robust the single patient record is. For example, encounters with new patients, where no background information exists, will be treated differently than encounters backed by a rich history of patient information. Likewise, encounters with healthy patients may provide the opportunity to collect new insights into behavioral measures that can be used to keep them healthy, whereas patients with chronic conditions will likely require insights related to improving compliance with care guidelines.
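The single-patient-record idea can be sketched as a structure that appends each encounter in time order, so downstream logic can distinguish a first visit (no history) from one backed by a rich record. The class and field names here are hypothetical, chosen only to illustrate the accumulation pattern:

```python
from datetime import date

class PatientRecord:
    """Toy single patient record that accumulates encounters over time."""

    def __init__(self, patient_id):
        self.patient_id = patient_id
        self.encounters = []

    def add_encounter(self, when: date, kind: str, findings: dict):
        self.encounters.append({"date": when, "kind": kind,
                                "findings": findings})
        self.encounters.sort(key=lambda e: e["date"])  # keep time order

    def is_new_patient(self) -> bool:
        return len(self.encounters) == 0

    def history_before(self, when: date):
        """Everything known about the patient prior to a given encounter."""
        return [e for e in self.encounters if e["date"] < when]

rec = PatientRecord("P1")
first_visit = rec.is_new_patient()  # no history yet: treat as a new patient
rec.add_encounter(date(2013, 1, 5), "initial", {"bp": "140/90"})
rec.add_encounter(date(2013, 6, 2), "follow-up", {"bp": "128/82"})
prior = rec.history_before(date(2013, 6, 2))  # history available at follow-up
```

The `history_before` query captures the temporal requirement described above: what the clinician can act on at any encounter is exactly what the record had accumulated by that date.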

In many respects, we may see a future where each encounter has both a patient care component and an information care component. In fact, patient care and its required workflow are intimately connected to the information gleaned from diagnosis and to the eligibility, payment, and risk requirements the encounter triggers.

Larry Yuhasz
Director for Strategy and Business Development

Population Health Analytics: The Devil Is Truly In the Details

By Truven Staff
“Population Health” is an oft-discussed topic, but its definition varies depending on the vantage point of the presenter. Likewise, “Population Health Analytics” attempts to measure and improve an array of risk-bearing, clinically integrated activities, ranging from aggregate risk analysis to predictive interventions at the point of care.

Regardless of your particular turf, some common challenges lurk behind the application of analytics to these business challenges. The roadblocks stem from the fundamental fact that the data sources on which you depend for decision-making were not captured with cross-encounter analytics in mind. Source IT systems such as EMRs, billing systems, and electronic prescribing solutions were constructed to accomplish transactional goals for siloed provider organizations, not to support improved outcomes and cost control across the patient care continuum.

We’ve identified three areas of focus to help you avoid pitfalls:
  • Anticipate information-sharing challenges: Technical integration of data isn’t the hard part. The tough stuff is setting the trust conditions for authentic multi-stakeholder data sharing and governance.
  • Navigate the context of data creation: Operational processes obscure analytic classification of data, terminology standards are variable, and information arrives at different periodicities. Amidst this noise, reliable prediction, reporting, and alerting all require an “analytically-aware” implementation of data streams and measures.
  • Start with analytics you can take action on: Massive projects get everyone excited, but a moon launch isn’t necessarily your first step. Work backwards from where you have operational capacity to make improvements (basic quality measures across the continuum of care? risk and disease prevalence? alerting and interventions?) and focus your attention on a set of trusted measures to get you there.
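As an example of an actionable starting point, here is one common simplified formulation of a 30-day readmission rate computed from admission/discharge records. Real measure specifications include exclusions and risk adjustment; the data shape below is hypothetical:

```python
from datetime import date

# (patient, admit_date, discharge_date) for each inpatient stay
admissions = [
    ("P1", date(2013, 3, 1), date(2013, 3, 5)),
    ("P1", date(2013, 3, 20), date(2013, 3, 22)),  # readmitted within 30 days
    ("P2", date(2013, 4, 1), date(2013, 4, 3)),
]

def readmission_rate(stays, window_days=30):
    """Share of discharges followed by a same-patient readmission
    within window_days (simplified: no exclusions or risk adjustment)."""
    stays = sorted(stays, key=lambda s: (s[0], s[1]))
    readmits = 0
    for prev, nxt in zip(stays, stays[1:]):
        same_patient = prev[0] == nxt[0]
        if same_patient and (nxt[1] - prev[2]).days <= window_days:
            readmits += 1
    return readmits / len(stays) if stays else 0.0

rate = readmission_rate(admissions)  # one readmission over three discharges
```

A trusted measure like this, reviewed with clinical stakeholders before any modeling begins, is the kind of first step the bullet above argues for.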
Watch our four-part video series on population health analytics.

Two Minutes on Population Health Analytics
Two Minutes on The Importance of Trust in the Data
Two Minutes on Data Necessary for Population Health Analytics
Two Minutes on the Barriers to Integrating Population Health Data

Grant Hoffman
VP, Clinical Integration