Healthcare Analytics

Rx for Big Data: Use It Surgically

Getting Patient Records to Point to Better Care

Data Only Matters If Operations Improve, for Patients and Hospitals

The amount of healthcare data on record is exploding, from 500 petabytes today to a projected 25,000 petabytes by the end of the decade. Harnessed and analyzed well, however, that flood of patient data opens the door to more cost-efficient and useful care. Mount Sinai Hospital is working this sea of information to improve care and cut costs.

Just off Central Park in New York, the 160-year-old Mount Sinai Hospital is digging surgically into huge mounds of patient data to improve care and streamline its operations. So far, the smart data initiative has cut readmission rates for Medicare patients by 56 percent, improved doctors’ ability to prescribe appropriate care, such as making sure patients aren’t taking drugs that conflict with their genetic background, and saved more than $20 million annually along the way.

What’s at Stake:

Potential savings of up to $450 billion of the nation’s $2.7 trillion in annual health spending. Improved patient care. More efficient hospitals.

Every night, the Mount Sinai Data Warehouse collects clinical, operational, and financial data from patient care rendered at Mount Sinai Hospital and by Mount Sinai Faculty Practice Associates. The warehouse has done this since 2003, pulling in information on more than three million patients from 20 different transactional systems.
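
To make the architecture concrete, here is a minimal sketch of what a nightly consolidation job of this kind could look like, in Python. The source-system names, schema, and helper code are hypothetical stand-ins, not Mount Sinai’s actual pipeline.

```python
# Illustrative nightly warehouse load. The source systems, schema, and
# record format below are invented stand-ins, not Mount Sinai's pipeline.
import sqlite3
from datetime import date

SOURCE_SYSTEMS = ["clinical_emr", "billing", "scheduling"]  # stand-ins for ~20 feeds

def nightly_load(warehouse_path="warehouse.db"):
    wh = sqlite3.connect(warehouse_path)
    wh.execute("""CREATE TABLE IF NOT EXISTS encounters
                  (patient_id TEXT, source TEXT, load_date TEXT, payload TEXT)""")
    for system in SOURCE_SYSTEMS:
        # A real pipeline would have one connector per transactional system;
        # here we fabricate a few records per source for one day's run.
        records = [(f"p{i}", system, date.today().isoformat(), "{}")
                   for i in range(3)]
        wh.executemany("INSERT INTO encounters VALUES (?, ?, ?, ?)", records)
    wh.commit()
    wh.close()

if __name__ == "__main__":
    nightly_load()
```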

A supercomputer called Minerva, at the Icahn School of Medicine at Mount Sinai, combs through 1.5 petabytes of data to find answers, juggling 30 terabytes of information in its main memory.

Last year, Mount Sinai installed a $120 million system to consolidate its patients’ medical records. This year, it added a new computing cluster, which joins a $3 million supercomputer nearby. All told, the hospital says it has committed more than $50 million to its data infrastructure, including a new 2,200-square-foot facility and another 70 teraflops of peak compute power.

And the effort of 150 scientists to make smart data out of big data is being led by Jeff Hammerbacher, best known as Facebook’s first data scientist.

The effort, so far, is paying off.

With its new electronic medical records system, Mount Sinai was able to reduce 30-day readmissions by 56 percent, evaluating both the physical and chemical factors that drive patients to come back and adjusting care accordingly. The hospital also saved more than $20 million in 2012, due in part to reduced costs in supplies and management.

“We can measure everything that’s going on now,” says Dr. Joel Dudley, director of biomedical informatics at the Icahn School of Medicine at Mount Sinai. “We can embrace the complexity of human biology and disease, look at millions or billions of data points. How do we leverage all the tools we have available to build a 10-thousand-foot view of human disease?”

That means looking not only at patient health records but also at the complicated overlay of biological and physiological data, including information on the multiple genes, proteins, and other factors that could be important in diagnosing and understanding disease. Dudley’s work on diabetes takes a holistic look at the different variables, such as exposure to a viral illness, that can affect different parts of a population. That synthesis can then be used to help future patients.

Though not every hospital has Mount Sinai’s research resources or approach (it is, after all, a medical school and research center on top of being a working hospital), the benefits of harnessing big data and analytics for better health care could be enormous.

According to research by McKinsey & Company, big data could help save Americans anywhere from $300 billion to $450 billion each year, and McKinsey calls those numbers “conservative” in light of the potential innovation ahead. Mount Sinai isn’t the only healthcare organization to see positive outcomes from smart uses of the exploding amounts of data it can store.

At Kaiser Permanente, a computer system called HealthConnect, which makes data available across dispersed medical facilities, has helped save lives: in one study, Kaiser Permanente reduced deaths related to coronary artery disease by 76 percent. Another recent study out of Indiana University found that using predictive modeling to forecast the outcomes of different treatments could reduce healthcare costs by close to 60 percent while improving patient outcomes by over 40 percent.

According to a survey of hospital and healthcare CIOs conducted by eHealth Initiative, almost 80 percent of respondents believe that big data and analytics are important to achieving their goals. But 84 percent also said that implementing the necessary tools poses a “significant challenge.”

The challenge is not collecting the petabytes of history on a patient’s symptoms, illnesses, broken bones, surgeries and the like. Rather, it is using that history to help create a diagnosis, not just for that patient but for patients with similar conditions at similar points in their lives, and then putting in place systems and procedures that doctors and other hospital staff can actually use in the course of their work.

“Silicon Valley tech people are trying to ‘disrupt’ health care,” says Dudley. “One big mistake is not appreciating what the day-to-day life of a doctor is like.”

The trick, in other words, is to make the technology less, rather than more, disruptive:

  • Accommodate hospital workflow. “You want to integrate as much as possible into the current clinical decision support workflow,” says Dudley. “Doctors aren’t going to just use an iPad app. There’s no way in heck they’re going to just learn new systems and how to switch between them.” So Mount Sinai worked out a system that made sense for the way doctors work, day to day. “Once you get the modeling right, make sure you implement it in a way that embeds into the workflow of the organization,” says Anita Krishnan, PhD, a principal for strategic planning at the healthcare consulting firm The Advisory Board Company. “Otherwise it’s a futile effort.” Krishnan recommends creating a “centralized data governance structure,” or analytics committee, that can organize operations around implementing data and analytics plans. “You don’t want to go after too much at one time,” she says. “It depends on the needs of the organization.”

  • Test it out to see what works. While implementing its new EMR system, Mount Sinai used actors and mannequins as part of a comprehensive change management plan to train hospital workers in a variety of potential scenarios before the system launched. “The right partner will work with [hospitals] as the tech is implemented to identify in great detail what the existing care processes are and how they will modify them,” says Krishnan. “Carving out a pilot project or a smaller phase to start with can identify how it’s going to manifest in the existing workflow. In our experience, this has been a tremendous component in hitting the ground running.” During its change management program, Mount Sinai discovered that the system in the operating rooms was difficult for users to understand, and called in support staff to create step-by-step guides.

  • Let someone else store the data – and set up the analytics. Not every hospital can afford to operate like Mount Sinai, which, as an academic hospital, is able to do things such as hire scores of data scientists. “One of the biggest challenges is the ability to afford new technology and new staffing,” says Cynthia Burghard, a research director with IDC Health Insights. “Hospitals are still caught up in implementations of electronic health records. There’s not a lot of investment in the raw technology stack.” Instead, Burghard says that for most hospitals, the best option for a tech upgrade is to turn to vendors such as Explorys or MedeAnalytics, which provide big data and analytics capabilities that can be used with existing infrastructure. “What hospitals care about is that they get access to new sources of data,” Burghard says. “It seems to be a much more digestible approach.”

  • Utilize existing infrastructure. “One big barrier is the current electronic medical records systems,” says Dudley. “A lot of them are not open in terms of the ability to interface with them. We need new types of software that could integrate with the EMR systems to bring more big data to the point of care.” At Mount Sinai, a system called CLIPMERGE (Clinical Implementation of Personalized Medicine Through Electronic Health Records and Genomics) was developed to plug into the backend of the hospital’s Epic EMR system. “It acts as a big data hub into the EMR system,” says Dudley. “It provides a way to monitor patient data and fire off messages to the clinician at point of care.” For example, if a doctor prescribes a medication and genetic records in the CLIPMERGE database show that the patient should not be on that drug because of their genetic profile, the physician can be apprised of that information as the decision is being made. (A toy sketch of this kind of drug-gene check follows this list.)

  • Don’t try to “fish” for data. Try to be surgical in what you pick out of the mountain of data. “A lot of people will equate data mining with a fishing expedition,” says Dudley. “People think you can just find all sorts of correlations. What these data analysis tools do is expand the scope of the question.” If you have a question about the subtypes of type 2 diabetes, and the genes that define them, for example, Dudley notes that in the past you might have tried to look specifically at lipid regulation to find an answer. “What we can do now is look at every variable that can be measured in the population and understand the different types of subgroups that might exist, instead of restricting analysis to one gene.” (A toy subgroup-analysis sketch also follows the list.)
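
Mount Sinai hasn’t published CLIPMERGE’s internals here, but a minimal sketch of the kind of point-of-care check Dudley describes might look like the following Python. The marker-drug table and function are invented for illustration and are not clinical guidance.

```python
# Toy point-of-care drug-gene check, in the spirit of the CLIPMERGE hub
# Dudley describes. The table below is illustrative, NOT clinical guidance.
CONTRAINDICATIONS = {
    ("CYP2C19*2", "clopidogrel"): "Poor metabolizer: reduced drug activation.",
    ("HLA-B*57:01", "abacavir"): "High risk of hypersensitivity reaction.",
}

def check_prescription(patient_markers, drug):
    """Return alert messages if the patient's genetic markers flag this drug."""
    return [
        msg
        for (marker, flagged_drug), msg in CONTRAINDICATIONS.items()
        if flagged_drug == drug and marker in patient_markers
    ]

# Fired at the moment the physician enters the order:
for alert in check_prescription({"CYP2C19*2", "BRCA1"}, "clopidogrel"):
    print("ALERT:", alert)  # would surface in the EMR ordering screen
```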
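
And as a toy version of the “every variable” subgroup analysis Dudley describes, one might cluster patients across many measured features at once. The data here is synthetic, and k-means is just one stand-in method, not necessarily what Mount Sinai uses.

```python
# Toy subgroup discovery: cluster patients on many variables at once rather
# than testing a single gene. Synthetic data; k-means is a stand-in method.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
patients = rng.normal(size=(300, 50))  # 300 patients x 50 measured variables

features = StandardScaler().fit_transform(patients)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(features)

for k in range(3):
    print(f"subgroup {k}: {(labels == k).sum()} patients")
```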

“We’re working towards a learning healthcare system,” says Dudley. “We collect huge amounts of information on patients. Everything we know about these patients is being captured but not learned from. Why can’t we take all the data we’ve ever seen and learn from the evidence what works for patients and what doesn’t?”

Imagine, for example, if every patient who walked into a hospital could take a blood test and have the results compared against a database of the blood test results, treatments received, and recovery patterns of every previous patient who has come through that hospital.

“Of all the people we’ve seen, which ones do you look most like? What happened to those people? How did they react and how might you react?” says Dudley. “Can we make sure our healthcare system is smart enough to know which patterns are unique to our patient population?”
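
A minimal sketch of that “which patients do you look most like” idea, assuming blood test results can be encoded as numeric vectors: it amounts to a nearest-neighbor lookup, shown here with synthetic data rather than any real patient records.

```python
# "Which patients do you look most like?" as a nearest-neighbor lookup over
# blood-test vectors. All data here is synthetic and purely illustrative.
import numpy as np
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(1)
past_patients = rng.normal(size=(10_000, 12))  # 12 blood-test values each
outcomes = rng.choice(["recovered", "readmitted"], size=10_000)

index = NearestNeighbors(n_neighbors=5).fit(past_patients)

new_patient = rng.normal(size=(1, 12))  # today's blood test
_, neighbor_ids = index.kneighbors(new_patient)

# What happened to the five most similar past patients?
print(outcomes[neighbor_ids[0]])
```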

The roadblock is not the data. It’s knowing what to do with it.

“By virtue of many different forces, we have a tremendous amount of digital data available today that enables us to do really powerful things,” says Krishnan. “We have a ton of data, and all of it could be relevant; how do we figure it out?”

According to Krishnan, though many people have turned to machine learning programs designed to sift through data automatically, when it comes to health, computers may not always have the best judgment.

“Using astute clinical judgment in defining which variables are important is critical before you apply advanced modeling techniques,” Krishnan says, pointing especially to behavioral components, such as whether a patient is likely to follow post-visit instructions for care. “You could lose the nuance otherwise. We’re trying to get insights into patient characteristics, clinical interventions, readmissions. You want to be able to home in on the variables that enable us to point the finger at what we can do to make a difference.”
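
In code, Krishnan’s advice might amount to something as simple as letting clinicians, not an algorithm, choose the candidate variables before a model is fit. The column names, data, and choice of logistic regression below are all invented for illustration.

```python
# Clinician-guided variable selection before modeling, per Krishnan's advice.
# Columns and data are invented; logistic regression is just one model choice.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n = 1_000
df = pd.DataFrame({
    "age": rng.integers(20, 90, n),
    "prior_admissions": rng.poisson(1.0, n),
    "followed_discharge_instructions": rng.integers(0, 2, n),  # behavioral signal
    "favorite_color_code": rng.integers(0, 10, n),             # clinically dubious
    "readmitted_30d": rng.integers(0, 2, n),
})

# Clinicians, not the algorithm, decide which variables plausibly matter:
CLINICIAN_APPROVED = ["age", "prior_admissions", "followed_discharge_instructions"]

model = LogisticRegression().fit(df[CLINICIAN_APPROVED], df["readmitted_30d"])
print(dict(zip(CLINICIAN_APPROVED, model.coef_[0].round(3))))
```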