Every 50 years or so, a revolution in healthcare grows out of the trends of the era. In the 1870s, healthcare was transformed by the germ theory of disease and the rise of public health efforts. In the 1920s, the discovery of penicillin propelled forward the use of medication as treatment for disease. In the 1970s, the randomized controlled trial (RCT) ushered in an era of evidence-based medicine. As we approach the 2020s, the trend toward big data, analytic tools and the systemization of care will revolutionize the way hospitals and physicians work and, most importantly, the way patients are treated.
Big data refers to collections of information so large and complex that they become difficult to process with conventional database management tools. At issue is how to access, distribute and utilize this vast amount of “unstructured” data. When the massive amounts of clinical content sitting in electronic health records (EHRs) go unused, the consequences for patients, clinicians and hospitals can be rising mortality rates and out-of-control medical costs.
Let’s consider the current vanguard of data-driven healthcare in hospitals. At the best institutions, doctors and nurses go room to room each day to mark down which patients meet which quality metrics and whether those metrics have been addressed. The result is a manually entered, cumbersome flow chart that can, at best, cover a handful of the hundreds of known quality measures, using only limited data to do so. With a condition like deep-vein thrombosis, for example, hospital staff relies on manual calculations to assess a patient’s risk, and if that risk is not assessed and treated properly, mortality rises. The real tragedy is that the information needed to properly assess the patient’s risk and determine treatment is already available in the clinician’s notes, but without the proper tools the knowledge remains inaccessible and hence, unused.
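The manual risk calculations described above are exactly the kind of rote arithmetic software can take over. As a minimal sketch, here is the published Padua Prediction Score for venous-thromboembolism risk expressed in a few lines of Python; the field names are hypothetical, and a real clinical decision-support system would pull these flags from the EHR rather than a hand-built dictionary. This is an illustration, not clinical software.

```python
# Illustrative sketch: automating a bedside risk calculation.
# Weights follow the commonly published Padua Prediction Score for
# venous thromboembolism (VTE) risk in hospitalized patients.
# Field names are hypothetical placeholders for EHR-derived flags.

PADUA_WEIGHTS = {
    "active_cancer": 3,
    "previous_vte": 3,
    "reduced_mobility": 3,
    "known_thrombophilia": 3,
    "recent_trauma_or_surgery": 2,
    "age_70_or_older": 1,
    "heart_or_respiratory_failure": 1,
    "acute_mi_or_ischemic_stroke": 1,
    "acute_infection_or_rheumatologic_disorder": 1,
    "obesity_bmi_30_plus": 1,
    "hormonal_treatment": 1,
}

def padua_score(findings):
    """Sum the weights of the risk factors flagged true for a patient."""
    return sum(w for factor, w in PADUA_WEIGHTS.items() if findings.get(factor))

def vte_risk_category(score):
    """A Padua score of 4 or more is conventionally considered high risk."""
    return "high" if score >= 4 else "low"

# Example: a patient with a prior VTE who is over 70.
patient = {"previous_vte": True, "age_70_or_older": True}
score = padua_score(patient)
print(score, vte_risk_category(score))  # 4 high
```

The point is not the arithmetic, which is trivial, but the plumbing: once the risk-factor flags can be extracted automatically from notes and structured EHR fields, every patient can be scored on every shift instead of only when a clinician remembers to run the numbers by hand.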