Information is Different Now That You’re a Doctor


William Hersh, MD, Professor and Chair, OHSU
Blog: Informatics Professor
Twitter: @williamhersh

This blog posting is a reading assignment for Oregon Health & Science University (OHSU) medical students who will be attending a session I am leading in the Fundamentals block of their curriculum that introduces them to medicine and medical school. My goal for the session is to provide a high-level overview of the information-related issues and challenges they will deal with as physicians, and in the process to introduce them to the field of clinical informatics.

Even though many medical students and physicians do not acknowledge it, information has always been a major focus of clinical practice [1]. Physicians have always spent a great deal of time with information, as evidenced by studies that describe how physicians use their time. Even in the era before widespread use of computers in medical practice, physicians spent more of their time in “indirect” care of patients (50-67%), including reviewing results and performing documentation, than directly interacting with patients (15-38%) [2-7]. A more recent study of interns found that they spent nearly 40% of their time in front of a computer [8].

Likewise, physicians and medical students have been using health information technology (HIT) for decades. During this time, the role of HIT has changed dramatically from a useful tool for data access and occasional information retrieval to a ubiquitous presence that permeates healthcare and medical practice in myriad ways. Twenty-first century physicians face a much more information-intense world than their predecessors. The field that focuses on how information is acquired, stored, and used is called informatics, and when applied in medicine and other health-related disciplines is called biomedical and health informatics [9].

Why do physicians spend so much time dealing with information? One reason is that the quantity of biomedical knowledge continues to expand, with an attendant increase in the primary scientific literature, i.e., the 75 clinical trials and 11 systematic reviews published each day [10]. Secondary knowledge sources that summarize this information proliferate as well, not only for use by clinicians but also by patients and consumers. Medical knowledge is no longer the exclusive purview of physicians, as 80% of all Internet users have searched for personal health information [11].

Another major change in the use of HIT has been the rapid growth of electronic health record (EHR) adoption. As a result of the “meaningful use” financial incentives of the Health Information Technology for Economic and Clinical Health (HITECH) Act [12, 13], there has been widespread adoption and use of the EHR, growing to 97% in hospitals [14] and 78% in physician offices [15]. Being able to use data and information also means understanding that the EHR is more than “charting,” and that its value goes beyond being able to read it. Clinicians must be facile with all aspects of the EHR, able to move easily from one vendor system to another. They must also learn to take advantage of clinical decision support that aims to prevent errors in test ordering, prescribing, and other activities that improve diagnosis and treatment of patients [16].

There is also growing adoption of HIT by patients and consumers, who not only want to find information about their health and disease, but also desire to interact with the healthcare system the same way they interact with airlines, banks, and retailers, i.e., through digital means. Growing numbers of patients are participating in their care using technologies such as the personal health record (PHR) [17] and some are even accessing their own progress notes [18] and more [19].

In the meantime, those who purchase and pay for healthcare, along with patients, are demanding more accountability for the quality, safety, and cost of care [20, 21]. This has led to an expectation of measurement and reporting of healthcare quality as a routine part of participation in new care delivery mechanisms such as primary care medical homes and accountable care organizations [22]. Likewise, there is a growing application of data analytics to improving healthcare [23].

Patients may receive care in different places, sometimes by choice but other times by circumstances beyond their control, such as emergencies. Ideally data should “follow the patient” and move readily across organizational boundaries via health information exchange (HIE) [24]. At the same time, telemedicine and telehealth applications extend the reach of healthcare systems and clinicians in both rural and urban settings [25].

The growing quantity of clinical and administrative data in clinical information systems also affords an opportunity for advanced analysis that can enable better deployment of resources and coordination of care, facilitate personalized and precision medicine, and advance clinical and translational research [26]. Together, these advances are moving healthcare toward the global vision of the learning health system, where information systems are used to capture our practice, analyze what we might have done correctly or improperly, and guide our improvement [27, 28].

Further evidence for the importance of these developments comes from the recent establishment of the new medical subspecialty of clinical informatics [29]. Practicing physicians are now beginning to become board-certified in this new subspecialty, with the concomitant establishment of fellowship programs accredited by the Accreditation Council for Graduate Medical Education (ACGME). A growing number of physicians hold titles such as Chief Medical Informatics Officer (CMIO).

It should be abundantly clear that information becomes very different when one transitions to becoming a healthcare professional. There are professional and legal expectations that clinicians must acquire, analyze, and evaluate different facets of information to provide the best possible care to individual patients and entire populations. One critical concept is that informatics is not the same as computer literacy. Computer literacy is one of many requirements for using informatics successfully, but knowing how to use a computing device (PC, tablet, or smartphone) is not the same as having skills in informatics, i.e., using that device to improve health, healthcare delivery, public health, or research.

For a physician, information is different in many ways. Critical decisions about patient care are based not only on information in the EHR, but also on knowledge retrieved from the scientific literature, textbooks, Web sites, and other sources. Information must be accurate, up-to-date, and applied properly. Physicians must also be effective stewards of a patient’s record, both in terms of keeping it accurate and up-to-date, and also doing their utmost to make sure it is kept private and secure.

This post first appeared on The Informatics Professor, and the references can be found in the original post. Dr. Hersh is a frequent contributing expert to HITECH Answers.