April 5th, 2014
Paul Bergl, M.D.
In my transition from pure learner (i.e., the med student role) to teacher-learner (i.e., the attending), I’ve actually found myself focusing more on the learner than the teacher part of my dual existence. Strong learning seems to be requisite to strong teaching, and I am realizing that succeeding on the next level requires some extra meta-cognition, that is, learning to learn in new ways.
Learning to Unlearn
In med school, learners amass an incredible amount of new information and master a completely new language. Suffice it to say that “drinking from the firehose” probably understates the reality of undergraduate medical education.
Our schools inculcate lots of so-called “facts” into our students’ fresh minds, and said students suck up these facts like infinitely absorptive sponges. Sure, students often purge the data after cramming for tests, but they inevitably reclaim much of this knowledge over the next several years. And thus, students graduate medical school with their minds encumbered by extraordinary amounts of information to apply to patient care.
This approach unfortunately is faulty in two respects: Memory is imperfect, and facts are mutable. I have been caught on rounds reciting “facts I learned in medical school” only to have my team discover that almost no reputable sources can corroborate my claims — or even worse, that a reputable source completely refutes them. I cannot always pin down the etiology of my misinformation. I usually blame time’s effect on the faulty memory compartment. More importantly, I make a mental note to condemn that parcel of my brain and vacate it for future use.
My advice: Actively seek out the misinformation in your brain, and purge it. Identify what you thought you had learned but turns out not to be true.
Learning to Get Answers Without the Certainty of an Answer Key
As learners progress through their undergraduate and graduate training, they move from the black-and-white world of correct answers to a landscape of gray zones devoid of an answer key. Students often live and die by “what’s going to be on the test.” Even residents living in the oft-ambiguous world of clinical medicine have some anchor of certainty: the attending’s final word. No matter what shade of correct or incorrect a clinical decision is, the resident can often fall back on what the attending will want.
When there is no longer a judge of correctness — be it the professor, the course director, or the attending — it can be quite unnerving. In this situation, the teacher-learner should remember that the “facts” you’ve learned are never truths. They are half-truths with varying degrees of evidence that can be variably applied to actual clinical scenarios.
My advice: When faced with situations where a correct answer cannot be known, gather all the information you can to make an informed decision. Remember that the teacher-learner becomes a de facto answer key, but be ready to adjust the grading rubric too.
Learning to Learn Critically
The learner role affords a certain luxury to students and residents: leaving all the critical thinking to the experts. For example, on your cardiology rotation, you must recite gospel verses like, “Give an ACE inhibitor and β-blocker to everyone with reduced ejection fraction.” But what about that latest study on angiotensin receptor neprilysin inhibitors? Well, you get to leave the interpretation and its application to real-life settings to the renowned cardiology attending.
I am not saying that residents aren’t expected to think critically. But they often don’t have the time to learn critically: to analyze the latest developments and consider how to integrate the evidence into practice. Instead, students and residents defer or default to the experts (and integrate this information as “facts” into their brains; see sections above).
Now imagine what happens when no expert is present. The teacher-learner needs to be prepared to face situations in which he or she might be the one distilling very complicated data into spoon-fed “pearls.” The teacher-learner also needs to decide how much stock to put into his or her own truths.
My advice: Imagine how you would apply newly acquired knowledge to your patients before actually doing so. Someday you will not have an expert to lead the way, and you never know when your trainees might look to you for guidance.
February 11th, 2014
Paul Bergl, M.D.
Recently, our residency program had the good fortune of hosting Dr. Bob Wachter as a visiting speaker. Dr. Wachter is considered a pioneer in the hospitalist movement and has built his career around inpatient quality and safety. During lunch with Dr. Wachter, some of our residents, and hospitalist faculty, we discussed the topic of resident autonomy in the hospital. In the glory days of residency, I imagine that house officers experienced autonomy in its truest sense of the word: self-rule and utter independence. At least that’s the impression I have from Stephen Bergman’s (a.k.a. Samuel Shem’s) House of God — a Lord of the Flies–like island inhabited by unsupervised and uninhibited junior physicians.
We all know that the 21st century’s inpatient environment leaves less room for such resident independence — and shenanigans, for that matter. Through regulatory and advisory bodies, patient advocacy groups, and our own recognition, we are now rightly focusing on other domains of hospital care. The main priority is not to “just let the doctors take care of patients how they want.” With this sea change, resident autonomy has evolved accordingly — both in practice and as a concept.
In case you haven’t heard, the quality and safety era is here to stay. Because true autonomy and “learning by doing” can potentially stand squarely at odds with quality metrics and the safest possible outcomes, I have to wonder:
- Will resident autonomy disappear completely in the future?
- How has autonomy changed already?
- And how will trainees learn to practice independently with all of this change?
Anyone who works in graduate medical education knows that duty-hour reform fundamentally shook the resident learning experience and thus affected autonomy. Since the additional 2011 duty-hour changes were enacted, teamwork has become the name of the game. Residents routinely are forced to pass off decisions or depart before the implications of their decisions materialize. Attending physicians now seemingly shoulder more of the clinical workload, too, when residents’ shifts are truncated by a requirement to leave the hospital.
There is also probably universal agreement that the imperatives to reduce hospital length of stay and to facilitate safe discharges affect resident independence. As an attending, I know the pressures that the hospital is under, and I often feel compelled to be very directive about making prompt discharge a reality.
I doubt that residents would feel that either of these changes has eliminated their autonomy completely. Often in medicine, there is no single correct way to achieve an end. For this reason, residents can still safely be given leeway in many clinical decisions. There is still some art in what we do, and autonomy lives to see another day. But there are looming changes on the horizon that might threaten resident autonomy even more. Those who spend time in the hospital training environment recognize that all participants in inpatient care will increasingly be measured by how well they do their jobs. It is hard to conceive of a system in which residents and the attendings who supervise their care will be spared from aggressive quality improvement.
The stakes are simply too high nowadays. Health care is expensive and still unacceptably unsafe and unhelpful. With mounting pressures to provide higher-value care, residents’ decisions are likely to undergo more scrutiny. Why is Resident A ordering more CT scans than Resident B? Why are Resident C’s patients staying in the hospital 2 days longer than Resident D’s?
In my own experience as a chief resident, I know that residents still want autonomy. Heck, autonomy was my top priority in evaluating residency programs myself. As I have interviewed and met a number of applicants to our program during this residency match season, I sense that soon-to-be trainees are also putting autonomy high on their list of values. I like to tout my program’s emphasis on autonomy. I try to foster resident growth while attending on service by relaxing the reins. I know the term autonomy doesn’t mean what it once did. Yet I do have hope that we can still give residents the leeway to “learn by doing” while preserving the health of the patients we serve and the financial stability of our healthcare system and our country.
January 13th, 2014
Akhil Narang, M.D.
During my year as a Chief Resident, I have the privilege of attending on the general medicine service for 8 weeks. I recently completed 4 weeks and, as expected, found myself in an entirely new realm of patient care and accountability. I would be remiss not to recall a few of the pivotal lessons and poignant moments that stand out.
Transitioning from resident to attending inevitably results in greater scrutiny. Despite my best efforts to prevent readmissions (especially within 30 days), I had several during my first month. A cirrhotic patient with a recurrent variceal bleed, a patient with sickle cell disease readmitted for a vaso-occlusive crisis after a sharp overnight temperature drop, and an older nursing home resident treated for a UTI who came back for seizures. Given the mounting pressure to prevent readmissions, I spent numerous hours dissecting the chart for each patient attempting to understand what went wrong and what I could have done differently. I discussed the cases with my co-Chiefs and several senior attendings. The consensus was that, in many cases, readmissions will happen. This was obvious to me as a resident but now, as an attending (especially an attending on service for the first time), I felt I had done something wrong. The increased scrutiny, coupled with a heightened sense of self-reflection, led me to forget what I learned over my years of residency — sick patients tend to get readmitted.
The ideal teaching service affords everyone the opportunity to teach. At the helm of a large team (one resident, two interns, two students, and a pharmacist), I did my best to demonstrate ultrasound IVC measurement in a hypotensive patient with heart failure before giving fluids, to point out Quincke’s sign in aortic regurgitation, and to review sodium homeostasis in a patient with hypernatremia. For the first few weeks, I was so concerned about being a good teacher that I neglected to be a student. Then our team admitted a patient struggling to breathe with newly diagnosed interstitial lung disease, and my student gave a brilliant, unprompted presentation on the etiologies of ILD on rounds. Only then did I remember that my students, interns, and residents all know things I don’t. Giving them the opportunity to teach me is vital and surely won’t be forgotten.
As a resident, I took pride in efficiency. Suspicious lung mass on a smoker’s chest x-ray? No problem — I could coordinate the CT scan, bronchoscopy, and pulmonary function tests the same day. My seniors hammered into me that disposition is the goal. The longer the work-up takes, the longer the patient stays in the hospital. But when fixating on the total patient census, it’s easy to neglect practicing good internal medicine. As an attending, while I respect the differences between work-ups of inpatient and outpatient problems, I also realize it’s okay to adjust asthma medications, initiate treatment for GERD, or talk about depression in a patient awaiting placement for a hip fracture.
Attending on the general medicine wards has been one of the most rewarding, fun, and challenging experiences of my short academic medical career. I’ve learned too many lessons to enumerate, but perhaps the most important of all is to not lose focus on the foundation I built during residency.
January 8th, 2014
Paul Bergl, M.D.
At first glance, no diagnosis seems more terrible than cancer. Although it remains a huge killer in the developed world, cancer has also taken on new meanings in modern medicine. As an ordinary person, I certainly fear the word and would dread the diagnosis. Cancer. It has such a damning and unforgiving ring to it. After 3 years of residency in a tertiary referral center, where I’ve seen some of the worst cases conceivable, I still cannot imagine the painful and devastating odyssey that those who succumb to it must endure.
As a recently minted physician, though, I fear cancer for other reasons. The science of the field is moving at a blistering pace. How can I keep up on the state-of-the-art treatments, genomic-based diagnostic tools, and molecular therapies? (When I talk about modern cancer care, I often wonder if I’m even talking about things that really exist.)
The care of cancer patients discourages this generalist, because it has become exceedingly complicated. How do I craft my words to distinguish “cancer” from “pre-cancer”? What advice do I give to a patient with recent biopsy-proven, localized prostate cancer? Will I be sued for negligence if I didn’t offer chemoprophylaxis for breast cancer to a patient who develops metastatic disease on my watch? How can I watch expensive third-line chemotherapy being given to one of my patients while another patient eats his way to a cancer-causing BMI of 40 on a low-cost, high-carb diet?
Given these questions, I thought I would begin 2014 with a reflection on what cancer means to the general practitioner.
Cancer as Preventable Disease
Despite all the advances we have made in diagnosing and treating cancer, we still face awesome opportunities to curtail cancer before it even starts. During the past several decades, we have clearly made strides in preventing cancer, particularly in the realm of curtailing tobacco use. (Then again, tobacco use rates aren’t really all that different than they were 10 years ago.) And, all the while, our nation is growing increasingly obese — so much so, that obesity threatens to overtake tobacco as the major preventable cause of cancer.
Given these trends, I sense that progress toward preventing cancer has stalled. I also wonder if enough clinicians are even considering the fact that cancer is preventable at all. When I give the lifestyle pep talk in clinic, I am usually warning patients about risks for developing cardiovascular disease or diabetes, not cancer. I also feel somewhat powerless to affect a patient’s ability to avoid cancer through lifestyle interventions.
These days, we need continued dedication to training physicians to coach patients about lifestyle improvements. We also must bridge the divide between medical providers and our public health leaders and find more creative solutions than exploding cigarette taxes or rehashing ideas about food deserts, fat taxes, and junk food advertisements.
Besides preventing cancer by recommending lifestyle adjustments, the generalist must also augment his use of chemoprophylaxis when indicated. For example, even though the USPSTF reaffirmed its grade B rating for chemoprevention of breast cancer in high-risk individuals in 2013, most of us don’t adhere to these guidelines very stringently (NEJM JW Womens Health Apr 8 2010), especially compared with our adherence to other grade B recommendations, like mammography. We will have even more options as aromatase inhibitors emerge as chemoprevention, so we generalists will need to keep up to speed in this field. Of course, we might be able to use less targeted chemopreventive techniques, like aspirin for colorectal cancer and will need to know the risks and benefits of these options, too.
Less Screening and More Expectant Management of Cancer
Although oncologists might argue that “targeted therapy” or “pharmacogenomics” are the buzzwords that describe the future of cancer care, my own generalist-biased ears hear “overdiagnosis” everywhere. Most clinicians probably think of indolent prostate cancer and the PSA debate when they hear this term, but plenty of buzz surrounds overdiagnosis for other reasons. Part of the issue is the desire to redefine clinical entities that have often come with the bleak label of “cancer.” For example, the debate over DCIS has shifted from how to treat it to how we even describe it to patients. And, clearly, what we call DCIS does matter.
We also have new screening modalities that have generated excitement, such as the USPSTF and American Cancer Society’s endorsement of low-dose chest CT for lung cancer. Clinicians must remain circumspect about use of this screening tool though, as chest CT itself can reveal countless false positives and also carries serious risk for overdiagnosis. And, as in the PSA/prostate cancer debate I’ve seen unfold over my training career, low-dose chest CT can lead to expensive, debilitating, and potentially deadly complications from biopsies and excessive cancer treatment.
All of this talk of overdiagnosis also makes me wonder where the medical community will draw the line on whom to screen. I wonder how willing the public will be to accept expectant management as a treatment option. The American Cancer Society already has published patient information for managing prostate cancer expectantly, but how often will patients with something more deadly — say, lung cancer — opt for “just watching it”?
Cancer at the Crux of the Medical Economics Arguments
Finally, all of these cancer-related issues are bound to intersect at the most timely of all topics in medicine: cost-effective care. That cancer care is extremely expensive is no secret. Thus, we will need to be more selective in our use of cancer treatment modalities. Will our payers begin to curb use of treatment modalities that do not confer a defined benefit for their cost, such as radiotherapy for prostate cancer? And on the question of cost-effective screening, will we continue to find more cost-effective ways to identify cancer early (like HPV testing every 5 years for detecting cervical cancer)?
Cancer is no longer the ultimate evil that must be detected early and destroyed at all costs. I don’t know that it ever was, but I do know that decision-making around prevention, detection, and treatment of cancer has become more nuanced than ever before.
December 11th, 2013
Paul Bergl, M.D.
As Dr. David Green reported this week in NEJM Journal Watch, the American Society of Hematology is the latest society to comment on appropriate and cost-conscious care in the ABIM Choosing Wisely campaign. I’ve followed the Choosing Wisely campaign closely and have been using it on the wards and in clinic as academic ammunition. A specialist society’s public advice about showing restraint is an excellent means to challenge the dogma of our so-called routine practices.
I know every conscientious practitioner has struggled with the high price of medical care. Our training environments are currently breeding grounds — and battlegrounds, for that matter — for ideas on how to solve our nation’s cost crisis. I have often wondered how we might change the way we train our residents and teach our students to exhibit financial diligence.
Of course, we are all part of this economic mess, and residents rightly share some of the blame. As naïve practitioners who lack confidence in diagnosis and management, residents tend to overorder and overtreat. I certainly have checked a thyrotropin (TSH) level in the inexplicably tachycardic hospitalized patient, despite my own knowledge that it was probably worthless. And I’ve seen colleagues get echocardiograms “just to make sure” they could safely administer large amounts of IV fluid for hypovolemic patients with hypercalcemia or DKA. When residents don’t have years of experience, they use high-tech diagnostic testing as a crutch.
Then again, the expectations of the learning environment also contribute to the epidemics of excessive echocardiograms and needless TSH levels. First of all, trainees are expected to present their patients in neat little bundles, devoid of any diagnostic uncertainty. Additionally, they have been trained through years of positive reinforcement to generate broad differential diagnoses and to suggest additional testing for unsolved clinical problems.
Although the Choosing Wisely campaign speaks to me and many of my generation, it is only a start. It alone cannot stand up to the decades of decadence and our culture of waste. How can we encourage trainees to truly choose wisely in the training environment? I propose the following:
- Deploy pre-clinical curricula that emphasize value-based medical decision-making. As much as students lament the breadth and depth of their curricula, pre-clinical students have fresh, open minds and are actually receptive to learning about cost-consciousness. We cannot expect residency curricula or CME efforts to undo our cost-ignorant model of care on their own.
- Include cost-conscious ordering and prescribing in our board examinations. I have seen some change since I took the USMLE Step 1 in 2008, but I notice that clinical board questions still ask for a “best next step” that usually doesn’t include “expectant management” as an option. As trainees prepare for these exams, they develop a line of thinking that then permeates clinical practice. When patients with chronic musculoskeletal complaints and unremarkable radiographs are referred for MRIs rather than receiving reassurance, we can put some of the blame on our licensing exams.
- Reward trainee restraint. Residents and students should be commended for not working up insubstantial problems, withholding unnecessary treatments, and showing prudence in choosing diagnostics. Again, our educational constructs are to blame, because we reward expansive thinking and “not missing” things. In morning reports and other case conferences, we often praise residents for adding another diagnostic possibility rather than exhibiting “diagnostic restraint” or cost-conscious care.
- Give trainees some sense of the cost and price of tests and treatments. The literature has not consistently shown that giving physicians cost or price information will prevent wastefulness. But as far as I know, these studies have focused on clinicians in practice who are wedded to their ways. From my experience, trainees thirst for this type of information. Frankly, we are all clueless about how much a chest CT costs. How much was the machine? Are there separate bills for the scan and for the radiologist’s interpretation? How much is the patient expected to pay? What will insurance pay?
- Get leadership buy-in at academic centers. I am neither a healthcare economist nor a chief financial officer. But my experience as a chief resident has taught me that buy-in from the academic leadership is necessary to turn the tide on monumental tasks.
November 25th, 2013
Paul Bergl, M.D.
Nothing puts more fear into the heart of an internist than a dermatologic chief complaint. And for good reason: we have very little exposure to the breadth of the field. To us, all rashes seem to be maculopapular, all bumps are pustules… or was that nodules?
It’s not that we internists don’t care about the skin or don’t appreciate its complexity. Rather, we simply haven’t seen enough bumps, rashes, and red spots to sort them all out consistently.
On the topic of pattern recognition in medicine, an oddly titled NEJM Journal Watch piece, “Quacking Ducks,” grabbed my attention recently. The commentary by Mark Dahl summarizes a J Invest Dermatol article by Wazaefi et al. that discusses pattern identification and other cognitive processes involved in discerning suspicious nevi. I will try to distill the interesting discussion to the main points of Dr. Dahl’s summary and the index article:
- Experienced dermatologists use other cognitive processes besides the “ABCD” method for finding suspicious nevi.
- Most healthy adults have only two or three dominant patterns of nevi on their bodies.
- Deviations from the patient’s own pattern usually represent suspicious nevi. These deviations are referred to as “ugly ducklings.”
- Even untrained practitioners can cluster nevi based on patterns and can identify which nevi deviate from the patterns.
- However, expert skin examiners tend to cluster nevi more reliably and into a smaller number of groups.
- Identifying potential melanomas by seeking out “ugly duckling” nevi is both an exceedingly simple and cognitively complex means of finding cancer.
So, what is the take-home point? To make diagnoses, dermatologists use their visual perception skills, some of which are innate and some of which are honed through practice. While technology threatens to overtake the task of perception — see the MelaFind device, for example — human perceptiveness is still difficult to qualify, quantify, and teach.
A colleague of mine and a faculty radiologist at my institution, David Schacht, has pondered the very question of visual perceptiveness among trainees in his own specialty of mammography. As you probably realize, computer-aided diagnosis has risen to prominence as a way to improve radiologists’ detection of subtle suspicious findings on mammograms. These computerized algorithms lessen the chance of false-negative tests. However, a radiologist ultimately still interprets the study; as such, radiologists still need training in visual perception. But how does a radiologist acquire this “skill”? Dr. Schacht hypothesizes that radiology residents who review a large series of enriched mammograms will have better cancer-detection rates. In other words, he hopes that intensive and directed case review will improve visual perception.
Clearly, mammographers and dermatologists are not alone in making diagnoses by what they see. Every field relies on some degree of astute observation that often becomes second nature over time. Even something as simple as the general appearance of a patient in the emergency room holds a trove of clues.
My question is, can these perceptive abilities be better taught to trainees or even be programmed into a computer? Or should we simply assume that experience itself drives some otherwise unexplained improvement in visual diagnosis?
If the former is true, then we ought to seek a better understanding of how physicians glean these skills. If man should stay ahead of machine, then we clinicians should hone our intuition and our abilities to recognize visual patterns. Moreover, we should design education systems that promote more visual engagement and activate the cortical pathways that underpin perceptiveness.
On the other hand, if experience itself imbues clinicians with better perceptive skills, then we really ought to maximize the number of clinical exposures for our trainees. No matter what the field, students and residents might simply need to see a lot more cases, either simulated or real.
As human or computer perceptiveness evolves, even the most expert eyes or finest computer algorithms will still be limited. And ultimately, any homely duckling of a nevus probably deserves a biopsy. But with biopsies, aren’t we trading one set of expert eyes for another — in this case, the pathologist’s — when we send that specimen to the lab?
In the end, the prevailing message seems to be that repeated experiences breed keen and reliable observations. We cannot discount the very basic human skill of reading visual cues. We should continue to seek ways to refine, study, and computerize our own perceptiveness.
November 14th, 2013
Paul Bergl, M.D.
This past week, I attended the annual AAMC meeting, where the question, “What will medical education look like in 2033?” was posed in a session called “Lightyears Beyond Flexner.” This session included a contest that Eastern Virginia Medical School won by producing a breathtaking and accurate portrayal of 2033. I encourage you to view the school’s excellent video, which captures the essence of where academic medicine is headed.
After this thought-provoking session, I too pondered academic medicine’s fate. I would like to share my reflections in this forum.
Without question, technology stood out as a major theme in this conference. And for good reason: clearly it is already permeating every corner of our academic medical lives. But as technology outpaces our clinical and educational methods, how exactly will it affect our practices in providing care and in training physicians?
Our educational systems will evolve in ways we cannot predict. But in reality, the future is already here; transformations are already afoot. MOOCs — massive open online courses, for the uninitiated — like Coursera are already providing higher education to the masses and undoubtedly will supplant lectures in med schools and residencies. In a “flipped classroom” era, MOOCs will empower world-renowned faculty to teach large audiences. Meanwhile, local faculty can mentor trainees and model behaviors and skills for learners. Dr. Shannon Martin, a junior faculty member at my institution, has proposed the notion of “flipped rounds” in the clinical training environment, too. In this model, rounds include clinical work and informed discussions; reading articles as a group or having a “chalk talk” is left out of the mix. In addition, medical education will entail sophisticated computer animations, interactive computer games for the basic sciences, and highly intelligent simulations. Finally, the undergraduate and graduate curricula will include more intense training in the social sciences and human interaction. In a globalized and technologized world, these skills will be at a premium.
But why stop at flipped classrooms or even flipped rounds? Flipped clinical experiences are coming soon too.
Yes, technology will revolutionize the clinical experience as well. Nowadays, we are using computers mainly to document clinical encounters and to retrieve electronic resources. In the future, patients will enter the exam room with a highly individualized health plan generated by a computer. A computer algorithm will review the patient’s major history, habits, risk factors, family history, biometrics, previous lab data, genomics, and pharmacogenomic data and will synthesize a prioritized agenda of health needs and recommended interventions. Providers will feel liberated from automated alerts and checklists and will have more time to simply talk to their patients. After the patient leaves the clinic, physicians will then stay connected with patients through social networking and e-visits. Physicians will even receive feedback on their patient’s lives through algorithms that will process each patient’s data trail: how often they are picking up prescriptions, how frequently they are taking a walk, how many times they buy cigarettes in a month. And of course, computers will probably even make diagnoses some day, as IBM’s Watson or the Isabel app aspire to do.
Yet even if Watson or Isabel succeeds in skilled clinical diagnosis, these technologies will not render physicians obsolete. No matter how much we digitize our clinical and educational experiences, humans will still crave the contact of other humans. We might someday completely trust a computer to diagnose breast cancer for us, but would anyone want a computer to break the bad news to our families? Surgical robots might someday drive themselves, but will experienced surgeons and patients cede ultimate surgical authority to a machine? A computer program might automatically track our caloric intake and physical activities, but nothing will replace a motivating human coach.
With all of these changes, faculty will presumably find time for our oft-neglected values. Bedside teaching will experience a renaissance and will focus on skilled communication. Because the Google Generation of residents and students will hold all of the world’s knowledge in the palm of their hands, they will look to faculty to be expert role models. Our medical educators will be able to create a truly streamlined, ultra-efficient learning experience that allows more face-to-face experiences with patients and trainees alike.
So where is academic medicine headed beyond Flexner? Academic physicians will remain master artists, compassionate advisers, and a human face for the increasingly digitized medical experience.
November 7th, 2013
Paul Bergl, M.D.
In my 3 years of residency, the nearly universal resident response to outpatient continuity clinic was a disturbing, guttural groan. I recognize that many aspects of primary care drag down even the most enduring physicians. But I have also found primary care — particularly with a panel of high-risk and complex patients — to be a welcome challenge. I recently spoke with one of my institution’s main advocates for academic primary care, who I know has wrestled with this standard resident reaction.
And we had a shared epiphany about one of the main deterrents driving promising residents away from primary care: inadequate training in prioritizing outpatient problems.
It’s easy to see how primary care quickly overwhelms the inexperienced provider. An astonishing number of recommendations, guidelines, screenings, and vaccinations must compete with the patient’s own concerns and questions. This competition creates an immense internal tension for the resident who knows the patient’s needs — cardiovascular risk reduction, cancer screenings, etc. — but is faced with addressing competing problems.
At my institution, the resident continuity clinic also houses a generally sicker subset of patients. Take these complex patients and put them in the room with a thoughtful resident who suffers from a hyper-responsibility syndrome, and you get the perfect mix of frustration and exhaustion.
Diverse external pressures also conspire to make the trainee feel like he or she should do it all: the attending physician’s judgment, government-measured quality metrics, and expert-written guidelines for care.
In this environment of intense internal and external pressures, how can we give residents appropriate perspective in a primary care clinic?
I would argue that residency training needs to include explicit instruction on how to prioritize competing needs. Maybe residents intuitively prioritize health problems already, but I’ve witnessed enough of my residents’ frustrations with primary care and preventive health to suggest otherwise. Let’s face it: there is probably a lot more value in a colonoscopy than in an updated Tdap, but we don’t always emphasize this teaching point.
At times, perceptions of relative value can be skewed. “Every woman must get an annual mammogram” is a message hammered into the minds of health professionals and lay people alike. Yet counseling on tobacco abuse might give the provider and patient a lot more bang for the buck. Our medical education systems do not convey this concept very well. Knowing statistics like the number needed to treat (NNT) might offer a rough guide for the overwhelmed internist, but the NNT does not take into account the time invested in clinic to arrange preventive care or the cost of the intervention.
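For readers unfamiliar with the statistic, the arithmetic behind the NNT is simple even when the prioritization is not: it is just the reciprocal of the absolute risk reduction. A minimal sketch, using entirely hypothetical event rates (the numbers below are illustrative only and are not drawn from any trial cited here):

```python
def nnt(control_event_rate: float, treated_event_rate: float) -> float:
    """Number needed to treat: how many patients must receive an
    intervention to prevent one additional adverse event.
    NNT = 1 / absolute risk reduction (ARR)."""
    arr = control_event_rate - treated_event_rate  # absolute risk reduction
    if arr <= 0:
        raise ValueError("intervention shows no absolute risk reduction")
    return 1 / arr

# Two hypothetical interventions competing for the same clinic visit:
# a 3% ARR versus a 0.1% ARR.
print(nnt(0.05, 0.02))    # ARR 0.03  -> NNT ~ 33
print(nnt(0.011, 0.010))  # ARR 0.001 -> NNT ~ 1000
```

The thirty-fold gap between those two numbers is exactly the kind of relative-value comparison the overwhelmed internist needs, even though, as noted above, the NNT says nothing about the clinic time or cost each intervention demands.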
Worse yet, there are simply too many recommendations and guidelines these days. Sure, these guidelines are often graded or scored. But as Allan Brett recently pointed out in Journal Watch, many guidelines conflict with other societies’ recommendations or carry inappropriately strong recommendations. Residents and experienced but busy PCPs alike are in no position to sift through this mess.
Our experts in evidence-based medicine need to guide us toward the most relevant and pressing needs: guidelines about guidelines, so to speak. We need our educational and policy leaders to help rein in the proliferation of practice guidelines rather than continuing to disseminate them.
Physicians in training have their eyes on the prospect of pay-for-performance reimbursements and public reporting of physicians’ quality scores. No one wants to enter a career in which primary care providers are held accountable for an impossibly large swath of “guideline-based” practices.
So how might we empower our next generation of physicians to feel OK with simply leaving some guidelines unfollowed? In my experience, our clinician-educators contribute to the problem with suggestions like, “You know, there are guidelines recommending screening for OSA in all diabetics.” Campaigns like “Choosing Wisely” represent new forays into educating physicians on how to demonstrate restraint, but they do not help physicians put problems in perspective. Our payors don’t seem to have any mechanism to reward restraint or prioritization and can, in fact, skew our priorities further.
I hope that teaching relative value and the art of prioritizing problems will be a first and critical step toward getting the next generation of physicians excited about primary care.
October 3rd, 2013
Paul Bergl, M.D.
“What do you think, Doctor?”
For a novice physician, these words can quickly jolt a relatively straightforward conversation into a jumble of partially formed thoughts, suppositions, jargon, and (sometimes) incoherent ramblings. Even for simpler questions, the fumbling trainee does not have a convenient script that has been refined through years of recitation. Thus, many conversations that residents have with patients are truly occurring for the first time. And unfortunately, this novelty can result in poorly chosen words with lasting effects. An inauspicious slip of the tongue could significantly alter the patient’s perceptions and decisions.
A recent study in JAMA Internal Medicine highlighted the connection between the words we choose and the actions our patients take. As Dr. Andrew Kaunitz reported in a summary of this article for NEJM Journal Watch, avoiding the word “cancer” in describing hypothetical cases of DCIS resulted in women choosing less aggressive measures and avoiding procedures like surgical resection. Dr. Kaunitz notes that the National Cancer Institute has also recommended against labeling DCIS as “cancer.”
I believe this recent study raises a major question at every level of medical education, particularly during residency. How do we counsel our physicians to counsel patients and communicate effectively? How do we choose words that appropriately convey the urgency of a diagnosis without scaring our patients into unnecessarily risky treatments?
I was fortunate to have gone to a medical school where our course directors were forward-thinking. Our preclinical curriculum included mentored sessions with practicing physicians and patient-actors. We were taught how to use motivational interviewing to facilitate smoking cessation, how to ask about a patient’s sexual orientation and gender identification, and even how to use the SPIKES mnemonic for breaking bad news.
Yet all of these simulations can never really prepare a trainee for the first time he or she is on the spot in the role of a real physician with a real patient. Counseling patients requires subtlety; no medical school curriculum could possibly address every situation. But as the above-referenced study by Omer et al. confirms, subtleties and shades of meaning matter.
Of course, not every situation that requires measured words will be so dramatic. The words we choose for more benign situations matter, too. How do I describe a patient’s systolic blood pressure that always runs in the 130-139 mm Hg range? Do I tell him that he has “prehypertension,” or do I give him a pass by saying, “Your blood pressure runs a little higher than normal”? Would this patient exercise more if I chose “prehypertension”?
What do I say to my patient with a positive HPV on reflex testing of an abnormal Pap smear? “Your Pap smear shows an abnormality” or “an infection with a very common virus…” or “suspicious cells” or “an infection that might lead to cancer someday…” What effect will it have on how she approaches future abnormal results? How might she change her sexual habits? How might she view her partner or partners?
These days, med schools and residency programs are increasingly taxed by the competing priorities of educating the 21st-century physician. As technology rapidly threatens to usurp our expertise in many domains of practice, communication will remain a cornerstone of our job. Communicating with patients will never be passé; no patient wants to be counseled by a computer. We must master these vital communication skills.
Unfortunately, studies like Omer et al. only touch the tip of the iceberg. How do we empower patients with information while not scaring them? When should we scare them with harsher language? How and where do we learn to choose the right words? And who decides?
Even a perfectly designed training system could not tackle all these questions. But perhaps we ought to be exposing our trainees to simulated conversations in which subtle word choices matter. Let’s face it: We could all do better at choosing our words wisely.
September 27th, 2013
Akhil Narang, M.D.
Resident duty hour reforms were discussed ad nauseam a few years ago. Everyone had their say – Program Directors (“In 2003 we instituted an 80-hour work week, in 2011 we switched to 16-hour shifts, what’s next – online residencies!?”), senior residents (“What? I have to write H&Ps again? I don’t even know my computer password!”), interns (“I thought I was done with cross-covering after this year”), graduating medical students (“I get to sleep in MY bed most of next year!”), and various supervising bodies (“This is what the public wants. Of course there is evidence that these reforms will work.”). Now it’s my turn: I was part of the last class to have experienced 30-hour call cycles as interns – the way it should/shouldn’t be (depending on your bias).
When I lamented to my Program Director during residency that my class not only had a difficult intern year but also had to assume “intern responsibilities” during our junior and senior years, he gently reminded me of his own experience as an intern. It was routine for him to care for more than 20 patients on the general medicine service. Moreover, the ICU was “open,” and any of his patients transferred to the unit continued to be under his care. Generously assuming 1 day off in 7, he worked more 100-hour work weeks than he’d care to remember.
As a junior resident, I was on service with my Chair of Medicine, and he repeated many of the same stories of busy services and how the word housestaff came to be – the residents’ de facto house was the hospital. Was this dangerous? The unfortunate case of Libby Zion (and others) would suggest yes. Did my attendings become outstanding physicians, in part because of the rigorous training? Unequivocally.
Fast forward a few decades: for numerous reasons, including public pressure, an 80-hour work week with a maximum of 30 consecutive hours in-house (for a resident) and 16 consecutive hours (for an intern) is the new standard. In a matter of 16 hours, only so much can be accomplished. The work-up, the diagnosis, and the response to treatment can hardly be appreciated in this short time span. The resident, who is permitted to stay in-house for 30 hours, often completes what the intern didn’t have time to do and benefits from observing the clinical course of the patient in real time. Is this a disservice to the intern? Many would argue “yes.”
Interns now leave work after a maximum of 16 hours. The time away from the hospital is supposed to allow for a better work-life balance, enable restorative sleep, and prevent medical mistakes. A study by Kranzler and colleagues showed that this wasn’t the case: interns did not report an increase in well-being, a decrease in depressive symptoms, more sleep, or fewer mistakes than previously.
What about patient care and outcomes? While early data from the 16-hour work day are still forthcoming, we do have recent data on the 2003 rule that capped the work week at a maximum of 80 hours. In a study published in August 2013, Volpp and colleagues examined mortality pre- and post-80-hour work weeks. More than 13 million Medicare patients (admitted to short-term, acute-care hospitals) who had primary medical diagnoses of acute MI, CHF, or GI bleed, or surgical diagnoses in general, orthopaedic, or vascular surgery were included in the study. The authors concluded that no mortality benefit was present in the early years after the 80-hour work week was implemented and that just a trend toward improved mortality was observed in years 4-5. We will start to see mortality data from the 16-hour rule in a few years, but I suspect that no significant improvements will occur in patient outcomes. In fact, medical knowledge and hands-on experience for interns might suffer.
Completing internship used to be a rite of passage, akin to pledging a fraternity. The duty hour changes have allowed interns to spend more time away from the hospital so that, theoretically, they are less tired and make fewer mistakes at work. In practice, this might not be the case. Unquestionably, the brutal hours that generations of past trainees faced were suboptimal, but it appears that the current duty hour rules also might be less than ideal from a learning perspective. Hopefully, in the coming years, the ACGME will reevaluate its policies in light of the data they will see.