January 8th, 2014
Paul Bergl, M.D.
At first glance, no diagnosis seems more terrible than cancer. Although it remains a huge killer in the developed world, cancer has also taken on new meanings in modern medicine. As an ordinary person, I certainly fear the word and would dread the diagnosis. Cancer. It has such a damning and unforgiving ring to it. After 3 years of residency in a tertiary referral center, where I’ve seen some of the worst cases conceivable, I still cannot imagine the painful and devastating odyssey that those who succumb to it must endure.
As a recently minted physician though, I fear cancer for other reasons. The science of the field is moving at a blistering pace. How can I keep up on the state-of-the-art treatments, genomic-based diagnostic tools, and molecular therapies? (When I talk about modern cancer care, I often wonder whether I’m even talking about things that really exist.)
The care of cancer patients discourages this generalist, because it has become exceedingly complicated. How do I craft my words to distinguish “cancer” from “pre-cancer”? What advice do I give to a patient with recent biopsy-proven, localized prostate cancer? Will I be sued for negligence if I don’t offer chemoprophylaxis for breast cancer to a patient who develops metastatic disease on my watch? How can I watch expensive third-line chemotherapy being given to one of my patients while another patient eats his way to a cancer-causing BMI of 40 on a low-cost, high-carb diet?
Given these questions, I thought I would begin 2014 with a reflection on what cancer means to the general practitioner.
Cancer as Preventable Disease
Despite all the advances we have made in diagnosing and treating cancer, we still face awesome opportunities to curtail cancer before it even starts. During the past several decades, we have clearly made strides in preventing cancer, particularly in the realm of reducing tobacco use. (Then again, tobacco use rates aren’t really all that different than they were 10 years ago.) And, all the while, our nation is growing increasingly obese — so much so that obesity threatens to overtake tobacco as the major preventable cause of cancer.
Given these trends, I sense that progress toward preventing cancer has stalled. I also wonder if enough clinicians are even considering the fact that cancer is preventable at all. When I give the lifestyle pep talk in clinic, I am usually warning patients about risks for developing cardiovascular disease or diabetes, not cancer. I also feel somewhat powerless to affect a patient’s ability to avoid cancer through lifestyle interventions.
These days, we need continued dedication to training physicians to coach patients about lifestyle improvements. We also must bridge the divide between medical providers and our public health leaders and find more creative solutions than exploding cigarette taxes or rehashing ideas about food deserts, fat taxes, and junk food advertisements.
Besides preventing cancer by recommending lifestyle adjustments, the generalist must also augment his use of chemoprophylaxis when indicated. For example, even though the USPSTF reaffirmed its grade B rating for chemoprevention of breast cancer in high-risk individuals in 2013, most of us don’t adhere to these guidelines very stringently (NEJM JW Womens Health Apr 8 2010), especially compared with our adherence to other grade B recommendations, like mammography. We will have even more options as aromatase inhibitors emerge as chemoprevention, so we generalists will need to keep up to speed in this field. Of course, we might be able to use less targeted chemopreventive techniques, like aspirin for colorectal cancer and will need to know the risks and benefits of these options, too.
Less Screening and More Expectant Management of Cancer
Although oncologists might argue that “targeted therapy” or “pharmacogenomics” are the buzzwords that describe the future of cancer care, my own generalist-biased ears hear “overdiagnosis” everywhere. Most clinicians probably think of indolent prostate cancer and the PSA debate when they hear this term, but plenty of buzz surrounds overdiagnosis for other reasons. Part of the issue is the desire to redefine clinical entities that have often come with the bleak label of “cancer.” For example, the debate over DCIS has shifted from how to treat it to how we even describe it to patients. And, clearly, what we call DCIS does matter.
We also have new screening modalities that have generated excitement, such as low-dose chest CT for lung cancer, now endorsed by the USPSTF and the American Cancer Society. Clinicians must remain circumspect about use of this screening tool though, as chest CT itself can reveal countless false positives and also carries serious risk for overdiagnosis. And, like the PSA/prostate cancer debate I’ve seen unfold over my training career, low-dose chest CT can lead to expensive, debilitating, and potentially deadly complications from biopsies and excessive cancer treatment.
All of this talk of overdiagnosis also makes me wonder where the medical community will draw the line on whom to screen. I wonder how willing the public will be to accept expectant management as a treatment option. The American Cancer Society already has published patient information for managing prostate cancer expectantly, but how often will patients with something more deadly — say, lung cancer — opt for “just watching it”?
Cancer at the Crux of the Medical Economics Arguments
Finally, all of these cancer-related issues are bound to intersect at the most timely of all topics in medicine: cost-effective care. That cancer care is extremely expensive is no secret. Thus, we will need to be more selective in our use of cancer treatment modalities. Will our payers begin to curb use of treatment modalities that do not confer a defined benefit for their cost, such as radiotherapy for prostate cancer? And on the question of cost-effective screening, will we continue to find more cost-effective ways to identify cancer early (like HPV testing every 5 years for detecting cervical cancer)?
Cancer is no longer the ultimate evil that must be detected early and destroyed at all costs. I don’t know that it ever was, but I do know that decision-making around prevention, detection, and treatment of cancer has become more nuanced than ever before.
December 11th, 2013
Paul Bergl, M.D.
As Dr. David Green reported this week in NEJM Journal Watch, the American Society of Hematology is the latest society to comment on appropriate and cost-conscious care in the ABIM Choosing Wisely campaign. I’ve followed the Choosing Wisely campaign closely and have been using it on the wards and in clinic as academic ammunition. A specialist society’s public advice about showing restraint is an excellent means to challenge the dogma of our so-called routine practices.
I know every conscientious practitioner has struggled with the high price of medical care. Our training environments are currently breeding grounds — and battlegrounds, for that matter — for ideas on how to solve our nation’s cost crisis. I have often wondered how we might change the way we train our residents and teach our students to exhibit financial diligence.
Of course, we are all part of this economic mess, and residents rightly share some of the blame. As naïve practitioners who lack confidence in diagnosis and management, residents tend to overorder and overtreat. I certainly have checked a thyrotropin (TSH) level in the inexplicably tachycardic hospitalized patient, despite my own knowledge that it was probably worthless. And I’ve seen colleagues get echocardiograms “just to make sure” they could safely administer large amounts of IV fluid for hypovolemic patients with hypercalcemia or DKA. When residents don’t have years of experience, they use high-tech diagnostic testing as a crutch.
Then again, the expectations of the learning environment also contribute to the epidemics of excessive echocardiograms and needless TSH levels. First of all, trainees are expected to have their patients presented in neat little bundles, devoid of any diagnostic uncertainty. Additionally, they have been trained through years of positive reinforcement for broad differential diagnoses and suggesting additional testing for unsolved clinical problems.
Although the Choosing Wisely campaign speaks to me and many of my generation, it is only a start. It alone cannot stand up to the decades of decadence and our culture of waste. How can we encourage trainees to truly choose wisely in the training environment? I propose the following:
- Deploy pre-clinical curricula that emphasize value-based medical decision-making. As much as students lament the breadth and depth of their curricula, pre-clinical students have fresh, open minds and are actually receptive to learning about cost-consciousness. We cannot expect that the curricula in residency or CME efforts will have an effect on our cost-ignorant model of care.
- Include cost-conscious ordering and prescribing in our board examinations. I have seen some change from when I took the USMLE Step 1 in 2008, but I notice that clinical board questions still usually ask for a “best next step” and rarely include “expectant management” as an option. As trainees prepare for these exams, they develop a line of thinking that then permeates clinical practice. When patients with chronic musculoskeletal complaints and unremarkable radiographs are referred for MRIs rather than receiving reassurance, we can put some of the blame on our licensing exams.
- Reward trainee restraint. Residents and students should be commended for not working up insubstantial problems, withholding unnecessary treatments, and showing prudence in choosing diagnostics. Again, our educational constructs are to blame, because we reward expansive thinking and “not missing” things. In morning reports and other case conferences, we often praise residents for adding another diagnostic possibility rather than exhibiting “diagnostic restraint” or cost-conscious care.
- Give trainees some sense of the cost and price of tests and treatments. The literature has not consistently shown that giving physicians cost or price information will prevent wastefulness. But as far as I know, these studies have focused on clinicians in practice who are wedded to their ways. From my experience, trainees thirst for this type of information. Frankly, we are all clueless about how much a chest CT costs. How much was the machine? Are there separate bills for the scan and for the radiologist’s interpretation? How much is the patient expected to pay? What will insurance pay?
- Get leadership buy-in at academic centers. I am neither a healthcare economist nor a chief financial officer. But my experience as a chief resident has taught me that buy-in from the academic leadership is necessary to turn the tide on monumental tasks.
November 25th, 2013
Paul Bergl, M.D.
Nothing puts more fear into the heart of an internist than a dermatologic chief complaint. And for good reason: we have very little exposure to the breadth of the field. To us, all rashes seem to be maculopapular, all bumps are pustules… or was that nodules?
It’s not that we internists don’t care about the skin or don’t appreciate its complexity. Rather, we simply haven’t seen enough bumps, rashes, and red spots to sort them all out consistently.
On the topic of pattern recognition in medicine, an oddly titled NEJM Journal Watch piece, “Quacking Ducks,” grabbed my attention recently. The commentary by Mark Dahl summarizes a J Invest Dermatol article by Wazaefi et al. that discusses the pattern identification and other cognitive processes involved in discerning suspicious nevi. I will try to distill the interesting discussion to the main points of Dr. Dahl’s summary and the index article:
- Experienced dermatologists use other cognitive processes besides the “ABCD” method for finding suspicious nevi.
- Most healthy adults have only two or three dominant patterns of nevi on their bodies.
- Deviations from the patient’s own pattern usually represent suspicious nevi. These deviations are referred to as “ugly ducklings.”
- Even untrained practitioners can cluster nevi based on patterns and can identify which nevi deviate from the patterns.
- However, expert skin examiners tend to cluster nevi more reliably and into a smaller number of groups.
- Identifying potential melanomas by seeking out “ugly duckling” nevi is both an exceedingly simple and cognitively complex means of finding cancer.
So, what is the take-home point? To make diagnoses, dermatologists use their visual perception skills, some of which are innate and some of which are honed through practice. While technology threatens to overtake the task of perception — see the MelaFind device, for example — human perceptiveness is still difficult to qualify, quantify, and teach.
A colleague of mine and a faculty radiologist at my institution, David Schacht, has pondered the very question of visual perceptiveness among trainees in his own specialty of mammography. As you probably realize, computer-aided diagnosis has risen to prominence as a way to improve radiologists’ detection of subtle suspicious findings on mammograms. These computerized algorithms lessen the chance of false-negative tests. However, a radiologist ultimately still interprets the study; as such, radiologists still need training in visual perception. But how does a radiologist acquire this “skill”? Dr. Schacht hypothesizes that radiology residents who review a large series of enriched mammograms will have better cancer-detection rates. In other words, he hopes that intensive and directed case review will improve visual perception.
Clearly, mammographers and dermatologists are not alone in making diagnoses by what they see. Every field relies on some degree of astute observation that often becomes second nature over time. Even something as simple as the general appearance of a patient in the emergency room holds a trove of clues.
My question is, can these perceptive abilities be better taught to trainees or even be programmed into a computer? Or should we simply assume that experience itself drives some otherwise unexplained improvement in visual diagnosis?
If the former is true, then we ought to seek a better understanding of how physicians glean these skills. If man should stay ahead of machine, then we clinicians should hone our intuition and our abilities to recognize visual patterns. Moreover, we should design education systems that promote more visual engagement and activate the cortical pathways that underpin perceptiveness.
On the other hand, if experience itself imbues clinicians with better perceptive skills, then we really ought to maximize the number of clinical exposures for our trainees. No matter what the field, students and residents might simply need to see a lot more cases, either simulated or real.
As human or computer perceptiveness evolves, even the most expert eyes or finest computer algorithms will still be limited. And ultimately, any homely duckling of a nevus probably deserves a biopsy. But with biopsies, aren’t we trading one set of expert eyes for another — in this case, the pathologist — when we send that specimen to the lab?
In the end, the prevailing message seems to be that repeated experiences breed keen and reliable observations. We cannot discount the very basic human skill of reading visual cues. We should continue to seek ways to refine, study, and computerize our own perceptiveness.
November 14th, 2013
Paul Bergl, M.D.
This past week, I attended the annual AAMC meeting where the question, “What will medical education look like in 2033?” was asked in a session called “Lightyears Beyond Flexner.” This session included a contest that Eastern Virginia Medical School won by producing a breathtaking and accurate portrayal of 2033. I encourage you to view the excellent video above that captures the essence of where academic medicine is headed.
After this thought-provoking session, I too pondered academic medicine’s fate. I would like to share my reflections in this forum.
Without question, technology stood out as a major theme in this conference. And for good reason: clearly it is already permeating every corner of our academic medical lives. But as technology outpaces our clinical and educational methods, how exactly will it affect our practices in providing care and in training physicians?
Our educational systems will evolve in ways we cannot predict. But in reality, the future is already here as transformations are already afoot. MOOCs — massive open online courses, for the uninitiated — like Coursera are already providing higher education to the masses and undoubtedly will supplant lectures in med schools and residencies. In a “flipped classroom” era, MOOCs will empower world-renowned faculty to teach large audiences. Meanwhile, local faculty can mentor trainees and model behaviors and skills for learners. Dr. Shannon Martin, a junior faculty member at my institution, has proposed the notion of “flipped rounds” in the clinical training environment, too. In this model, rounds include clinical work and informed discussions; reading articles as a group or having a “chalk talk” are left out of the mix. In addition, medical education will entail sophisticated computer animations, interactive computer games for the basic sciences, and highly intelligent simulations. Finally, the undergraduate and graduate curricula will have more intense training in the social sciences and human interaction. In a globalized and technologized world, these skills will be at a premium.
But why stop at flipped classrooms or even flipped rounds? Flipped clinical experiences are coming soon too.
Yes, technology will revolutionize the clinical experience as well. Nowadays, we are using computers mainly to document clinical encounters and to retrieve electronic resources. In the future, patients will enter the exam room with a highly individualized health plan generated by a computer. A computer algorithm will review the patient’s major history, habits, risk factors, family history, biometrics, previous lab data, genomics, and pharmacogenomic data and will synthesize a prioritized agenda of health needs and recommended interventions. Providers will feel liberated from automated alerts and checklists and will have more time to simply talk to their patients. After the patient leaves the clinic, physicians will then stay connected with patients through social networking and e-visits. Physicians will even receive feedback on their patients’ lives through algorithms that will process each patient’s data trail: how often they are picking up prescriptions, how frequently they are taking a walk, how many times they buy cigarettes in a month. And of course, computers will probably even make diagnoses some day, as IBM’s Watson or the Isabel app aspire to do.
Yet even if Watson or Isabel succeeds in skilled clinical diagnosis, these technologies will not render physicians obsolete. No matter how much we digitize our clinical and educational experiences, humans will still crave the contact of other humans. We might someday completely trust a computer to diagnose breast cancer for us, but would anyone want a computer to break the bad news to our families? Surgical robots might someday drive themselves, but will experienced surgeons and patients cede ultimate surgical authority to a machine? A computer program might automatically track our caloric intake and physical activities, but nothing will replace a motivating human coach.
With all of these changes, faculty will presumably find time for our oft-neglected values. Bedside teaching will experience a renaissance and will focus on skilled communication. Because the Google Generation of residents and students will hold all of the world’s knowledge in the palm of their hands, they will look to faculty to be expert role models. Our medical educators will be able to create a truly streamlined, ultra-efficient learning experience that allows more face-to-face experiences with patients and trainees alike.
So where is academic medicine headed beyond Flexner? Academic physicians will remain master artists, compassionate advisers, and a human face for the increasingly digitized medical experience.
November 7th, 2013
Paul Bergl, M.D.
In my 3 years of residency, the nearly universal resident response to outpatient continuity clinic was a disturbing, guttural groan. I recognize that many aspects of primary care drag down even the most enduring physicians. But I have also found primary care — particularly with a panel of high-risk and complex patients — to be a welcome challenge. I recently spoke with one of my institution’s main advocates for academic primary care, who I know has wrestled with this standard resident reaction.
And we had a shared epiphany about one of the main deterrents driving promising residents away from primary care: inadequate training in prioritizing outpatient problems.
It’s easy to see how primary care quickly overwhelms the inexperienced provider. An astonishing number of recommendations, guidelines, screenings, and vaccinations must compete with the patient’s own concerns and questions. This competition creates an immense internal tension for the resident who knows the patient’s needs — cardiovascular risk reduction, cancer screenings, etc. – but is faced with addressing competing problems.
At my institution, the resident continuity clinic also houses a generally sicker subset of patients. Take these complex patients and put them in the room with a thoughtful resident who suffers from a hyper-responsibility syndrome, and you get the perfect mix of frustration and exhaustion.
There are diverse external pressures that also conspire to make the trainee feel like he should do it all: the attending physician’s judgment, government-measured quality metrics, and expert-written guidelines for care.
In this environment of intense internal and external pressures, how can we give residents appropriate perspective in a primary care clinic?
I would argue that residency training needs to include explicit instruction on how to prioritize competing needs. Maybe residents intuitively prioritize health problems already, but I’ve witnessed enough of my residents’ frustrations with primary care and preventive health to doubt it. Let’s face it: there is probably a lot more value in a colonoscopy than in an updated Tdap, but we don’t always emphasize this teaching point.
At times, perceptions of relative value can be skewed. “Every woman must get an annual mammogram” is a message hammered into the minds of health professionals and lay people. Yet counseling on tobacco abuse might give the provider and patient a lot more bang for the buck. Our medical education systems do not emphasize this concept very well. Knowing statistics like the number needed to treat (NNT) might be a rough guide for the overwhelmed internist, but the NNT does not take into account the time invested in clinic to arrange for preventive care or the cost of the intervention.
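As a refresher (this is the standard epidemiologic definition, not something specific to this post), the NNT is simply the reciprocal of the absolute risk reduction, which is exactly why it is only a rough guide: two interventions with identical NNTs can demand vastly different amounts of clinic time and money. With purely hypothetical round numbers:

```latex
\text{NNT} = \frac{1}{\text{ARR}} = \frac{1}{p_{\text{control}} - p_{\text{intervention}}}
% Hypothetical example: if an intervention lowers the 10-year event
% rate from 2.0% to 1.5%, then ARR = 0.020 - 0.015 = 0.005,
% so NNT = 1/0.005 = 200 patients treated to prevent one event.
```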
Worse yet, there are simply too many recommendations and guidelines these days. Sure these guidelines are often graded or scored. But as Allan Brett recently pointed out in Journal Watch, many guidelines conflict with other societies’ recommendations or have inappropriately strong recommendations. Residents and experienced but busy PCPs alike are in no position to sift through this mess.
Our experts in evidence-based medicine need to guide us toward the most relevant and pressing needs: guidelines about guidelines, so to speak. We need our educational and policy leaders to help rein in the proliferation of practice guidelines rather than continuing to disseminate them.
Physicians in training have their eyes on the prospect of pay-for-performance reimbursements and public reporting of physicians’ quality scores. No one wants to enter a career in which primary care providers are held accountable for an impossibly large swath of “guideline-based” practices.
So how might we empower our next generation of physicians to feel OK with simply leaving some guidelines unfollowed? In my experience, our clinician-educators contribute to the problem with suggestions like, “You know, there are guidelines recommending screening for OSA in all diabetics.” Campaigns like “Choosing Wisely” represent new forays into educating physicians on how to demonstrate restraint, but they do not help physicians put problems in perspective. Our payors don’t seem to have any mechanism to reward restraint or prioritization and can, in fact, skew our priorities further.
I hope that teaching relative value and the art of prioritizing problems will be a first and critical step toward getting the next generation of physicians excited about primary care.
October 3rd, 2013
Paul Bergl, M.D.
“What do you think, Doctor?”
For a novice physician, these words can quickly jolt a relatively straightforward conversation into a jumble of partially formed thoughts, suppositions, jargon, and (sometimes) incoherent ramblings. Even for simpler questions, the fumbling trainee does not have a convenient script that has been refined through years of recitation. Thus, many conversations that residents have with patients are truly occurring for the first time. And unfortunately, this novelty can result in poorly chosen words that can have lasting effects. An inauspicious slip of the tongue could significantly alter the patient’s perceptions and decisions.
A recent study in JAMA Internal Medicine highlighted the connection between the words we choose and the actions our patients take. As Dr. Andrew Kaunitz reported in a summary of this article for NEJM Journal Watch, avoiding the word “cancer” in describing hypothetical cases of DCIS resulted in women choosing less aggressive measures and avoiding procedures like surgical resection. Dr. Kaunitz notes that the National Cancer Institute has also recommended against labeling DCIS as “cancer.”
I believe this recent study raises a major question at every level of medical education, particularly during residency. How do we teach our physicians to counsel patients and communicate effectively? How do we choose words that appropriately convey the urgency of a diagnosis without scaring our patients into unnecessarily risky treatments?
I was fortunate to have gone to a medical school where our course directors were forward-thinking. Our preclinical curriculum included mentored sessions with practicing physicians and patient-actors. We were taught how to use motivational interviewing to facilitate smoking cessation, how to ask about a patient’s sexual orientation and gender identification, and even how to use the SPIKES mnemonic for breaking bad news.
Yet all of these simulations can never really prepare a trainee for the first time he or she is on the spot in the role of a real physician with a real patient. Counseling patients requires subtlety; no medical school curriculum could possibly address every situation. But as the above-referenced study by Omer et al. confirms, subtleties and shades of meaning matter.
Of course, not every situation that requires measured words will be so dramatic. The words we choose for more benign situations matter, too. How do I define a patient’s systolic blood pressure that always runs in the 130-139 mm Hg range? Do I tell him that he has “prehypertension,” or do I give him a pass by saying, “Your blood pressure runs a little higher than normal”? Would this patient exercise more if I chose “prehypertension”?
What do I say to my patient with a positive HPV on reflex testing of an abnormal Pap smear? “Your Pap smear shows an abnormality” or “an infection with a very common virus…” or “suspicious cells” or “an infection that might lead to cancer someday…” What effect will it have on how she approaches future abnormal results? How might she change her sexual habits? How might she view her partner or partners?
These days, med schools and residency programs are being increasingly taxed by the competing priorities in educating the 21st-century physician. As technology rapidly threatens to usurp our expertise in many domains of practice, communication will remain a cornerstone of our job. Communicating with patients will never be passé; no patient wants to be counseled by a computer. We must master these vital communication skills.
Unfortunately, studies like Omer et al. only touch the tip of the iceberg. How do we empower patients with information while not scaring them? When should we scare them with harsher language? How and where do we learn to choose the right words? And who decides?
Even a perfectly designed training system could not tackle all these questions. But perhaps we ought to be exposing our trainees to simulated conversations in which subtle word choices matter. Let’s face it: We could all do better at choosing our words wisely.
September 27th, 2013
Akhil Narang, M.D.
Discussions of resident duty hour reforms were rehashed ad nauseam a few years ago. Everyone had their say – Program Directors (“In 2003 we instituted an 80-hour work week, in 2011 we switched to 16-hour shifts, what’s next – online residencies!?”), senior residents (“What? I have to write H&Ps again? I don’t even know my computer password!”), interns (“I thought I was done with cross-covering after this year”), graduating medical students (“I get to sleep in MY bed most of next year!”), and various supervising bodies (“This is what the public wants. Of course there is evidence that these reforms will work.”). Now it’s my turn: I was part of the last class to have experienced 30-hour call cycles as interns – the way it should/shouldn’t be (depending on your bias).
While I was lamenting to my Program Director during residency about how my class not only had a difficult intern year but also had to assume “intern responsibilities” during junior and senior years, he gently reminded me of his experience as an intern. It was routine for him to care for more than 20 patients on the general medicine service. Moreover, the ICU was “open,” and any of his patients transferred to the unit continued to be under his care. Generously assuming 1 day off in 7, he worked more 100-hour work weeks than he’d care to remember.
As a junior resident, I was on service with my Chair of Medicine, and he repeated many of the same stories of busy services and how the word housestaff came to be – the residents’ de facto house was the hospital. Was this dangerous? The unfortunate case of Libby Zion (and others) would suggest yes. Did my attendings become outstanding physicians, in part because of the rigorous training? Unequivocally.
Fast forward a few decades: for numerous reasons, including public pressure, an 80-hour work week with a maximum of 30 consecutive hours in-house (for a resident) and 16 consecutive hours (for an intern) is the new standard. In a matter of 16 hours, only so much can be accomplished. The work-up, diagnosis, and response to treatment are hardly appreciated in this short time span. The resident, who is permitted to stay in-house for 30 hours, often completes what the intern didn’t have time to do and benefits from observing in real time the clinical course of the patient. Is this a disservice to the intern? Many would argue “yes.”
Interns now leave work after a maximum of 16 hours. The time away from the hospital is supposed to allow for a better work-life balance, enable restorative sleep, and prevent medical mistakes. A study by Kranzler and colleagues showed that this wasn’t the case. Interns did not report an increase in well-being, a decrease in depressive symptoms, more sleep, or fewer mistakes than previously.
What about patient care/outcomes? While early data from the 16-hour work day are still forthcoming, we do have recent data from the 2003 rule that capped the work week at a maximum of 80 hours. In a study published in August 2013, Volpp and colleagues examined mortality pre- and post-80-hour work weeks. More than 13 million Medicare patients (admitted to short-term, acute-care hospitals) who had primary medical diagnoses of acute MI, CHF, or GI bleed, or surgical diagnoses in general, orthopaedic, or vascular surgery were included in the study. The authors concluded that no mortality benefit was present in the early years after the 80-hour work week was implemented and just a trend toward improved mortality was observed in years 4-5. We will start to see mortality data from the 16-hour rule in a few years, but I suspect that no significant improvements will occur in patient outcomes. In fact, medical knowledge and hands-on experience for interns might suffer.
Completing internship used to be a rite of passage, akin to pledging a fraternity. The duty hour changes have allowed interns to spend more time away from the hospital so that, theoretically, they are less tired and make fewer mistakes at work. In practice, this might not be the case. Unquestionably, the brutal hours that generations of past trainees faced were suboptimal, but it appears as if the current duty hour rules also might be less than ideal from a learning perspective. Hopefully, in the coming years, the ACGME will reevaluate its policies in light of the data it will see.
September 16th, 2013
Paul Bergl, M.D.
This past week in NEJM Journal Watch General Medicine, Abigail Zuger reviewed an article from the Journal of General Internal Medicine by Lauren Block et al. in which researchers examined how medical interns spend their time. The results from this time-motion study might be concerning but are not unexpected. The investigators found that interns on inpatient rotations spend only 12% of their time in direct patient care and spend only 8 minutes daily with each patient on their inpatient services. Dr. Zuger notes that this “distressing paucity” of direct patient care should cause leaders in graduate medical training to effect change in interns’ daily routines.
To those of us in training or just out of training, the reasons for less direct patient care are myriad and obvious:
- A focus on multidisciplinary care and ever-increasing specialization results in each medical patient having a dozen or more physicians, consultants, nurses, pharmacists, case managers, social workers, and therapists directly involved in care. At the core of this legion of providers stands the intern. This novice physician must field messages, pages, and advice from all arms of the treatment team. As such, the intern spends as much time coordinating care as he or she spends relaying messages and answering “quick questions” — which are never quick and rarely are questions.
- Patient acuity in academic centers also continues to rise. Patients on medical wards often are admitted with multiple comorbidities and in a state of disarray. Rare are the relatively straightforward admissions for uncomplicated pneumonia or CHF exacerbation. Thus, interns have to manage a number of active conditions, complex medication lists, and a barrage of patient data.
- In addition, the Centers for Medicare &amp; Medicaid Services (CMS) pay us by DRGs and have also drilled into us the imperative to prevent 30-day readmissions. CMS will soon ask us to admit more patients to observation status. How do all of these shifts in payment affect a house officer? The intern has to spend all the more time ensuring safe and timely discharges. Tasks like medication reconciliation and communicating with outpatient providers suck up even more of the intern’s day at the expense of face time with the patient.
- Finally, documentation steals many a precious minute from the day. The considerations of patient acuity and reimbursement add to the burden of documentation, leading to bloated notes that take far too much time to construct.
I am probably too young to be so cynical, but I do not see a shift in these routines occurring any time soon. And without being excessively cantankerous, I feel obligated to ask, “Does the percentage of interns’ time spent in direct patient care matter?”
A smaller percentage of interns’ time spent directly interfacing with patients may not mean that patients get worse care. We don’t have any direct data that the distressing paucity of direct patient care is resulting in poor outcomes. Moreover, the very “non-patient” tasks outlined above are entirely necessary in today’s inpatient environment. For example, if a patient is started on an LMWH bridge to warfarin in the hospital, figuring out how the LMWH will be paid for and who will follow the INR post-hospitalization is as important as time spent at the patient’s bedside.
Of course, I am not suggesting that this intern work is inherently rewarding or educational. Most of us embark on this career path because we value interaction with actual human beings, not because we like electronic note templates. I myself romanticize the days when internists actually took the time to perform thorough histories and physicals. But if we don’t encumber the interns with all of this work, who will do it?
September 11th, 2013
Akhil Narang, M.D.
When I started residency 4 years ago, warfarin was really the only choice of anticoagulation widely used for prevention of stroke in patients with atrial fibrillation (AF) and in patients with venous thromboembolism (VTE). Despite knowing about the coagulation cascade for decades, only recently have viable alternatives to warfarin become available. In this post, I hope to break down (at a macroscopic level) use of oral direct thrombin inhibitors (DTI). In my next post, I will review Factor Xa inhibitors.
To start, let’s review the basics of clotting (Figure below). Fibrin clots form after activation of the intrinsic and extrinsic pathways. Tissue injury initiates the extrinsic pathway, while the intrinsic pathway comprises the vast majority of the coagulation cascade. Warfarin inhibits the vitamin K-dependent factors (II, VII, IX, X) in addition to protein C and protein S. Warfarin thus acts across both the extrinsic and intrinsic pathways to prevent thrombus formation. Direct thrombin inhibitors and Factor Xa inhibitors both work in the common pathway of coagulation.
Most DTIs are administered parenterally and used in patients with ACS and in treatment of patients with heparin-induced thrombocytopenia. Oral DTIs have been slower to reach the market. While several are under development, dabigatran received FDA approval in 2010 for prevention of stroke in patients with nonvalvular AF. The RE-LY trial, published in 2009, showed that low-dose dabigatran (110 mg twice daily) was noninferior to warfarin for preventing stroke in patients with intermediate-risk nonvalvular AF (mean CHADS2 score, 2.1). High-dose dabigatran (150 mg twice daily) was superior to warfarin in this population. The risk for major bleeding (compared with warfarin) was lower in the low-dose dabigatran group and similar in the high-dose dabigatran group. In 2013, the RELY-ABLE study, which extended the follow-up period (median of 4+ years) for patients in the RE-LY trial, showed similar rates of stroke and death in both dabigatran groups. In another recent analysis of the RE-LY population, in which cardiovascular endpoints were weighted, researchers concluded that both doses of dabigatran have comparable benefits in prevention of stroke when taking into account safety and efficacy.
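For readers less familiar with the CHADS2 score cited above, it is a simple point tally for estimating stroke risk in nonvalvular AF: one point each for CHF, hypertension, age 75 or older, and diabetes, and two points for prior stroke or TIA. A minimal sketch (the function name and its interface are my own illustration, not from any clinical library):

```python
def chads2(chf, hypertension, age, diabetes, prior_stroke_or_tia):
    """Compute the CHADS2 stroke-risk score for nonvalvular AF.

    One point each for CHF, hypertension, age >= 75, and diabetes;
    two points for prior stroke or TIA. Range: 0-6.
    """
    score = 0
    score += 1 if chf else 0
    score += 1 if hypertension else 0
    score += 1 if age >= 75 else 0
    score += 1 if diabetes else 0
    score += 2 if prior_stroke_or_tia else 0
    return score

# Example: a 78-year-old with hypertension and diabetes,
# no CHF and no prior stroke or TIA
print(chads2(chf=False, hypertension=True, age=78,
             diabetes=True, prior_stroke_or_tia=False))  # → 3
```

The RE-LY population’s mean score of 2.1 thus reflects patients carrying roughly two of these risk factors.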
The bottom line is that dabigatran is efficacious for stroke prevention in patients with nonvalvular AF. Not having to monitor one’s diet or the INR are practical advantages that dabigatran offers over warfarin. The inability to readily reverse dabigatran-associated bleeding should be a point of discussion with patients when choosing the appropriate anticoagulation strategy (especially in elderly patients or those with histories of bleeding); however, a dabigatran antidote will likely be available in the future.
Turning toward venous thromboembolism, dabigatran has been investigated for treatment of acute VTE and also for prevention of recurrent VTE. In both scenarios, dabigatran has shown positive results. In the RE-COVER study, published in 2009, patients with acute VTE were bridged to either warfarin or dabigatran (150 mg twice daily). No significant difference was found in the rates of recurrent VTE or major bleeding. Further trials (RE-MEDY and RE-SONATE) evaluated dabigatran versus warfarin or placebo in patients who completed at least 3 months of treatment for acute VTE. The results showed that dabigatran-treated patients had rates of recurrent VTE similar to those on warfarin, with lower rates of major bleeding. Interestingly, a higher incidence of ACS was seen in the dabigatran-treated group. Less surprisingly, when compared with placebo, dabigatran-treated patients had a lower rate of recurrent VTE (and higher rates of bleeding). As of now, dabigatran has not been FDA-approved for treatment or prophylaxis of acute venous thromboembolism but is being used off-label for these indications by some clinicians.
To date, dabigatran is the sole oral DTI available in the U.S. and, as such, controls the majority market share in this class of medications. I anticipate newer generations of DTIs in the coming years. Stay tuned for the next post that will address Factor Xa inhibitors and their role in the treatment of atrial fibrillation and venous thromboembolism.
September 3rd, 2013
Paul Bergl, M.D.
As a resident, probably the most common piece of feedback one receives is, “Read more and expand your clinical knowledge base.” This critique is a standard and generic piece of feedback to encourage the younger generation to never quit in the endless pursuit of knowledge. As our erudite attendings know, medical knowledge always evolves and often reverses course. Thus, the trainee is reminded to establish the habit of keeping up on the literature early in his or her career.
Indeed, there is merit to following the literature vigilantly. This past month, Vinay Prasad et al. published an analysis of “medical reversals” in Mayo Clinic Proceedings. Prasad et al. reviewed a decade’s worth of original articles in the New England Journal of Medicine and found that more than 100 of them had overturned previous guidelines or accepted practices. Notable examples — some of which are now old news — included the following:
- The standard teaching of CPR with rescue breaths was reversed by well-designed trials showing that adequate compressions were the main objective in CPR.
- KDOQI guidelines in 2000 were updated to reflect a target hemoglobin between 11 and 13 g/dL in patients with CKD. These recommendations were subsequently refuted on the basis of randomized controlled trials of epoetin alfa that demonstrated no major benefit and increased risk.
- Rosiglitazone was introduced in 1999 as a treatment for type 2 diabetes mellitus based on its ability to lower hemoglobin A1c. However, a meta-analysis in the New England Journal of Medicine later concluded that this drug was significantly associated with cardiovascular death.
The list goes on and can be found in the article’s supplementary materials. If you’re inspired by Prasad’s findings, you ought to make every effort to “read more” and “keep current.”
Then again, maybe better advice for trainees would be, “Read when you can, but don’t worry if you end up falling a few years behind in the literature.” Sure, a revolutionary medical practice might arise, but even residents are unlikely to be so tuned out to the world that they won’t hear about the latest breakthrough somewhere. New treatments will surface; new drugs will be manufactured. The trainee may want to see a novel therapy survive a few years of validation in real clinical practice before stepping out onto a limb of uncertainty. If one aggressively tracks every advance in medicine, one also runs the risk of adopting a practice that later proves more harmful than anticipated.
At least one recently published article speaks in favor of this more lax approach toward the literature. As Paul Mueller highlighted in a NEJM Journal Watch article this past week, many meta-analyses add very little to the growing body of medical literature except growth of the body itself. If you haven’t read a meta-analysis on the pharmacotherapeutic options for fibromyalgia or supplements to prevent colon cancer in the last handful of years, don’t worry: The article you read in 2009 had most of the same studies. A more rational approach might be to pull meta-analyses on an as-needed basis rather than trying to stay ahead of the barrage of articles published weekly.
So what’s the best strategy for a trainee? I personally favor a “headlines, tweets, and abstracts” approach — not only for the time-strapped resident but for myself too. Skim a few journals each month, and subscribe to essential Twitter and RSS feeds. You might hit the right balance of staying current and staying above the fray.