
Recent Posts


October 6th, 2014

Introducing Myself

Priya Umapathi, M.D.

Rutgers University Hospital

Hello! I’m excited to have an opportunity to share my adventures, experiences, and opinions from chief year with you. Transitioning between life phases can be traumatic at times, but it invariably bears great potential for exponential self-growth. This year, so far, has confirmed that there is indeed much growing to be done! We held a transition event for our house staff prior to the beginning of this academic year, which I affectionately dubbed “Metamorphosis” (I know, corny), to discuss some of the expectations, roles, and responsibilities associated with becoming a senior resident. The many themes of the night included incorporating enthusiasm, intellectual curiosity, and compassion into our daily lives to become more effective leaders. I watch my new senior residents draw on these attributes daily as they navigate their way through managing patients and teams independently. We face an ever-changing landscape of medicine, with challenges in the implementation of healthcare reform, the debate over work hours and their effects on resident and patient outcomes, the financial pressure of medical school costs on specialty choices, and the rise of genetically personalized healthcare, to name a select few… The adage, “may you live in interesting times,” has never been more apt! My hope is that I will be able to incorporate themes from our Metamorphosis event into my year as chief, and I look forward to sharing insights with you about both my personal and professional growth.

October 6th, 2014

NEJM Journal Watch Welcomes Priya Umapathi, MD

Charleen Hamilton

The editors and staff of NEJM Journal Watch welcome Dr. Priya Umapathi as our new Chief Resident blogger. Priya will be sharing her experiences as a teacher and mentor at Rutgers.

April 28th, 2014

Cynicism in Medicine

Akhil Narang, M.D.

“Are you more or less cynical than when you started residency?” This was the question my Program Director asked our senior internal medicine residents at a recent dinner with Dr. Bob Wachter. If you aren’t familiar with Dr. Wachter, he is widely acclaimed as the “Father of Hospital Medicine” and a renowned champion of patient safety and quality. His blog, Wachter’s World, is chock full of insightful commentary on the American healthcare system, written with buoyant optimism. In a time when criticism of doctors and hospitals (coupled with pessimism about the country’s healthcare system) is trendy, Dr. Wachter’s perspective is a welcome breath of fresh air. It got my Program Director thinking about cynicism in medicine and inspired this post.

The 30-odd residents, months shy of graduating, got an opportunity to answer whether they viewed themselves as more or less cynical than at the start of their residency training. Many of the responses reflected increased cynicism toward “the healthcare system.” When pressed to explain further, many answers stemmed from the frustration they feel when taking care of patients: difficulty in establishing primary care follow-up for the uninsured, inability to get antibiotics covered by insurance, administrative red tape of setting up home oxygen therapy, and even the cumbersome process of obtaining outside hospital records. It was refreshing, however, to hear residents qualify their cynicism. More often than not, residents did not single out cynicism toward patients as much as they did toward the system. If we are to continue producing generations of passionate and dedicated physicians who don’t burn out, we need to start addressing ways to deal with cynicism.

Short of nationwide reform, hospitals and residency programs can play a part in helping to shape (arguably) the most pliable time in a young physician’s career. While it’s certainly character-building to successfully navigate filling out nursing home transfer forms, finding a means to get a patient’s INR checked, making follow-up appointments, and calling insurance companies to plead for antibiotic approval, this type of work should not dominate the daily cycle of residency. There is little doubt that “scut work” helps us better understand the bureaucracy and red tape associated with our healthcare system, but it also unequivocally takes away from a plethora of formal educational opportunities and contributes to violations of strict duty-hour regulations.

In speaking to my colleagues around the country, I have found that hospitals and residency programs provide variable support to their housestaff: some of the best programs offer dedicated resident assistants (typically PAs) and streamlined workflows for discharging patients (multidisciplinary rounds, discharge planners to schedule appointments). Residents who were the least cynical in my unscientific polling were those who had the most resources at their disposal. I wonder if, down the line, the less cynical residents become less cynical fellows and subsequently less cynical attendings. I wonder if these physicians experience less burnout than their colleagues whose training programs do not equip them to navigate the healthcare maze.

Recognizing that all hospitals and programs are not created equal and that perks such as PAs or discharge coordinators are luxuries that many hospitals aren’t in a position to provide, addressing the larger issue of cynicism in medicine is important. A certain degree of cynicism is healthy, but when cynicism borders on indifference or complacency, we’re in trouble. To effectively curtail cynicism directed at the “system,” hospital leadership needs to engage their residents. For many hospitals, residents provide the greatest amount of hands-on patient care. Residents are often the first and last providers that patients encounter during hospitalizations. Every hospital recognizes the importance of quality improvement and creating lean workflows; resident input and feedback should be solicited at every step of the way. Concerted efforts to address issues that plague residents (whether inadequate social work support or a lack of computers) should be taken seriously.

Residents need to feel empowered by their programs and hospitals to make changes. Whether those changes are major or minor, a collaborative effort between housestaff and hospitals will inevitably be well received. Unilateral decision-making (especially if controversial) can lead to significant resentment and to worsening cynicism. I have no delusions here: once residents and fellows finish their training, challenges in their practice environments (academics, private practice, or industry) can certainly augment cynicism. Nonetheless, if the formative years of one’s training are optimized, scores of physicians might enter their post-training careers with a less cynical mindset.

So now I ask you to reflect on your experiences. Are you more or less cynical than when you started your residency training? If you’re more cynical, why, and how much of this was a result of modifiable factors in your training program?

Follow Dr. Akhil Narang on Twitter @AkhilNarangMD

April 5th, 2014

Learning to Unlearn and Other Advanced Skills

Paul Bergl, M.D.

In my transition from pure learner (i.e., the med student role) to teacher-learner (i.e., the attending role), I’ve actually found myself focusing more on the learner than on the teacher part of my dual existence. Strong learning seems to be a prerequisite for strong teaching, and I am realizing that succeeding on the next level requires some extra metacognition, that is, learning to learn in new ways.

Learning to Unlearn

In med school, learners amass an incredible amount of new information and master a completely new language. Suffice it to say that “drinking from the firehose” probably understates the reality of undergraduate medical education.

Our schools inculcate lots of so-called “facts” into our students’ fresh minds, and said students suck up these facts like infinitely absorptive sponges. Sure, students often purge the data after cramming for tests, but they inevitably reclaim much of this knowledge over the next several years. And thus, students graduate medical school with their minds encumbered by extraordinary amounts of information to apply to patient care.

This approach unfortunately is faulty in two respects: memory is imperfect, and facts themselves change over time. I have been caught on rounds reciting “facts I learned in medical school” only to have my team discover that almost no reputable sources can corroborate my claims — or even worse, that a reputable source completely refutes them. I cannot always pin down the etiology of my misinformation. I usually blame time’s effect on the faulty memory compartment. More importantly, I make a mental note to condemn that parcel of my brain and vacate it for future use.

My advice: Actively seek out the misinformation in your brain, and purge it. Identify what you thought you had learned but isn’t actually true.

Learning to Get Answers Without the Certainty of an Answer Key

As learners progress through their undergraduate and graduate training, they move from the black-and-white world of correct answers to a landscape of gray zones devoid of an answer key. Students often live and die by “what’s going to be on the test.” Even residents living in the oft-ambiguous world of clinical medicine have some anchor of certainty: the attending’s final word. No matter what shade of correct or incorrect a clinical decision is, the resident can often fall back on what the attending will want.

When there is no longer a judge of correctness — be it the professor, the course director, or the attending — it can be quite unnerving. In this situation, the teacher-learner should remember that the “facts” you’ve learned are never absolute truths. They are half-truths with varying degrees of evidence that can be variably applied to actual clinical scenarios.

My advice: When faced with situations where a correct answer cannot be known, gather all the information you can to make an informed decision. Remember that the teacher-learner becomes a de facto answer key, but be ready to adjust the grading rubric too.

Learning to Learn Critically

The learner role affords a certain luxury to students and residents: leaving all the critical thinking to the experts. For example, when you are on your cardiology rotation, you must recite gospel verses like, “Give an ACE inhibitor and β-blocker for everyone with reduced ejection fraction.” But what about that latest study on angiotensin receptor neprilysin inhibitors? Well, you get to leave the interpretation and its application to real-life settings to the renowned cardiology attending.

I am not saying that residents aren’t expected to think critically. But they often don’t have the time to learn critically: to analyze the latest developments and consider how to integrate the evidence into practice. Instead, students and residents defer or default to the experts (and integrate this information as “facts” into their brains; see sections above).

Now imagine what happens when no expert is present. The teacher-learner needs to be prepared to face situations in which he or she might be the one distilling very complicated data into spoon-fed “pearls.” The teacher-learner also needs to decide how much stock to put into his or her own truths.

My advice: Imagine how you would apply newly acquired knowledge to your patients before actually doing so. Someday you will not have an expert to lead the way, and you never know when your trainees might look to you for guidance.

Follow Dr. Paul Bergl on Twitter @PaulBerglMD

February 11th, 2014

Do Quality Initiatives and the Patient Safety Movement Threaten Resident Autonomy?

Paul Bergl, M.D.

A sign of the times?

Recently, our residency program had the good fortune of hosting Dr. Bob Wachter as a visiting speaker. Dr. Wachter is considered a pioneer in the hospitalist movement and has built his career around inpatient quality and safety. During lunch with Dr. Wachter, some of our residents, and hospitalist faculty, we discussed the topic of resident autonomy in the hospital. In the glory days of residency, I imagine that house officers experienced autonomy in the truest sense of the word: self-rule and utter independence. At least that’s the impression I have from Stephen Bergman’s (a.k.a. Samuel Shem’s) House of God … a Lord of the Flies–like island inhabited by unsupervised and uninhibited junior physicians.

We all know that the 21st century’s inpatient environment leaves less room for such resident independence — and shenanigans, for that matter. Through regulatory and advisory bodies, patient advocacy groups, and our own recognition, we are now rightly focusing on other domains of hospital care. The main priority is not to “just let the doctors take care of patients how they want.” With this sea change, resident autonomy has evolved accordingly — both in practice and as a concept.

In case you haven’t heard, the quality and safety era is here to stay. Because true autonomy and “learning by doing” can potentially stand squarely at odds with quality metrics and the safest possible outcomes, I have to wonder:

  • Will resident autonomy disappear completely in the future?
  • How has autonomy changed already?
  • And how will trainees learn to practice independently with all of this change?

Anyone who works in graduate medical education knows that duty-hour reform fundamentally shook the resident learning experience and thus affected autonomy. Since the additional 2011 duty-hour changes were enacted, teamwork has become the name of the game. Residents routinely are forced to pass off decisions or depart before the implications of their decisions materialize. Attending physicians now seemingly shoulder more of the clinical workload, too, when residents’ shifts are truncated by a requirement to leave the hospital.

There is also probably universal agreement that the imperatives to reduce hospital length of stay and to facilitate safe discharges affect resident independence. As an attending, I know the pressures that the hospital is under, and I often feel compelled to be very directive about making prompt discharge a reality.

I doubt that residents would feel that either of these changes has eliminated their autonomy completely. Often in medicine, there is no single correct way to achieve an end. For this reason, residents can still safely be given leeway in many clinical decisions. There is still some art in what we do, and autonomy lives to see another day. But there are looming changes on the horizon that might threaten resident autonomy even more. Those who spend time in the hospital training environment recognize that all participants in inpatient care will increasingly be measured by how well they do their jobs. It is hard to conceive of a system in which residents and the attendings who supervise their care will be spared from aggressive quality improvement.

The stakes are simply too high nowadays. Health care is expensive and still unacceptably unsafe and unhelpful. With mounting pressures to provide higher-value care, residents’ decisions are likely to undergo more scrutiny. Why is Resident A ordering more CT scans than Resident B? Why are Resident C’s patients staying in the hospital 2 days longer than Resident D’s? In my own experience as a chief resident, I know that residents still want autonomy. Heck, autonomy was my top priority when I evaluated residency programs myself. As I have interviewed and met a number of applicants to our program during this residency match season, I sense that soon-to-be trainees are also putting autonomy high on their list of values. I like to tout my program’s emphasis on autonomy. I try to foster resident growth while attending on service by relaxing the reins. I know the term autonomy doesn’t mean what it once did. Yet I do have hope that we can still give residents the leeway to “learn by doing” while preserving the health of the patients we serve and the financial stability of our healthcare system and our country.

January 13th, 2014

Reflections of a New Attending

Akhil Narang, M.D.

During my year as a Chief Resident, I have the privilege of attending on the general medicine service for 8 weeks. I recently completed 4 weeks and, as expected, found myself in an entirely new realm of patient care and accountability. I would be remiss not to recall a few of the pivotal lessons and poignant moments that stand out.

Transitioning from resident to attending inevitably results in greater scrutiny. Despite my best efforts to prevent readmissions (especially within 30 days), I had several during my first month: a cirrhotic patient with a recurrent variceal bleed, a patient with sickle cell disease readmitted for a vaso-occlusive crisis after a sharp overnight temperature drop, and an older nursing home resident treated for a UTI who came back with seizures. Given the mounting pressure to prevent readmissions, I spent numerous hours dissecting each patient’s chart, attempting to understand what went wrong and what I could have done differently. I discussed the cases with my co-Chiefs and several senior attendings. The consensus was that, in many cases, readmissions will happen. This was obvious to me as a resident, but now, as an attending (especially an attending on service for the first time), I felt I had done something wrong. The increased scrutiny, coupled with a heightened sense of self-reflection, led me to forget what I learned over my years of residency — sick patients tend to get readmitted.

The ideal teaching service affords everyone the opportunity to teach. At the helm of a large team (one resident, two interns, two students, and a pharmacist), I did my best to demonstrate ultrasound IVC measurements in a hypotensive patient with heart failure before giving fluids, to point out Quincke’s sign in aortic regurgitation, and to review sodium homeostasis in a patient with hypernatremia. For the first few weeks, I was so concerned about being a good teacher that I neglected to be a student. Our team had admitted a patient who was struggling to breathe and had newly diagnosed interstitial lung disease when my student gave a brilliant, unprompted presentation on the etiologies of ILD on rounds. Only then did I remember that my students, interns, and residents all know things I don’t. Giving them the opportunity to teach me is vital and surely won’t be forgotten.

As a resident, I took pride in efficiency. Suspicious lung mass on a smoker’s chest x-ray? No problem — I could coordinate the CT scan, bronchoscopy, and pulmonary function tests the same day. My seniors hammered into me that disposition is the goal: the longer the work-up takes, the longer the patient stays in the hospital. When fixating on the total patient census, it’s easy to neglect practicing good internal medicine. As an attending, while I respect the differences between inpatient and outpatient work-ups, I also realize it’s OK to adjust asthma medications, initiate treatment for GERD, or talk about depression in a patient awaiting placement for a hip fracture.

Attending on the general medicine wards has been one of the most rewarding, fun, and challenging experiences of my short academic medical career. I’ve learned too many lessons to enumerate, but perhaps the most important of all is not to lose focus on the foundation I built during residency.

January 8th, 2014

Cancer 2014 — A Modern Spin on a Tragic Diagnosis

Paul Bergl, M.D.

At first glance, no diagnosis seems more terrible than cancer. Although it remains a huge killer in the developed world, cancer has also taken on new meanings in modern medicine. As an ordinary person, I certainly fear the word and would dread the diagnosis. Cancer. It has such a damning and unforgiving ring to it. After 3 years of residency in a tertiary referral center, where I’ve seen some of the worst cases conceivable, I still cannot imagine the painful and devastating odyssey that those who succumb to it must endure.

As a recently minted physician, though, I fear cancer for other reasons. The science of the field is moving at a blistering pace. How can I keep up with the state-of-the-art treatments, genomic-based diagnostic tools, and molecular therapies? (When I talk about modern cancer care, I often wonder whether I am even talking about things that really exist.)

The care of cancer patients discourages this generalist, because it has become exceedingly complicated. How do I craft my words to distinguish “cancer” from “pre-cancer”? What advice do I give to a patient with recent biopsy-proven, localized prostate cancer? Will I be sued for negligence if I didn’t offer chemoprophylaxis for breast cancer to a patient who develops metastatic disease on my watch? How can I watch expensive third-line chemotherapy being given to one of my patients while another patient eats his way to a cancer-causing BMI of 40 on a low-cost, high-carb diet?

Given these questions, I thought I would begin 2014 with a reflection on what cancer means to the general practitioner.

Cancer as Preventable Disease

Despite all the advances we have made in diagnosing and treating cancer, we still face enormous opportunities to curtail cancer before it even starts. During the past several decades, we have clearly made strides in preventing cancer, particularly in the realm of curtailing tobacco use. (Then again, tobacco use rates aren’t really all that different from what they were 10 years ago.) And, all the while, our nation is growing increasingly obese, so much so that obesity threatens to overtake tobacco as the major preventable cause of cancer.

Given these trends, I sense that progress toward preventing cancer has stalled. I also wonder if enough clinicians are even considering the fact that cancer is preventable at all. When I give the lifestyle pep talk in clinic, I am usually warning patients about risks for developing cardiovascular disease or diabetes, not cancer. I also feel somewhat powerless to affect a patient’s ability to avoid cancer through lifestyle interventions.

These days, we need continued dedication to training physicians to coach patients about lifestyle improvements. We also must bridge the divide between medical providers and our public health leaders and find more creative solutions than exploding cigarette taxes or rehashed ideas about food deserts, fat taxes, and junk food advertisements.

Besides preventing cancer by recommending lifestyle adjustments, the generalist must also augment his or her use of chemoprophylaxis when indicated. For example, even though the USPSTF reaffirmed its grade B rating for chemoprevention of breast cancer in high-risk individuals in 2013, most of us don’t adhere to these guidelines very stringently (NEJM JW Womens Health Apr 8 2010), especially compared with our adherence to other grade B recommendations, like mammography. We will have even more options as aromatase inhibitors emerge as chemoprevention, so we generalists will need to keep up to speed in this field. Of course, we might also be able to use less targeted chemopreventive techniques, like aspirin for colorectal cancer, and will need to know the risks and benefits of these options, too.

Less Screening and More Expectant Management of Cancer

Although oncologists might argue that “targeted therapy” or “pharmacogenomics” are the buzzwords that describe the future of cancer care, my own generalist-biased ears hear “overdiagnosis” everywhere. Most clinicians probably think of indolent prostate cancer and the PSA debate when they hear this term, but plenty of buzz surrounds overdiagnosis for other reasons. Part of the issue is the desire to redefine clinical entities that have often come with the bleak label of “cancer.” For example, the debate over DCIS has shifted from how to treat it to how we even describe it to patients. And, clearly, what we call DCIS does matter.

We also have new screening modalities that have generated excitement, such as the USPSTF’s and American Cancer Society’s endorsement of low-dose chest CT for lung cancer. Clinicians must remain circumspect about use of this screening tool, though, as chest CT itself can reveal countless false positives and also carries serious risk for overdiagnosis. And, like the PSA/prostate cancer debate I’ve seen unfold over my training career, low-dose chest CT can lead to expensive, debilitating, and potentially deadly complications from biopsies and excessive cancer treatment.

All of this talk of overdiagnosis also makes me wonder where the medical community will draw the line on whom to screen. I wonder how willing the public will be to accept expectant management as a treatment option. The American Cancer Society already has published patient information for managing prostate cancer expectantly, but how often will patients with something more deadly — say, lung cancer — opt for “just watching it”?

Cancer at the Crux of the Medical Economics Arguments

Finally, all of these cancer-related issues are bound to intersect at the most timely of all topics in medicine: cost-effective care. That cancer care is extremely expensive is no secret. Thus, we will need to be more selective in our use of cancer treatment modalities. Will our payers begin to curb use of treatment modalities that do not confer a defined benefit for their cost, such as radiotherapy for prostate cancer? And on the question of cost-effective screening, will we continue to find more cost-effective ways to identify cancer early (like HPV testing every 5 years for detecting cervical cancer)?

Cancer 2014

Cancer is no longer the ultimate evil that must be detected early and destroyed at all costs. I don’t know that it ever was, but I do know that decision-making around prevention, detection, and treatment of cancer has become more nuanced than ever before.

December 11th, 2013

Making Value-Based Decisions About Ordering Tests

Paul Bergl, M.D.

Every day, piles of money are spent on needless tests and treatments in training hospitals and clinics.

As Dr. David Green reported this week in NEJM Journal Watch, the American Society of Hematology is the latest society to comment on appropriate and cost-conscious care in the ABIM Choosing Wisely campaign. I’ve followed the Choosing Wisely campaign closely and have been using it on the wards and in clinic as academic ammunition. A specialist society’s public advice about showing restraint is an excellent means to challenge the dogma of our so-called routine practices.

I know every conscientious practitioner has struggled with the high price of medical care. Our training environments are currently breeding grounds — and battlegrounds, for that matter — for ideas on how to solve our nation’s cost crisis. I have often wondered how we might change the way we train our residents and teach our students to exhibit financial diligence.

Of course, we are all part of this economic mess, and residents rightly share some of the blame. As naïve practitioners who lack confidence in diagnosis and management, residents tend to overorder and overtreat. I certainly have checked a thyrotropin (TSH) level in the inexplicably tachycardic hospitalized patient, despite my own knowledge that it was probably worthless. And I’ve seen colleagues get echocardiograms “just to make sure” they could safely administer large amounts of IV fluid for hypovolemic patients with hypercalcemia or DKA. When residents don’t have years of experience, they use high-tech diagnostic testing as a crutch.

Then again, the expectations of the learning environment also contribute to the epidemics of excessive echocardiograms and needless TSH levels. First of all, trainees are expected to present their patients in neat little bundles, devoid of any diagnostic uncertainty. Additionally, they have been trained through years of positive reinforcement for offering broad differential diagnoses and suggesting additional testing for unsolved clinical problems.

Although the Choosing Wisely campaign speaks to me and many of my generation, it is only a start. It alone cannot stand up to the decades of decadence and our culture of waste. How can we encourage trainees to truly choose wisely in the training environment? I propose the following:

  • Deploy pre-clinical curricula that emphasize value-based medical decision-making. As much as students lament the breadth and depth of their curricula, pre-clinical students have fresh, open minds and are actually receptive to learning about cost-consciousness. We cannot expect that the curricula in residency or CME efforts will have an effect on our cost-ignorant model of care.
  • Include cost-conscious ordering and prescribing in our board examinations. I have seen some change from when I took the USMLE Step 1 in 2008, but I notice that clinical board questions still usually ask for a “best next step” that doesn’t include “expectant management” as an option. As trainees prepare for these exams, they develop a line of thinking that then permeates clinical practice. When patients with chronic musculoskeletal complaints and unremarkable radiographs are referred for MRIs rather than receiving reassurance, we can put some of the blame on our licensing exams.
  • Reward trainee restraint. Residents and students should be commended for not working up insubstantial problems, withholding unnecessary treatments, and showing prudence in choosing diagnostics. Again, our educational constructs are to blame, because we reward expansive thinking and “not missing” things. In morning reports and other case conferences, we often praise residents for adding another diagnostic possibility rather than exhibiting “diagnostic restraint” or cost-conscious care.
  • Give trainees some sense of the cost and price of tests and treatments. The literature has not consistently shown that giving physicians cost or price information will prevent wastefulness. But as far as I know, these studies have focused on clinicians in practice who are wedded to their ways. From my experience, trainees thirst for this type of information. Frankly, we are all clueless about how much a chest CT costs. How much was the machine? Are there separate bills for the scan and for the radiologist’s interpretation? How much is the patient expected to pay? What will insurance pay?
  • Get leadership buy-in at academic centers. I am neither a healthcare economist nor a chief financial officer. But my experience as a chief resident has taught me that buy-in from the academic leadership is necessary to turn the tide on monumental tasks.


November 25th, 2013

I Think I’ve Seen This One Before: Learning to Identify Disease

Paul Bergl, M.D.

Nothing puts more fear into the heart of an internist than a dermatologic chief complaint. And for good reason: we have very little exposure to the breadth of the field. To us, all rashes seem to be maculopapular, all bumps are pustules… or was that nodules?

It’s not that we internists don’t care about the skin or don’t appreciate its complexity. Rather, we simply haven’t seen enough bumps, rashes, and red spots to sort them all out consistently.

On the topic of pattern recognition in medicine, an oddly titled NEJM Journal Watch piece, “Quacking Ducks,” grabbed my attention recently. The commentary by Mark Dahl summarizes a J Invest Dermatol article by Wazaefi et al. that discusses pattern identification and other cognitive processes involved in discerning suspicious nevi. I will try to distill the interesting discussion to the main points of Dr. Dahl’s summary and the index article:

  • Experienced dermatologists use other cognitive processes besides the “ABCD” method for finding suspicious nevi.
  • Most healthy adults have only two or three dominant patterns of nevi on their bodies.
  • Deviations from the patient’s own pattern usually represent suspicious nevi. These deviations are referred to as “ugly ducklings.”
  • Even untrained practitioners can cluster nevi based on patterns and can identify which nevi deviate from the patterns.
  • However, expert skin examiners tend to cluster nevi more reliably and into a smaller number of groups.
  • Identifying potential melanomas by seeking out “ugly duckling” nevi is both an exceedingly simple and cognitively complex means of finding cancer.

So, what is the take-home point? To make diagnoses, dermatologists use their visual perception skills, some of which are innate and some of which are honed through practice. While technology threatens to overtake the task of perception — see the MelaFind device, for example — human perceptiveness is still difficult to qualify, quantify, and teach.

A colleague of mine and a faculty radiologist at my institution, David Schacht, has pondered the very question of visual perceptiveness among trainees in his own specialty of mammography. As you probably realize, computer-aided diagnosis has risen to prominence as a way to improve radiologists’ detection of subtle suspicious findings on mammograms. These computerized algorithms lessen the chance of false-negative tests. However, a radiologist ultimately still interprets the study; as such, radiologists still need training in visual perception. But how does a radiologist acquire this “skill”? Dr. Schacht hypothesizes that radiology residents who review a large series of enriched mammograms will have better cancer-detection rates. In other words, he hopes that intensive and directed case review will improve visual perception.

Clearly, mammographers and dermatologists are not alone in making diagnoses by what they see. Every field relies on some degree of astute observation that often becomes second nature over time. Even something as simple as the general appearance of a patient in the emergency room holds a trove of clues.

My question is, can these perceptive abilities be better taught to trainees or even be programmed into a computer? Or should we simply assume that experience itself drives some otherwise unexplained improvement in visual diagnosis?

If the former is true, then we ought to seek a better understanding of how physicians glean these skills. If man should stay ahead of machine, then we clinicians should hone our intuition and our abilities to recognize visual patterns. Moreover, we should design education systems that promote more visual engagement and activate the cortical pathways that underpin perceptiveness.

On the other hand, if experience itself imbues clinicians with better perceptive skills, then we really ought to maximize the number of clinical exposures for our trainees. No matter what the field, students and residents might simply need to see a lot more cases, either simulated or real.

As human or computer perceptiveness evolves, even the most expert eyes or finest computer algorithms will still be limited. And ultimately, any homely duckling of a nevus probably deserves a biopsy. But with biopsies, aren’t we trading one set of expert eyes for another — in this case, the pathologist — when we send that specimen to the lab?

In the end, the prevailing message seems to be that repeated experiences breed keen and reliable observations. We cannot discount the very basic human skill of reading visual cues. We should continue to seek ways to refine, study, and computerize our own perceptiveness.

November 14th, 2013

The Google Generation Goes to Med School: Medical Education in 2033

Paul Bergl, M.D.

This past week, I attended the annual AAMC meeting, where the question “What will medical education look like in 2033?” was asked in a session called “Lightyears Beyond Flexner.” The session included a contest that Eastern Virginia Medical School won by producing a breathtaking and accurate portrayal of 2033. I encourage you to view their excellent video, which captures the essence of where academic medicine is headed.

After this thought-provoking session, I too pondered academic medicine’s fate. I would like to share my reflections in this forum.

Without question, technology stood out as a major theme in this conference. And for good reason: clearly it is already permeating every corner of our academic medical lives. But as technology outpaces our clinical and educational methods, how exactly will it affect our practices in providing care and in training physicians?

Our educational systems will evolve in ways we cannot predict. But in reality, the future is already here, as transformations are already afoot. MOOCs — massive open online courses, for the uninitiated — like Coursera are already providing higher education to the masses and undoubtedly will supplant lectures in med schools and residencies. In a “flipped classroom” era, MOOCs will empower world-renowned faculty to teach large audiences. Meanwhile, local faculty can mentor trainees and model behaviors and skills for learners. Dr. Shannon Martin, a junior faculty member at my institution, has proposed the notion of “flipped rounds” in the clinical training environment, too. In this model, rounds include clinical work and informed discussions; reading articles as a group or having a “chalk talk” are left out of the mix. In addition, medical education will entail sophisticated computer animations, interactive computer games for the basic sciences, and highly intelligent simulations. Finally, the undergraduate and graduate curricula will include more intense training in the social sciences and human interaction. In a globalized and technologized world, these skills will be at a premium.

But why stop at flipped classrooms or even flipped rounds? Flipped clinical experiences are coming soon too.

Yes, technology will revolutionize the clinical experience as well. Nowadays, we use computers mainly to document clinical encounters and to retrieve electronic resources. In the future, patients will enter the exam room with a highly individualized health plan generated by a computer. A computer algorithm will review the patient’s major history, habits, risk factors, family history, biometrics, previous lab data, genomics, and pharmacogenomic data and will synthesize a prioritized agenda of health needs and recommended interventions. Providers will feel liberated from automated alerts and checklists and will have more time to simply talk to their patients. After the patient leaves the clinic, physicians will stay connected with patients through social networking and e-visits. Physicians will even receive feedback on their patients’ lives through algorithms that process each patient’s data trail: how often they pick up prescriptions, how frequently they take a walk, how many times they buy cigarettes in a month. And of course, computers will probably even make diagnoses someday, as IBM’s Watson or the Isabel app aspire to do.

Yet even if Watson or Isabel succeeds in skilled clinical diagnosis, these technologies will not render physicians obsolete. No matter how much we digitize our clinical and educational experiences, humans will still crave the contact of other humans. We might someday completely trust a computer to diagnose breast cancer for us, but would anyone want a computer to break the bad news to our families? Surgical robots might someday drive themselves, but will experienced surgeons and patients cede ultimate surgical authority to a machine? A computer program might automatically track our caloric intake and physical activities, but nothing will replace a motivating human coach.

With all of these changes, faculty will presumably find time for our oft-neglected values. Bedside teaching will experience a renaissance and will focus on skilled communication. Because the Google Generation of residents and students will hold all of the world’s knowledge in the palm of their hands, they will look to faculty to be expert role models. Our medical educators will be able to create a truly streamlined, ultra-efficient learning experience that allows more face-to-face experiences with patients and trainees alike.

So where is academic medicine headed beyond Flexner? Academic physicians will remain master artists, compassionate advisers, and a human face for the increasingly digitized medical experience.

Resident Blogger

Priya Umapathi, M.D.

NEJM Journal Watch General Medicine

Learn more about Insights on Residency Training.