March 12th, 2012
5 Ways CME Misses the Mark
Some people criticize Continuing Medical Education (CME) programs as spoon-feeding. As a young physician struggling to learn a lot of material, I actually appreciate a bit of ready-to-consume content that I can absorb quickly and effectively. The American College of Cardiology (ACC) and other reputable institutions provide CME programs that, overall, help physicians keep up with the literature and integrate new findings with existing knowledge. However, despite the basic advantages of these programs, several limitations stand out to me as a frequent user of CME. Here are the key ones:
1. The actual content does not match what is advertised. Sometimes CME programs set up false expectations. For example, in a recent CME program offered by the ACC, called “Complex Cardiovascular Cases: Novel Treatment Strategies for High Risk Patients,” the flier suggested that the reader would learn about “existing HDL raising therapies to reduce cardiovascular risk in dyslipidemic patients.” I learned some useful information from that program, but nothing about HDL-raising treatments that reduce cardiovascular risk.
2. The evidence is reported selectively. Sometimes CME programs leave me feeling that I am receiving the extreme expert opinion rather than a balanced review of the evidence. For example, I have seen experts highlight the benefits of omega-3 supplements in improving cardiovascular outcomes, but with little discussion of well-conducted negative trials (e.g., Galan et al., Rauch et al., and Kromhout et al. to name a few).
3. The tests do not focus on matters of clinical significance. The pre-test and post-test questions on CME exams do not necessarily reflect the most fundamental learning points. I have, for instance, encountered questions that ask narrowly about “the average plasma LDL level in hospitalized CAD patients” but few that emphasize the strategies that improve cardiovascular outcomes.
4. Experts with conflicts of interest predominate. Many CME programs are prepared by faculty with strong industry ties. I do not doubt the scientific excellence of such investigators, and I am fully aware of the benefits of collaborative work with industry to generate new and applicable knowledge. Nevertheless, I think adding more panelists who don’t have direct links with industry would be helpful — much like the ACC and AHA policies for development of clinical practice guidelines.
5. The registration process is burdensome. Last, some CME programs mandate a very detailed registration. Some participants use CME only to keep themselves up to date on particular medical topics, and a time-intensive registration process undermines ease of use, potentially discouraging future participation.
I believe that if such issues were addressed appropriately, CME programs would be even more successful in bringing state-of-the-art information to larger numbers of clinicians.
What has been your experience with CME? In particular, have you noticed a difference between commercially sponsored CME programs and those with no direct funding from drug or device companies? Do my critiques resonate with what you have encountered?