February 27th, 2014

Poor Medical Research: What Has Changed in 20 Years?

In a recent BMJ.com blog, Richard Smith, former Editor of the BMJ, comments on the Lancet series on waste in research and refers to an editorial by statistician Doug Altman – published 20 years ago. I wanted to quote from the Smith blog (which quotes from the Altman editorial) and get some discussion going on our site. I found it to be quite profound.

“Why, asked Altman, is so much research poor? Because ‘researchers feel compelled for career reasons to carry out research that they are ill equipped to perform, and nobody stops them.’ In other words, too much medical research was conducted by amateurs who were required to do some research in order to progress in their medical careers. Ethics committees, who had to approve research, were ill equipped to detect scientific flaws, and the flaws were eventually detected by statisticians, like Altman, working as firefighters. Quality assurance should be built in at the beginning of research not the end, particularly as many journals lacked statistical skills and simply went ahead and published misleading research.

‘The poor quality of much medical research is widely acknowledged,’ wrote Altman, ‘yet disturbingly the leaders of the medical profession seem only minimally concerned about the problem and make no apparent efforts to find a solution.’

Altman’s conclusion was: ‘We need less research, better research, and research done for the right reasons. Abandoning using the number of publications as a measure of ability would be a start.’”

What do you think?

10 Responses to “Poor Medical Research: What Has Changed in 20 Years?”

  1. Tina Dobsevage, MD says:

    I agree with Altman’s conclusion that “abandoning the number of publications as a measure of ability would be a start”. Much of what I read is either poorly done research, a rehash of research done decades ago, or articles that present the obvious as something new.

    Magazines need articles, academic doctors need to publish. It’s a vicious circle.

  2. Shaumik Adhya, MBBS BSc MRCP CCDS says:

    I’m going through the process of writing up my thesis. I entered research to further my career. Instead I received minimal research training and rudimentary supervision, and my work is scientifically pretty poor. The process has taught me that, and it is invaluable for teaching a healthy degree of skepticism, but I can’t help thinking there has to be a more efficient way of doing things! I know the research landscape is changing, but I got caught out.

  3. William Burke, MD says:

    It surely is part of the problem. How much is hard to assess. The balance between the importance of clinical skills and research abilities has changed greatly over the last 50 years. Ironically, academic medicine may have been caught in the trap of perish by publishing. Altman makes a strong case for rethinking our current course.

  4. Gervasio Antonio Lamas, MD says:

    Bemoaning the quantity of research is a bit like bemoaning drunkenness after a wine tasting. We asked for it!

    There are many reasons for the proliferation of research of somewhat mediocre quality.

    1] To start, fellows are required to produce research. So what kind of quality can hundreds or thousands of fellows produce with limited time and mostly no funding? When we evaluate residents for fellowship (I was a PD for 20 years), you always comment if the CV is too thin, right?

    2] ACGME requires academic activity – what is this but research?

    3] 30 years ago only a mainframe could do multivariable analyses – my first year fellows do them on a laptop now.

    4] The more research there is, the more journals there will be, especially with open access journals.

    5] And what about faculty advancement?

    And, of course, as I have previously bemoaned – good research can be hard to recognize by the mandarins in charge.

    So I am not too sympathetic to this dilemma we have caused.

  5. Enrique Guadiana, Cardiology says:

    I agree 100% with the conclusion of Dr. Altman. I just want to point out a different aspect of this problem. To conduct research these days, you not only have to be a professional, you need a great deal of money and resources. These elements are limited and under the control of a few players (government, private industry, etc.). This combination of ingredients draws politics, conflicts of interest, corruption, abuse, etc. into the research community. For many, the research field has become a private club, a royal court full of intrigues. The only way to fix it is to democratize the field: to be inclusive, tolerant, and most of all transparent.

  6. Research can never be as AWE INSPIRING as it was 20–30 years ago, because:

    1. When you don’t have any treatment, any success is welcome. The first beta blocker may have been approved with a small trial population of 100 or so; any new additional beta blocker will need a trial of more than 1 million subjects and will have to show statistically significant results and unmatchable safety, and still may not reap dividends. Most of the breakthroughs are classified as orphan drugs.

    2. Most of the basic health concerns of developed countries (the revenue generators) are fulfilled as of today.

    3. As rightly said, the earlier scientists didn’t take up research for career advancement; today we differ.


  7. Beat J. Meyer, M.D. says:

    As long as the academic community continues to produce waste in biomedical research, the natural consequence will be relentless efforts to improve its quality. At a personal level, honesty and integrity should simply come first when focusing on one’s scientific career.

    What can be done by the research community and by readers? In a recent comment by Sabine Kleinert and Richard Horton, the Lancet editors asked a broader question on “how should the entire scientific enterprise change to produce reliable and accessible evidence that addresses the challenges faced by society and the individuals who make up those societies?”

    The Lancet editors specifically addressed the provocative views of Randy Schekman – winner of the 2013 Nobel Prize in Physiology or Medicine – on prestigious journals and how “aggressively they curate their brands with a gimmick called impact factor”. Schekman attacked Nature, Science, and Cell by calling them “luxury journals” and accusing them of damaging science by selecting research for impact rather than rigor. Schekman and his lab are even boycotting those “luxury journals”, and he is encouraging other scientists to do the same, according to the Lancet.

    Although The Lancet, JAMA, and The New England Journal of Medicine were not specifically mentioned in Schekman’s journal category, Richard Horton opened up a discussion in his journal by launching a series of five papers entitled “Increasing Value, Reducing Waste” in biomedical research.

    It turned out to be a thorough analysis of the most important quality issues in biomedical research today, with solid recommendations on how to increase value, reduce disproportionate waste, and monitor the implementation of these recommendations. Let’s move on in these directions!

  8. In academic research, as Winston Churchill said (about things in general), “Success is the ability to go from one failure to another with no loss of enthusiasm.” The leaders of academic medicine use crude metrics to gauge the worth of research faculty, such as the number of papers published and amount of grant funding, without due consideration of the quality or importance of research to humanity. Teamwork and sharing are strongly disincentivized by the narrow-minded, bottom-line mentality that prevails among senior academic leaders. As with virtually all problems within organizations (per W. Edwards Deming), the problem of poor research quality is due to a lack of effective leadership vision and execution. The individual researchers are only part of the problem. Building big buildings that “bring people from different labs together” is not my idea of visionary leadership. The culture and financing of academia must truly support multidisciplinary teamwork if research quality is to improve.

  9. Brian Scanlan, MD says:

    Is it time for medical researchers to learn how to play together? Try crowd-sourcing.

  10. I agree completely with Altman, but I would go beyond his conclusions. We, as scientists, are responsible for ensuring that published research is important, well-designed, correctly analyzed, and accurately interpreted, but our scientific community has done a poor job for a very long time. In fact, Altman’s critique applies not only to biomedical research, but to research across many other fields as well: psychology, physics, chemistry, agriculture, and many other disciplines.

    The problems that lead to poor research are legion. In addition to inadequate research training, we have the “publish or perish” rule that applies to all academic fields. When evaluating a faculty member for tenure and promotion, what is the first – and sometimes only – thing we look at? Research, of course. And how do we evaluate it? Often, by counting publications and maybe determining the quality of the journals and the amount of funding. In my department, we have tried to go beyond that, but not with full success. We establish a review committee of at least three senior faculty, who are charged with reading everything the candidate has published (not just counting), soliciting outside reviews by senior experts who do research closely related to that of the candidate, and writing a detailed report. That report goes to the entire senior faculty, who then meet, discuss, and vote. Even then, I’ve seen cases where we fail to make a good assessment and other cases where the vote leads to tenuring a borderline faculty member who should never have been tenured; we hate to terminate a colleague.

    What can we do to improve the quality of the research that gets published? We can insist on better research training, more carefully evaluate academicians when tenuring and promoting them, and be more careful as journal reviewers. Don’t recommend publication of a manuscript that has notable flaws, no matter how interesting the study might be. We have to remember that the very appropriate demand in recent times has been to employ only evidence-based treatments. However, if the evidence we rely on is flawed, what good is evidence-based treatment? As journal editors, we can also be very careful in selecting reviewers for any given manuscript. For example, I was recently asked by an optometry journal to review an article, and my publications are all in a completely different field that has absolutely nothing to do with vision in any way. Not a good choice!

    I would like to add to these considerations that the issue of the quality of research published is likely related to the issue of falsifying research that we discussed recently. The same factors – publish or perish, inadequate journal reviews, inadequate evaluation of faculty, and others – allow far too much room for falsifying research, thereby adding to the problem of honest but flawed research.

    Finally, I would add a caveat concerning research quality: Historically, we have seen that sometimes the early research relevant to what eventually becomes an important treatment is poor. However, it provides a clue that leads other researchers to conduct better research, and the treatment turns out to save lives. This is always going to be a tough judgment call, but, as the old saying goes, we have to be careful not to “throw out the baby with the bath water.”