Peer Review

Saturday, Oct 01, 2011

Over the past 12 years, anesthesiologist Scott Reuben revolutionized the way physicians provide pain relief to patients undergoing orthopedic surgery for everything from torn ligaments to worn-out hips. Now, the profession is in shambles after an investigation revealed that at least 21 of Reuben's papers were pure fiction, and that the pain drugs he touted in them may have slowed postoperative healing.

Paul White, another editor at the journal Anesthesia & Analgesia, estimates that Reuben's studies led to the sale of billions of dollars worth of the potentially dangerous drugs known as COX2 inhibitors, Pfizer's Celebrex (celecoxib) and Merck's Vioxx (rofecoxib), for applications whose therapeutic benefits are now in question. Reuben was a member of Pfizer's speaker's bureau and received five independent research grants from the company.

A 2007 editorial in Anesthesia & Analgesia stated that Reuben had been at the "forefront of redesigning pain management protocols" through his "carefully planned" and "meticulously documented" studies... In 2004, Vioxx and Bextra were pulled from the market because of their link to an increased risk of heart attacks and strokes, leaving Pfizer's Celebrex as the only COX2 inhibitor available. Celebrex sales plunged 40 percent after a study that same year suggested that it, too, posed a heart attack risk. Despite this, Reuben continued to present "findings" in research funded by Pfizer that trumpeted Celebrex's alleged benefits and downplayed its potential negative side effects.

The question is: Why did it take 12 years before a "routine audit" revealed Reuben's widespread data fabrication? "Baystate publishes about 200 [studies] every year, and of those [articles], the audit rate might only be 5 percent."

A Medical Madoff: Anesthesiologist Faked Data in 21 Studies, Brendan Borrell, March 10, 2009

How can two (reasonably) well-regarded organisations peer review the same work -- Ewen and Pusztai's research on the effects of feeding genetically modified potatoes to rats -- and yet come to such radically opposite conclusions about its validity, as did the Royal Society and The Lancet? All six Royal Society reviewers pronounced the research "flawed", while five out of six of The Lancet's reviewers judged that Ewen and Pusztai's work should be published. Peer review as a reliable technique for assessing the validity of scientific data is surely discredited.

The mistake, of course, is to have thought that peer review was any more than a crude means of discovering the acceptability -- not the validity -- of a new finding. Editors and scientists alike insist on the pivotal importance of peer review. We portray peer review to the public as a quasi-sacred process that helps to make science our most objective truth teller. But we know that the system of peer review is biased, unjust, unaccountable, incomplete, easily fixed, often insulting, usually ignorant, occasionally foolish, and frequently wrong. A recent editorial in Nature was right to conclude that an over-reliance on peer-reviewed publication "has disadvantages that should be countered by adequate provision of time and resources for independent assessment..."

Genetically modified food: consternation, confusion, and crack-up, Richard Horton, Editor, The Lancet, The Medical Journal of Australia, 2000

One suspects that peer review is a bit like democracy -- a bad system, but the best one possible. It seems to be one that takes different forms in different (scientific) cultures and can be tweaked to improve its operation. Let us hope that future research will discover and disseminate the best ways to fine-tune the system within the constraints of each type of journal.

Quality and value: How can we research peer review?, Joan E. Sieber, Nature, 2006

Why Most Published Research Findings Are False