Science's greatest strength is its capacity to correct its mistakes, but that correction doesn't always happen. Professor Jodi Schneider is trying to bring these failures to the surface, making an example of a paper that was retracted for fraud yet has been cited so often that it is contaminating its field with falsehoods.
“A scientific paper that is right is like a brick to build walls of evidence we can rely on,” Schneider said in a statement. Draw your own conclusions about the implications of the wrong ones.
Schneider is assistant professor of information sciences at the University of Illinois Urbana-Champaign, but the paper she is referring to is a medical one. Initially published in Clinical Investigations COPD in 2005, the paper concluded that omega-3 fatty acids reduce measures of inflammation in patients with chronic obstructive pulmonary disease (COPD).
With COPD being the world's third-biggest killer (in ordinary years), a widely available dietary supplement could save many lives if it truly worked. However, in 2008, the first author's employer found he'd made up the data that this – and many other papers – relied on. Sadly, that hasn't prevented the paper from being treated as a reliable source of information.
Schneider found 148 direct citations in other scientific papers, 112 of them published post-retraction. In 41 percent of cases the original work was described in detail, yet only five of these citations mentioned the retraction, all in papers critical of its conclusions. These citing papers were in turn cited by 2,542 others.
Almost 150 citations isn't the scientific equivalent of a gold record, but it puts the paper far above the average – exceeding the most successful papers of many good scientists.
Even longer-lasting errors have been found, but these were from a time before online publication when opportunities to alert readers to retractions were more limited.
Unfortunately, Schneider concludes in Scientometrics, it's probably not an isolated case. Retracted articles are a “really, really tiny” proportion of publications, she said. “It’s a regular occurrence that things get retracted, but people publish so much.” Over the last eight years there have been about four retractions per 10,000 papers, Schneider found, although the proportion is higher in medicine. With millions of scientific publications each year, that still amounts to a significant number.
“It’s really rare for any journal to be scanning reference lists,” Schneider added, which means there could be other examples of misleading papers polluting the pool of knowledge.
The most famous example of a retracted paper in recent years was Andrew Wakefield's hit job on the MMR vaccine. It's easy to imagine that most retractions, like it, result from outright fraud. However, honest mistakes can lead to retractions as well, whether from faulty data collection or flaws in the analysis. “Part of the challenge is understanding the reasons why a particular article was retracted, which often are really vague,” Schneider said.
Students in Schneider's department built a tool called ReTracker to automatically check for retractions and alert scholars. Alongside tools like this, Schneider proposes standardizing how journals respond when retracted papers have been cited, rather than relying on the existing patchwork of practices.