Is the clinical research community too prolific in publishing articles?
In the world of scientific research, the “importance” of a published article is traditionally measured in part by how many times it is cited in subsequent articles. Thus, in a perfectly efficient system, 100% of articles that make it to publication would contribute useful information to the field and would subsequently be cited. In the cardiovascular literature, at least, this seems to be far from the case.
At the 7th International Peer Review Congress held in Chicago last weekend, Isuru Ranasinghe, MD, PhD, of Yale University in New Haven, CT, presented findings from a study investigating this issue to hundreds of members of the biomedical publishing and research community. Ranasinghe and colleagues looked at more than 18 000 original articles published in 144 cardiovascular medicine journals from 2004 through 2008 and assessed how many times each had been cited in the 5 years after publication.
They found that about 15% of the articles (2756 total) were not cited at all in 5 years, and another 33% (6122 total) were cited only 1 to 5 times over the same period. In other words, about half of the articles published in the cardiovascular literature were either uncited or cited 5 or fewer times in the 5 years following publication (8878 of 18 411 total from 2004-2008). The percentage of uncited articles did not significantly increase from year to year, but the overall number of articles published did, growing from fewer than 14 000 in 2004 to 18 000 in 2008.
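As a quick sanity check, the reported proportions can be recomputed from the raw counts given above (a minimal sketch; all figures come from this post's summary of the study, and the variable names are illustrative only):

```python
# Recompute the citation proportions reported in the study summary.
total_articles = 18_411   # original articles, 2004-2008
uncited = 2_756           # 0 citations in the 5 years after publication
rarely_cited = 6_122      # 1-5 citations in the 5 years after publication

print(f"Uncited: {uncited / total_articles:.1%}")               # ~15.0%
print(f"Cited 1-5 times: {rarely_cited / total_articles:.1%}")  # ~33.3%

combined = uncited + rarely_cited
print(f"Uncited or rarely cited: {combined} ({combined / total_articles:.1%})")
# 8878 articles, ~48.2% -- i.e., "about half," as stated above
```

The counts add up as reported: 2756 + 6122 = 8878, which is roughly 48% of the 18 411 articles in the sample.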
Ranasinghe suggested that the data may indicate a substantial amount of wasted effort, time, and funding in the world of cardiovascular research. Some audience members agreed, but others felt that not all uncited articles could be viewed as truly wasted endeavors and that the effort to add to scientific knowledge is generally more important than citations.
Categories: Journalology/Peer Review/Authorship