The obsession with indices is killing science

In 1963, the physicist and historian of science Derek de Solla Price studied the growth trends of scientific publishing and concluded that the system would sooner or later collapse. The number of scientists and of their publications had been growing exponentially for 250 years, which he considered unsustainable. He predicted that within a few generations we would reach a point with, on average, two scientists per family. Already then he concluded that science would have to transform itself from a state of exponential growth into something radically different and new if it wished to preserve the status and reputation it had built up in society over the centuries.

In the comment "The pressure to publish pushes down quality" in Nature, Daniel Sarewitz asks whether scientists should be encouraged to publish less. The easy availability of bibliometric data has made scientists (and their funders and employers) obsessed with numerical measures of productivity and impact, to the point that they now compare and evaluate one another almost exclusively through these numbers. (In Slovenia we have gone so far that it seems acceptable, in the citation for the Zois Award, the country's highest state honour for scientists, to justify the choice by stating that the laureate has many Cobiss points or a high h-index!)

The quality problem has reared its head in ways that Price could not have anticipated. Mainstream scientific leaders increasingly accept that large bodies of published research are unreliable. But what seems to have escaped general notice is a destructive feedback between the production of poor-quality science, the responsibility to cite previous work and the compulsion to publish.

The quality problem has been widely recognized in cancer science, in which many cell lines used for research turn out to be contaminated. For example, a breast-cancer cell line used in more than 1,000 published studies actually turned out to have been a melanoma cell line. The average biomedical research paper gets cited between 10 and 20 times in 5 years, and as many as one-third of all cell lines used in research are thought to be contaminated, so the arithmetic is easy enough to do: by one estimate, 10,000 published papers a year cite work based on contaminated cancer cell lines. Metastasis has spread to the cancer literature.
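
A rough reconstruction of that back-of-envelope arithmetic (the figure of roughly 2,500 to 5,000 published papers built on contaminated lines is my own illustrative assumption; the comment itself gives only the citation rate, the contamination share and the final estimate):

\[
\frac{10\text{--}20~\text{citations}}{5~\text{years}} \approx 2\text{--}4~\frac{\text{citations}}{\text{paper}\cdot\text{year}},
\qquad
\underbrace{2{,}500\text{--}5{,}000}_{\substack{\text{papers on contaminated}\\\text{lines (assumed)}}} \times \left(2\text{--}4\right) \;\approx\; 10{,}000~\frac{\text{citing papers}}{\text{year}}.
\]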

… So yes, the web makes it much more efficient to identify relevant published studies, but it also makes it that much easier to troll for supporting papers, whether or not they are any good. No wonder citation rates are going up.

That problem is likely to be worse in policy-relevant fields such as nutrition, education, epidemiology and economics, in which the science is often uncertain and the societal stakes can be high. The never-ending debates about the health effects of dietary salt, or how to structure foreign aid, or measure ecosystem services, are typical of areas in which copious peer-reviewed support can be found for whatever position one wants to take — a condition that then justifies calls for still more research.

(Source: The pressure to publish pushes down quality : Nature News & Comment)
