Tuesday, January 28, 2014

Oct. 2013 Economist article on Problems with scientific research - How science goes wrong

Last updated on January 29th 2014

Here's an interesting Oct. 2013 Economist article, Problems with scientific research - How science goes wrong.

Some notes and comments of mine:

The article mentions that research groups at a biotech firm and at a drug company could reproduce only a small fraction of the findings reported in important research papers.

The number of scientists has increased from a few hundred thousand in the 1950s to 6-7 million, which has affected the quality of research. [Ravi: That's an increase of over ten times (presuming "a few hundred thousand" means fewer than six hundred thousand). Add "Publish or perish", which the article goes on to mention, and we can see the factors behind the large number of papers of dubious quality being published as "research".]
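As a quick back-of-envelope check of that growth factor, here is a minimal Python sketch. The 1950s range of 300,000 to 500,000 is my own assumption for "a few hundred thousand"; it is not a figure from the article.

```python
# Rough growth factor of the number of scientists, 1950s vs. today,
# using the figures quoted in the Economist article.
# Assumption (mine): "a few hundred thousand" taken as 300,000-500,000.
low_1950s, high_1950s = 300_000, 500_000
low_now, high_now = 6_000_000, 7_000_000

min_factor = low_now / high_1950s   # most conservative: 6m / 500k = 12x
max_factor = high_now / low_1950s   # most generous:     7m / 300k ~ 23x

print(f"Growth factor: roughly {min_factor:.0f}x to {max_factor:.0f}x")
```

Under those assumptions the ratio comfortably exceeds ten, consistent with the note above.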

"Nowadays verification (the replication of other people’s results) does little to advance a researcher’s career. And without verification, dubious findings live on to mislead." [Ravi: Vital point - I did not know about this/realize this. But, as I think about it, I don't think I have heard too much about Computer Science or Information Technology papers challenging the findings/results of other papers. But then I do not read academic publications typically - I usually get to know of research articles in the academic/professional area of my interest ("practice of software development" in academia and industry) from others.]

The article casts some doubt on the effectiveness of the (academic) peer review process and gives an example where many medical journal reviewers failed to catch deliberately inserted errors even though the reviewers were aware they were being tested. [Ravi: I am so happy to see the Economist criticize the academic peer review process. I have undergone, and participated in as a reviewer, a fair amount of fairly decent peer review in the software industry, of designs, specifications as well as code. In contrast, most of the very little peer review of academic papers that I have experienced can be categorized as decent, or rather biased (in my view), or very superficial. I certainly am not that impressed by what little of the academic peer review process I have experienced. There is a great lack of transparency in the review process which, to my mind, allows a reviewer to be unaccountable for his/her review remarks. In contrast, in the industry peer reviews that I have been involved with, the detailed review remarks were available for many interested people to view, and were sometimes even archived along with various versions of the input documents/code. One could see the effect of the review process by examining the pre and post versions of the documents/code (sometimes with multiple rounds of review till the document/code met the required quality). The reviewer was accountable for his/her review remarks - I mean, outrageous review remarks could easily be passed on to neutral knowledge-authority figures who could question the reviewer.]

It suggests monitoring of research in "virtual notebooks" and making data available to other researchers. [Ravi: From what I have seen of academic research, I think the fear of somebody else stealing one's ideas and techniques, and beating one to the research publication, may get in the way of transparent reporting of research work prior to paper publication. But, in many cases, data can certainly be shared once the paper is published. However, sharing of such data (including software design and code for CS and IT work) does not seem to be required or the norm (there are some exceptions, I am given to understand). Without the data it becomes very hard for others to verify the results. So I think some people can get away with making false claims about results in their research publications.]

It suggests improving the peer review process or replacing it with "post-publication evaluation in the form of appended comments", and states that the latter has worked well in some fields in the recent past. [Ravi: I think review remarks should be published along with pre and post versions of the paper. Further, when the reviewers are willing to do so, they should mention their names as well and get credit for their review remarks (or get criticized for poor review remarks). In my experience in the software industry, reviews were not blind - yes, that led to some discomfort at times, but one got over it. After all, the objective is to use the experience of a knowledgeable group of people to improve specifications, design and code. Part of the process of maturing as a software development practitioner is to learn not only to accept good criticism but to value it, as it contributes to improving one's work and to catching flaws early (thereby preventing or controlling the damage these flaws cause or can cause).]

"The false trails laid down by shoddy research are an unforgivable barrier to understanding." [Ravi: This last sentence of the article is a strong one, which seems to be correct. I guess the right word is pollution. Shoddy research papers pollute research publications with the good research papers getting mixed up with the bad. And, if I have understood matters correctly, these publications are viewed as capturing the state of the frontiers of knowledge of various streams. That makes it a serious problem.]

---------------------------

Later I started going through the comments on the article webpage. Some of them are rather shocking, but some suggest solutions. I think it is really worthwhile for those who have not read up on such matters (like me) to have a look at the comments to get a seemingly truthful inside view of the state of science research (and its publications) today. I particularly liked the following comments:

  • CnKQ7pSia6, Oct 17th 2013, 19:30 [Details of an alleged bad medical science experience]
  • chriff, Oct 17th 2013, 20:58 [Short allegation of bad science involving a Nobel laureate]
  • Ohio, Oct 17th 2013, 16:31 [This one suggests a solution for part of the problem (validation)]
  • Hangya, Oct 18th 2013, 15:24 [This one comes across to me as a balanced and wise comment]
