Saturday, July 10, 2021

Most Scientists Don't Follow Formal Evidence Standards, Unlike Judges

The www.realclearscience.com site is a typical "science news" site: a strange mixture of hard fact, speculation, often-dubious opinions, spin, clickbait, hype and corporate propaganda, all under the banner of "science."  I noticed an enormous contrast between an article the site linked to yesterday and another it linked to today.

Yesterday's link was to a very give-you-the-wrong-idea article by scientist Adam Frank, one with the swaggering title "The most important boring idea in the universe."  The idea that Frank says is so important is the claim that "scientific knowledge" rests upon "mutually agreed standards of evidence." 

Frank attempts to persuade us that after arguing for a long time, scientists agreed on "standards of evidence" that they are now faithfully following. He writes the following:

"There were lots of wrong turns in figuring out what counted as meaningful evidence and what was just another way of getting fooled. But over time, people figured out that there were standards for how to set up an experiment, how to collect data from it, and how to interpret that data. These standards now include things like isolating the experimental apparatus from spurious environmental effects, understanding how data collection devices respond to inputs, and accounting for systematic errors in analyzing the data. There are, of course, many more."

The idea that Frank tries to plant is a false one. Scientists never agreed upon some "standard of evidence" to be used in judging how experiments or observations should be done, or whether scientific papers should be published or publicized.  There is no formal written "standard of evidence" used by scientists. By contrast, courts actually do make use of formal written standards of evidence. 

When you go to www.rulesofevidence.org, you will find the Federal Rules of Evidence used in US federal courts.  That page lists about 68 numbered rules of evidence. Here are some examples:

  • Rule 404: "Evidence of a person’s character or character trait is not admissible to prove that on a particular occasion the person acted in accordance with the character or trait."  (There are quite a few exceptions listed.) 
  • Rule 608: "A witness’s credibility may be attacked or supported by testimony about the witness’s reputation for having a character for truthfulness or untruthfulness, or by testimony in the form of an opinion about that character. But evidence of truthful character is admissible only after the witness’s character for truthfulness has been attacked."
  • Rule 610: "Evidence of a witness’s religious beliefs or opinions is not admissible to attack or support the witness’s credibility." 

There are more than 60 other rules in the Federal Rules of Evidence. US federal courts have a formal written set of evidence standards; scientists have no such thing.  The impression that Frank attempts to insinuate (that scientists operate under formal standards of evidence that they carefully worked out after long debate) is simply not correct.

There are no formal detailed written evidence standards in any of the main branches of science.  In biology, poorly designed experiments following bad practices are extremely common.  In theoretical biology and physics, it is extremely common for scientists to publish papers based on the flimsiest or wildest of speculations. When we read scientific papers such as those speculating about a multiverse consisting of many unobserved universes, we are obviously reading papers written by authors following no standards of evidence at all. It's pretty much the same for any of the thousands of papers that have been written about never-actually-observed things such as abiogenesis, dark matter, dark energy or primordial cosmic inflation.

In fields such as paleontology, elaborate speculation papers can be based on the flimsiest piece of ancient matter or the tiniest bone fragment, and many papers in that field are not based on specific fossils at all.  Then there are endless chemistry papers based not on actual physical experiments but on "chemical reactions" occurring merely on paper, a blackboard, or inside a computer program. Countless papers in many fields rest on mere computer simulations or abstruse speculative math rather than physical experiments or observations. 

The day after linking to Frank's article, www.realclearscience.com linked to an article that very much contradicted his insinuation that scientists adhere to sound standards of evidence. The link was to an article on www.reason.com entitled "How Much Scientific Research Is Actually Fraudulent?"

Here are some quotes from the article:

"Fraud may be rampant in biomedical research. My 2016 article 'Broken Science' pointed to a variety of factors as explanations for why the results of a huge proportion of scientific studies were apparently generating false-positive results that could not be replicated by other researchers. A false positive in scientific research occurs when there is statistically significant evidence for something that isn't real (e.g., a drug cures an illness when it actually does not). The factors considered included issues like publication bias, and statistical chicanery associated with p-hacking, HARKing, and underpowered studies....A 2015 editorial in The Lancet observed that 'much of the scientific literature, perhaps half, may simply be untrue.' A 2015 British Academy of Medical Sciences report suggested that the false discovery rate in some areas of biomedicine could be as high as 69 percent. In an email exchange with me, Ioannidis estimated that the nonreplication rates in biomedical observational and preclinical studies could be as high as 90 percent....Summarizing their results, an article in Science notes, 'More than half of Dutch scientists regularly engage in questionable research practices, such as hiding flaws in their research design or selectively citing literature. And one in 12 [8 percent] admitted to committing a more serious form of research misconduct within the past 3 years: the fabrication or falsification of research results.' Daniele Fanelli, a research ethicist at the London School of Economics, tells Science that 51 percent of researchers admitting to questionable research practices 'could still be an underestimate.' "
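The p-hacking mentioned in the quote above is easy to put into numbers. If a researcher measures twenty different outcomes on the same data and there is no real effect anywhere, each test still has a 5 percent chance of coming out "statistically significant" purely by chance, so the odds of at least one spurious "discovery" are about 64 percent. Here is a minimal sketch in Python; the twenty-outcomes setup is an illustrative assumption of mine, not a figure from the article:

```python
import random

random.seed(0)

ALPHA = 0.05         # conventional per-test significance threshold
N_TESTS = 20         # outcome measures tried on the same data set (assumed)
N_STUDIES = 100_000  # simulated "studies" with no real effect anywhere

# Under the null hypothesis, each test is "significant" with probability
# ALPHA purely by chance.  A researcher who tries N_TESTS outcomes and
# reports whichever one "works" gets at least one false positive with
# probability 1 - (1 - ALPHA) ** N_TESTS.
hits = sum(
    any(random.random() < ALPHA for _ in range(N_TESTS))
    for _ in range(N_STUDIES)
)
rate = hits / N_STUDIES
print(f"Studies with at least one spurious 'significant' result: {rate:.2f}")
```

The simulated rate lands near the analytic value 1 − 0.95²⁰ ≈ 0.64: nearly two-thirds of effect-free studies can still yield a publishable-looking "finding" when the researcher is free to keep testing.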

Such comments are consistent with my own frequent examination of neuroscience research papers. When I examine such papers, I find that questionable research practices were used most of the time. Almost always, the papers use study group sizes smaller than the reasonable standard of at least 15 subjects per study group, which means a high chance of a false alarm. Most of the time the papers fail to show evidence that any blinding protocol was used, even though the detailed spelling out and following of a rigorous blinding protocol is essential for almost any experimental neuroscience study to be regarded as reliable. Few papers follow the standard of pre-registering a hypothesis and the methods for data gathering and analysis, leaving the researchers free to follow an approach rather like "torture the data until it confesses" to whatever the researcher is hoping to find. 
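The combined effect of small study groups (low power) and a conventional significance threshold can also be put into numbers. Suppose, purely as illustrative assumptions of mine (not figures from any cited paper), that only 10 percent of tested hypotheses are actually true, that underpowered studies detect a real effect only 20 percent of the time, and that the false-alarm threshold is the usual 0.05. Then among the "significant" results, false alarms outnumber real findings:

```python
# Illustrative false-discovery arithmetic; all three numbers below are
# assumptions chosen for this sketch, not measurements from the literature.
prior = 0.10   # assumed fraction of tested hypotheses that are actually true
power = 0.20   # assumed chance an underpowered study detects a real effect
alpha = 0.05   # chance a study "detects" an effect that isn't there

true_positives = prior * power          # 0.10 * 0.20 = 0.020
false_positives = (1 - prior) * alpha   # 0.90 * 0.05 = 0.045
false_discovery_rate = false_positives / (true_positives + false_positives)
print(f"Fraction of 'significant' findings that are false: "
      f"{false_discovery_rate:.2f}")
# With these assumptions: 0.045 / 0.065, about 0.69 -- roughly two in three.
```

Raising the assumed power (that is, using larger study groups) is the main lever that brings this fraction down, which is why the too-small group sizes noted above matter so much.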


What this means is that the great majority of times you read about some neuroscience research on some science news site, you are reading about an unreliable result that should not be taken as robust evidence of anything. 



Frank mentioned "best practices," trying to insinuate that scientists follow such practices. He failed to tell us about the large fraction of scientists who follow shoddy practices.  Frank attempted to portray scientists as faithful rule-followers acting like judges in a court. But it seems that a large fraction of scientists are more like cowboys in the Wild West, doing pretty much whatever they fancy.  And so many of the gun blasts from such cowboys are just noise. 
