In Research We Trust

“Facts are stubborn things, but statistics are pliable.”
Mark Twain

Anyone who knows me knows I believe in research and data-backed decisions in education.  Successful research is a balancing act between skepticism and an openness to new, sometimes radical, ideas.  To guard against bias, researchers have developed methodologies and techniques to assess the validity of an experiment.  Experimental validity falls into two categories: internal validity, which covers experimental design, data collection, and data analysis; and external validity, which concerns whether results generalize beyond the original study.  Research drives the progression from hypothesis to theory, and finally to fact, with supporting evidence and replication.

Considering how vital replication is to research, there is surprisingly little direct replication.  Makel and Plucker found that only 0.13% of educational research articles are replications (Facts Are More Important Than Novelty: Replication in the Education Sciences), compared with a rate of 1.07% in psychology and 1.2% in marketing research.  However, the replication rate does not tell the whole story.  After all, to publish research, you need to conduct an experiment, submit it for peer review, make changes, and then have the article published.  Perhaps we can simply accept published results.

Looking at actual replication studies suggests that publication is not enough.  One project in psychology, Estimating the reproducibility of psychological science, could not replicate roughly 63% of the studies it examined.  Replications of clinical research fare even worse.  A group at Amgen attempted to reproduce 53 cancer research studies and succeeded with only 6 of them.  Similarly, a group at Bayer HealthCare could replicate only 25% of the preclinical studies it tested (Drug development: Raise standards for preclinical cancer research).

So how do we resolve the replication crisis?  We need to reproduce previous research and publish the results.  The problem is that professors, postdocs, and graduate students don’t benefit from replication studies.  Even when researchers get such articles published, they don’t carry the same weight as original research.  One possibility would be to have graduate students replicate experiments at the beginning of their graduate study as part of their training.  However, this is probably not a workable solution, as it would likely lengthen the time to degree.

So, who would benefit from reproducing research?  The answer is undergraduates.  Conducting replication studies would train students in research methodologies more effectively than any amount of reading.  Why would conducting replication studies help students with research design?  Because if you replicate a study perfectly (exactly as originally conducted), you might reproduce the same problems the original researchers had.  After all, most issues in research are not intentional; they are unintentional, and often unidentifiable, problems with data collection or analysis.

Statistical analysis of most data involves a null hypothesis.  When the data are analyzed, the null hypothesis is either rejected or not.  Errors in testing a null hypothesis are classified as Type I (rejecting a true null hypothesis) or Type II (failing to reject a false null hypothesis).  The critical thing to keep in mind is that it is impossible to eliminate Type I and Type II errors.  Why can’t researchers eliminate them?  Think about a P value, say P < 0.001.  What does that number mean?  Written out in sentence form, P < 0.001 means: if the null hypothesis were true, the likelihood of seeing results at least this extreme by random chance is less than 1 in 1000.  While this is a small number, it is not zero, so there is still a tiny chance that the results are due to random chance.  Since a P value never reaches zero, there is always a chance (sometimes ridiculously small) that results are due to random chance.
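You can see Type I errors happen in a quick simulation.  The sketch below (a minimal illustration, not from any study mentioned here; the sample size, number of trials, and cutoff are arbitrary choices) draws two groups from the *same* distribution, so the null hypothesis is true by construction, yet a test at the conventional 0.05 level still flags about 5% of experiments as “significant”:

```python
import random
import math

def false_positive_rate(trials=10_000, n=50, seed=42):
    """Simulate experiments where the null hypothesis is TRUE:
    both groups come from the same N(0, 1) distribution, so any
    'significant' difference is, by definition, a Type I error."""
    rng = random.Random(seed)
    rejections = 0
    for _ in range(trials):
        a = [rng.gauss(0, 1) for _ in range(n)]
        b = [rng.gauss(0, 1) for _ in range(n)]
        # z-test on the difference of means (sigma is known to be 1)
        z = (sum(a) / n - sum(b) / n) / math.sqrt(2 / n)
        if abs(z) > 1.96:  # |z| > 1.96 corresponds to P < 0.05
            rejections += 1
    return rejections / trials

rate = false_positive_rate()
print(f"Type I error rate at alpha = 0.05: {rate:.3f}")  # close to 0.05
```

No matter how carefully these simulated experiments are run, roughly 1 in 20 of them “finds” an effect that is not there, which is exactly why a single published result can never be the last word.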

In addition to Type I and II errors, there can be problems with sample selection or size, especially early in a line of research, where confounding and masking factors might not yet be known.  Alternatively, limited availability of subjects can lead to sample-size or selection bias.  All these factors mean that a useful replication study tests the same hypothesis and null hypothesis but uses similar, not identical, research methods.
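The sample-size problem can also be made concrete with a small simulation.  In this sketch (the “true” effect of 0.3 and the group sizes are hypothetical values chosen purely for illustration), every experiment studies the same real effect, yet small samples give wildly varying estimates of it, which is one reason an honest replication with a new sample can disagree with the original:

```python
import random
import statistics

def effect_estimates(n, true_effect=0.3, trials=1000, seed=7):
    """Each trial estimates a treatment effect (difference of means)
    from samples of size n per group.  true_effect is a hypothetical
    standardized effect size chosen for illustration."""
    rng = random.Random(seed)
    estimates = []
    for _ in range(trials):
        control = [rng.gauss(0, 1) for _ in range(n)]
        treated = [rng.gauss(true_effect, 1) for _ in range(n)]
        estimates.append(sum(treated) / n - sum(control) / n)
    return estimates

for n in (10, 200):
    est = effect_estimates(n)
    print(f"n={n:3d}: mean estimate {statistics.mean(est):+.2f}, "
          f"spread (stdev) {statistics.stdev(est):.2f}")
```

With 10 subjects per group, individual experiments routinely estimate the effect as near zero or as twice its true size; with 200 per group, the estimates cluster tightly around the truth.  A replication that uses a similar but independent sample helps reveal which situation the original study was in.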

Beyond the benefits in experimental design, students would also gain hands-on research experience, something many groups say is essential for a proper education.  Additionally, replication research is not limited to biology, chemistry, and physics.  Any field that publishes research (i.e., most areas of study) can take part in undergraduate replication research.

Of course, these replication studies will only benefit research if they are published.  We need journals to publish replication studies, so how do we make that happen?  Should a portion of every journal be devoted to replication studies?  The journal Nature says it wants to publish them: “We welcome, and will be glad to help disseminate, results that explore the validity of key publications, including our own.” (Go forth and replicate!).  Hey, Nature, how about really getting behind replication studies?  How about adding a new journal to your stable: Nature: Replication?

However, if we want to disseminate undergraduate replication studies, it may be necessary to create a new journal, perhaps The Journal of Replication Studies.  With all the tools for web publishing and e-magazines, it should be straightforward (I didn’t say free or cheap) to create a fully online, peer-reviewed journal devoted to replication.  Like so many issues, the replication crisis is not just a problem but an opportunity.  Investing in a framework that allows undergraduates to conduct and publish replication research will help everyone.

Thanks for Listening to My Musings
The Teaching Cyborg
