University’s SRC researchers part of study published in Science magazine
STARKVILLE, Miss.--Mississippi State University’s Social Relations Collaborative is part of a large psychology reproducibility study published this past week in Science magazine.

The SRC, a unit of MSU’s nationally recognized Social Science Research Center, participated in the global endeavor that sought to replicate 100 findings published in three prominent psychology journals. The results of this review study appeared in the Aug. 28 issue of Science.

The collaborative is led by Colleen Sinclair, associate psychology professor, and Rebecca Goldberg, assistant counseling and educational psychology professor. They are assisted by undergraduate students Mallorie Miller, Taylor Ritchey, Emily Bullard, Jeri Champion and Mitchell Gressett. Graduate students include Sining Wu, Dominique Simmons, Jessi Dillingham and Chelsey Hess.

The study, which includes a replication performed at Mississippi State, involved 270 researchers on five continents and addressed one of the four tenets of the scientific method: reproducibility.

“I was always taught that the four central tenets of Scientific Method were falsifiability, measurability, generalizability, and reproducibility,” Sinclair said. “Neglecting the latter seems like building a table with only three legs.”

The results of the study show that the independent researchers were able to replicate less than half of the original findings. This result may call into question the validity of some scientific findings, but may also point to the difficulty of conducting effective replications and achieving reproducible results.

“We believe that replication is indeed a unique contribution to current professional literature and should be viewed as such,” said Goldberg. “There are certain journal editors who do not care to publish replication projects and certain social scientists who think that replication is unnecessary; however, we stand behind reproducibility as being necessary for social science and providing unique contributions.”

The article goes beyond simply calculating an initial estimate of the rate of reproducibility in psychology. It also identifies indices by which a study's replicability might be predicted, including the effect size and the size of the p-value.

While less than half of the original findings were replicated, it is important to note that a failure to reproduce does not necessarily mean the original report was incorrect. Nor should these results be taken as evidence that psychology is a poor science.

“Rather, the fact that we are engaging in this self-examination shows that science is working as it should,” Sinclair said. “Validation of findings should not stop at publication. We need to test, and we need to retest.”

Failure to replicate could stem from three basic causes. First, though most replication teams worked with the original authors to use the same materials and methods, small differences in when, where or how the replication was carried out might have influenced results. Second, the replication might have failed, by chance, to detect the original result. Lastly, the original result might have been a false positive.

“Open science is critical to the future of research; our ultimate goal is to increase transparency in science and the benefits therein can have great impact on social science in particular,” Goldberg said.

The Social Relations Collaborative has been an integral part of this ongoing program studying reproducibility. The special issue details the involvement of the SRC in the first phase of the Reproducibility Project. The collaborative plans to continue work to improve openness and reproducibility within psychology, and has joined another collaboration with the Center for Open Science examining practices certain journals have already put in place to improve transparency in research.

To complement its participation in the broad-and-shallow method of testing reproducibility (i.e., many labs individually testing separate studies) employed by the present Science publication, the SRC also has joined forces with the Association for Psychological Science to use a more narrow-and-deep approach (i.e., 10 labs testing one study).

“We believe science is at its best when collaborative and open. We look forward to further representing Mississippi State as a part of this movement to reinforce the integrity of science,” said Sinclair.

For more information on the SRC, please visit http://advancedsocialpsychlab.weebly.com/. Sinclair can be reached at 662-325-9166. Follow the lab's blog at http://advancedsocialpsychlab.weebly.com/relating-results---a-blog.