Symposia
Research Methods and Statistics
Cassandra M. Brandes, M.A.
Northwestern University
Evanston, Illinois
Christopher Hopwood, PhD
Professor
University of Zurich
Zurich, Zurich, Switzerland
Aleksandra Kaurin, PhD
Assistant Professor
Private Universität Witten/Herdecke gGmbH
Witten, Nordrhein-Westfalen, Germany
Kevin King, PhD
Associate Professor
University of Washington
Seattle, Washington
Solomon Kurz, PhD
Clinical Research Psychologist
Central Texas Veterans Healthcare System
Waco, Texas
Priscilla Lui, PhD
Assistant Professor
Southern Methodist University
Dallas, Texas
Shirley B. Wang, M.A.
PhD Candidate
Harvard University
Cambridge, Massachusetts
Rowan Hunt, BA
PhD student
University of Louisville
Louisville, Kentucky
Olivia Kirtley, PhD
Senior research fellow
KU Leuven
Leuven, Vlaams-Brabant, Belgium
Jennifer Tackett, PhD
Professor
Northwestern University
Evanston, Illinois
In the past decade, psychological scientists have embarked on an unprecedented period of scientific self-examination and methodological advancement referred to as “the replication crisis” (Pashler & Wagenmakers, 2012), “psychology’s Renaissance” (Nelson et al., 2018), and the “credibility revolution” (Vazire, 2018). By any name, the finding that many influential psychological studies do not replicate has led scientists to reconsider how we conduct our research. In the wake of this movement, open science methods have been developed with the aim of improving the credibility of research findings. However, these methods have not seen widespread implementation (Hardwicke et al., 2020); there has been significant variability, both between and within subfields, in the degree to which applied researchers engage with open science methods. This implementation gap is particularly evident in subfields such as clinical psychology.
In this talk, I discuss potential contributors to this uneven uptake of open science methods and offer suggestions for addressing them. Scientific reform efforts are, themselves, behavioral interventions, and as with other interventions (e.g., therapy), education alone is often insufficient for effecting change. Because clinical psychological research focuses extensively on best practices in behavioral intervention, we maintain that its lessons could greatly enhance scientific reform efforts. Drawing on metascientific studies, our own observations, and the clinical intervention literature, my coauthors and I consider how scientists’ ambivalence about changing their methods may reflect both structural barriers within the academy and suboptimal communication strategies by methodologists. Finally, we offer concrete recommendations, grounded in intervention science, for addressing these challenges. These include defining and assessing open science outcomes more systematically, adopting a perspective of cultural humility, attending to the working alliance between methodologists and applied researchers, acknowledging ambivalence about changing research practices, and facilitating mastery of open science tools.