The Perception Lab is committed to implementing as many open science practices as possible in our research. Below, you will find information about what this looks like for the Perception Lab, and resources to learn more about open science.

What We Do

As a lab, we create an Open Science Framework (OSF) page for each of our projects. Our OSF pages include the preregistration for each study, publicly available data and analysis code, and when possible, a copy of our experiment. We also publish preprints of our papers on PsyArXiv.

Our lab is a member of the Psychological Science Accelerator, and we regularly incorporate replications of previous findings into our studies.

Julia teaches a course called Psychology’s Credibility Revolution that is fully accessible to the public and addresses many issues related to open science.

So you’re thinking about preregistering your research?

Preregistering is a fantastic first step toward implementing open science practices in your research. Preregistering means that you make all the decisions about your experiment (sample size, exclusion criteria, analyses, etc.) before you start data collection. This forces researchers to think through their research before conducting it, and it should reduce the number of Questionable Research Practices (QRPs) that can occur, even accidentally. Read more about QRPs.

If you want to learn more about preregistering, check out this APS article from 2016. It walks through some problems that a lack of preregistration can lead to, explains how preregistering can begin to address those problems, briefly describes the types of preregistration, and answers some FAQs. The Center for Open Science (COS) has also published a list of questions and answers about preregistration.

As mentioned above, we preregister each of our studies. Here are a few examples:

Open Science Resources

Want more information on open science, specifically psychological open science? Here you will find additional resources, including journal articles, articles in newspapers and other websites, and some open science organizations and groups.


  • Flake, J. K., & Fried, E. I. (2020). Measurement schmeasurement: Questionable measurement practices and how to avoid them. Advances in Methods and Practices in Psychological Science. Link.
  • Flake, J. K., Pek, J., & Hehman, E. (2017). Construct validation in social and personality research: Current practice and recommendations. Social Psychological and Personality Science, 8(4), 370–378. Link.
  • Naro, M. (2016, October 6). Repeat After Me. Cartoon published in The Nib. Link.
  • Open Science Collaboration. (2015). Estimating the reproducibility of psychological science. Science, 349(6251), aac4716. Link.
  • R. (2015, January 24). Questionable Research Practices: Definition, detect, and recommendations for better practices. Link.
  • Resnick, B. (2017, July 31). What a nerdy debate about p-values shows about science — and how to fix it. Article published in Vox. Link.
  • Simmons, J. P., Nelson, L. D., & Simonsohn, U. (2011). False-positive psychology: Undisclosed flexibility in data collection and analysis allows presenting anything as significant. Psychological Science, 22(11), 1359–1366. Link.
  • Simons, D. J. (2014). The value of direct replication. Perspectives on Psychological Science: A Journal of the Association for Psychological Science, 9(1), 76–80. Link.

Other Resources: