Written by: Gerit Pfuhl, Associate professor in psychology at UiT and NTNU.
Recent years have seen a revolution in publishing and broad support for open access publishing. Acceptance of, and transition to, other open science principles such as open data, open materials, and preregistration has been slower. To accelerate this transition and make open science the new standard, the Collaborative Replications and Education Project (CREP; http://osf.io/wfc6u/) was launched in 2013, hosted on the Open Science Framework (osf.io). CREP introduces open science at the start of academic research, facilitating student research training in open science and solidifying behavioral science results.

The CREP team pursues this by inviting contributors to replicate one of several studies selected for scientific impact and for suitability for undergraduates to complete within one academic term. Contributors follow clear protocols, and students interact with a CREP team that reviews their materials and a video of their procedure to ensure quality data collection while the students learn scientific practices and methods. By combining multiple replications from undergraduates across the globe, the findings can be pooled for meta-analysis and so contribute to generalizable and replicable research findings. CREP is careful not to interpret any single result.

CREP has recently joined forces with the Psychological Science Accelerator (PsySciAcc), a globally distributed network of psychological laboratories accelerating the accumulation of reliable and generalizable results in the behavioral sciences (see Peder Isager’s article, page 61 in the printed edition). Here, I will briefly provide the rationale for CREP and describe how it works. Hopefully, you will find a study you want to join.
We aim to facilitate student research training and solidify research findings through student participation in large-scale replication efforts.
Replications are a crucial component of the scientific method (Asendorpf et al., 2013) as well as an effective pedagogical tool (Frank & Saxe, 2012; Grahe et al., 2012). CREP combines the pedagogical benefits of replication with the replication of important findings. The studies chosen for students to replicate are selected for interest and for feasibility as undergraduate projects. Participation in these replication studies leads to recognition as a contributor when the results are published; that is, student groups share research credit. We also welcome contributors to the writing process, and we follow APA guidelines for authorship, meaning students are given due credit. The progress of each project is monitored by the project coordinators, thereby ensuring no data loss. Note that the focus is on learning how to perform transparent, replicable research, not on whether an outcome is statistically significant.
The long-term goal of CREP is to raise the standard of psychological methods training. Replications are not only necessary; they are also an excellent way to learn how to conduct research. It is how you learn physics in school – you replicate classic experiments. We hope that fully incorporating replication studies into the curriculum becomes the new standard and habit.
As a contributor to a CREP study, you are provided with the original materials and often with more detailed instructions than in the original paper. This is a highly structured way to learn the “hows” of research without completely erasing the challenges of empirical studies. In particular, we think it encourages proposals of additional factors (so-called moderators or mediators) relevant to a research finding, stimulating creativity.
As a valuable by-product, CREP provides data from different labs that can be used in future meta-analyses or cross-cultural analyses. Beyond the pedagogical benefits, this also contributes to more reliable data and fosters theory development in psychology.
Such meta-analytical studies can be initiated by any contributor, and anyone can lead the analysis and write-up of a project’s research report. A possible add-on to the original study is, for example, comparing EU and US samples, or looking for education or age effects.
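To make the pooling across labs concrete, here is a minimal sketch of a random-effects meta-analysis (DerSimonian–Laird), the standard approach for combining effect sizes from independent replications. The effect sizes and variances below are made-up illustrations, not CREP data:

```python
import math

def random_effects_meta(effects, variances):
    """DerSimonian-Laird random-effects pooling of per-lab effect sizes.

    effects:   per-lab effect sizes (e.g. Cohen's d) -- illustrative values
    variances: sampling variance of each effect size
    """
    k = len(effects)
    w = [1.0 / v for v in variances]                      # fixed-effect weights
    theta_f = sum(wi * y for wi, y in zip(w, effects)) / sum(w)
    q = sum(wi * (y - theta_f) ** 2 for wi, y in zip(w, effects))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)                    # between-lab variance
    w_star = [1.0 / (v + tau2) for v in variances]        # random-effects weights
    pooled = sum(wi * y for wi, y in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return {"pooled": pooled, "se": se, "tau2": tau2,
            "ci": (pooled - 1.96 * se, pooled + 1.96 * se)}

# Hypothetical effect sizes from four student labs replicating the same study
result = random_effects_meta([0.10, 0.55, 0.80, 0.20], [0.04, 0.05, 0.06, 0.03])
```

The random-effects model is the natural choice here because student labs differ in samples and settings, so some between-lab variability (tau2) is expected rather than treated as error.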
CREP uses the fantastic Open Science Framework (osf.io), which provides easy access to CREP’s main project page, hosting all materials. It even has a built-in preprint server, and the entire project can be managed on the OSF page.
There is a range of studies currently looking for replicators. If you wonder how these studies were selected: they were the top three cited empirical articles in a range of psychology journals, and were then rated for feasibility by Dr. Mark Brandt and Dr. Hans IJzerman. However, it is also possible to suggest studies that are classics in your field of psychology and, crucially, are feasible for students at the bachelor level.
If you are interested, below are ongoing studies, and yes, you can become a replicator of these studies.
- Diener, E., Ng, W., Harter, J., & Arora, R. (2010). Wealth and happiness across the world: material prosperity predicts life evaluation, whereas psychosocial prosperity predicts positive feeling. Journal of Personality and Social Psychology, 99, 52. Study 1
- Griskevicius, V., Tybur, J. M., & Van den Bergh, B. (2010). Going green to be seen: Status, reputation, and conspicuous conservation. Journal of Personality and Social Psychology, 98, 392-404. Study 1
- Kool, W., McGuire, J. T., Rosen, Z. B., & Botvinick, M. M. (2010). Decision making and the avoidance of cognitive demand. Journal of Experimental Psychology. General, 139, 665. Study 3
- De Neys, W., Rossi, S., & Houdé, O. (2013). Bats, balls, and substitution sensitivity: Cognitive misers are no happy fools. Psychonomic Bulletin & Review, 20, 269-273. Study 1
- Tentori, K., Crupi, V., & Russo, S. (2013). On the determinants of the conjunction fallacy: Probability versus inductive confirmation. Journal of Experimental Psychology: General, 142, 235-255. Study 3
- Feng, S., D’Mello, S., & Graesser, A. C. (2013). Mind wandering while reading easy and difficult texts. Psychonomic Bulletin & Review, 20, 586-592. Study 1
Together with a master’s student, I am currently running Study 3 of Kool et al. (2010). We went through the steps outlined below and are now collecting data.
A brief “How To” guide
The first step is to contact CREP by emailing Jon Grahe (email@example.com), asking to be added as a CREP contributor and referring to the OSF account and project page you have made for the project. Give a brief description of yourself (including educational background, home institution, and contact information), your supervisor (if any), the study you have selected for replication, and the names of any peers you will be working with.
Second, you should specify whether you will do a DIRECT replication or whether you will add conditions or measures, making it a DIRECT+ replication. To allow meta-analysis, any additional measures must come AFTER the direct replication procedures and measures.
Third, use and build your OSF project page. In practice, you will fork the main project page. Forking means that all material on the “parent” (main) project page also becomes available on your own OSF project page. You share the forked page with CREP so that we can verify that the materials, procedures and, importantly, the permission from your institutional review board (IRB; see below) are in place. Sharing the link is also evidence of your participation in the project. The OSF page lets you upload translated materials, your pre-registration, and all raw data, processed data, and data analysis scripts. It also provides a wiki – a perfect place to note metadata, changes in exclusion criteria, exclusion of data due to technical issues, etc.
Fourth, as with most experimental studies, it is a good idea to have the study evaluated by an internal ethics committee. Most universities in Norway (e.g., UiO, UiT) have IRBs. As of May this year, you should also include a section on the GDPR (General Data Protection Regulation). I will not go into details here, but, briefly, you should clearly state which data will be made open access and how you meet the anonymity requirements. In my experience, using randomly generated IDs and deleting the date of testing complies well with both anonymity and open data.
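That anonymization step can be sketched in a few lines. The field names (`name`, `test_date`) and record layout below are my own hypothetical example, not a CREP requirement; adapt them to whatever identifiers your study records actually contain:

```python
import secrets

def anonymize(records, drop_fields=("name", "test_date")):
    """Strip identifying fields and assign a random participant ID.

    The default field names are hypothetical examples; pass your own
    drop_fields matching the identifiers in your data.
    """
    anonymized = []
    for record in records:
        # Keep everything except the identifying fields
        clean = {k: v for k, v in record.items() if k not in drop_fields}
        clean["participant_id"] = secrets.token_hex(4)  # random 8-char hex ID
        anonymized.append(clean)
    return anonymized

# Example: two hypothetical lab records before sharing as open data
raw = [
    {"name": "Kari Nordmann", "test_date": "2019-03-01", "rt_ms": 512},
    {"name": "Ola Nordmann", "test_date": "2019-03-02", "rt_ms": 498},
]
open_data = anonymize(raw)
```

Because the random IDs are generated fresh and no key linking them to names is kept, the shared dataset cannot be traced back to individual participants.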
The fifth step is filming video evidence of procedural mastery. Footage from a mobile phone camera is sufficient: you record how you instruct a participant (with English subtitles, if applicable). Practice and pretesting are, of course, integral parts of all experimental studies, and by having us assess your procedure, you gain the confidence and skill needed to collect high-quality data.
Sixth, you collect data. We expect you to test at least as many participants as in the original study; the studies are selected so that this is feasible. Often, this means testing 30 to 50 participants for lab-based experiments and 70 to 100 for short online experiments. Everyone who contributed to data acquisition and the project has to sign the completion pledge quoted below and upload it to the OSF page.
«We certify that the data reported for this study were collected with approval of an Institutional Review Board following APA Ethical Guidelines, that the defined study protocol was followed to the best of our abilities, and any disruptions to that protocol for individual participants or for the data collection as a whole are disclosed in the «summary notes» report about the data collection.»
Another integral part of open science is providing open data (yes, you can earn an «Open Data» badge). This increases the value of your contribution.
Finally, let us know if you have presented your project at a conference. We encourage students (i.e., you) to take the lead on the manuscript for publication. We assist with project management and with coordinating the different labs replicating the same study as you, so that the results can be collated and submitted to a peer-reviewed journal. Interested? Visit our FAQ page: https://osf.io/hocj2/wiki/home/
In sum: The CREP is a crowdsourced replication project for undergraduate researchers and you can be part of it!
Asendorpf, J. B., Conner, M., De Fruyt, F., De Houwer, J., Denissen, J. J., Fiedler, K., Fiedler, S., Funder, D. C., Kliegl, R., Nosek, B. A., Perugini, M., Roberts, B. W., Schmitt, M., van Aken, M. A., Weber, H., & Wicherts, J. M. (2013). Recommendations for increasing replicability in psychology. European Journal of Personality, 27(2), 108-119. doi:10.1002/per.1919
Frank, M. C., & Saxe, R. (2012). Teaching Replication. Perspectives on Psychological Science, 7(6), 600-604. doi:10.1177/1745691612460686
Grahe, J. E., Reifman, A., Hermann, A. D., Walker, M., Oleson, K. C., Nario-Redmond, M., & Wiebe, R. P. (2012). Harnessing the Undiscovered Resource of Student Research Projects. Perspectives on Psychological Science, 7(6), 605-607. doi:10.1177/1745691612459057