Replication Studies

Special Education Research Accelerator (SERA)
Alexis Prijoles

Supported by the Institute of Education Sciences, we are working to develop a platform and resources that will allow researchers in special education to conduct crowdsourced data collection and replication efforts across diverse settings and populations. Together with the UVA School of Data Science, we are addressing challenges in conducting replication studies in field settings, including assessing intervention fidelity and replicability across sites and studies, and collecting high-quality data across multiple teams of researchers. Bryan Cook (Principal Investigator), Bill Therrien (Co-Principal Investigator), Vivian Wong (Co-Principal Investigator)

TeachSim for Improving the Preparation of New Teachers

Through support from the Jefferson Trust Foundation and the Bankard Foundation, the Collaboratory worked with TeachSim to conduct a series of six systematic replication studies assessing the replicability of effects of an instructional coaching protocol on teacher candidates’ pedagogical skills. The Collaboratory is currently expanding its replication efforts across multiple sites and content areas with support from the Robertson Foundation and the National Science Foundation. Julie Cohen (Principal Investigator), Vivian Wong (Co-Principal Investigator), Nathan Jones (Co-Principal Investigator)

Iterative Systematic Replication of Read Well in First Grade (IS2RW)

Supported by the Institute of Education Sciences, the Collaboratory is working with experts in reading and special education to implement an iterative systematic replication evaluation of Read Well (RW1). The design involves a series of planned, systematic conceptual replications that examine the effectiveness of RW1 for first graders with reading difficulties and disabilities, and the replicability of these effects over multiple, controlled sources of variation. The first series of replications employs a multi-site randomized controlled trial (RCT) to examine the replicability of effects across three sites with systematic differences in participant and setting characteristics, implementing RW1 under “ideal” controlled conditions (with researchers delivering the curriculum in small-group, Tier-2 settings). The second series of studies examines the replicability of effects across different implementation approaches, with the research design depending on findings from the first set of replication studies and guidance from an external advisory board. This innovative approach to systematic replication will provide foundational evidence for identifying “what works” in RW1, “for whom,” and under “what conditions.” Emily Solari (Principal Investigator), Vivian Wong (Co-Principal Investigator), Doris Luft Baker (Co-Principal Investigator), Catherine Richards-Tutor (Co-Principal Investigator)

Design Replication Studies for Evaluating Non-Experimental Methods

In design replication studies, treatment effects from a randomized experiment are compared to those obtained from a non-experimental approach that shares the same target population, outcome, and intervention. The non-experiment may be a regression-discontinuity design, an interrupted time series design, or a matching design. The goals of design replication are to determine whether the non-experiment can replicate results from a high-quality randomized experiment (which provides the causal benchmark estimate), and to identify the contexts and conditions under which these methods do or do not work in practice. Supported by the National Science Foundation, the team introduced the assumptions required for design replication studies to yield valid and interpretable results, as well as analytic methods for assessing correspondence in replication results. In this project, we derived design replication as a formal research design and discussed analysis methods for assessing the replication of results. The team is currently conducting a meta-analysis of all design replication results to evaluate non-experimental method performance across multiple study settings, contexts, and conditions. Vivian Wong (Principal Investigator), Peter Steiner (Co-Principal Investigator)
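The benchmark-comparison logic of a design replication study can be illustrated with a minimal simulation. This is a hypothetical sketch, not the project's actual analysis: it assumes a single observed confounder, a simple selection-on-observables non-experiment (covariate-adjusted regression rather than formal matching), and a made-up true effect of 2.0. The RCT arm yields the benchmark estimate; the non-experimental arm yields a biased naive estimate that covariate adjustment can correct here because the confounder is fully observed.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Hypothetical data-generating process: one confounder x drives both
# selection into treatment (in the non-experimental arm) and the outcome.
x = rng.normal(size=n)
true_effect = 2.0

# RCT arm: random assignment -> unbiased causal benchmark estimate.
t_rct = rng.integers(0, 2, size=n)
y_rct = true_effect * t_rct + x + rng.normal(size=n)
benchmark = y_rct[t_rct == 1].mean() - y_rct[t_rct == 0].mean()

# Non-experimental arm: selection on x -> naive difference is biased.
t_obs = (x + rng.normal(size=n) > 0).astype(int)
y_obs = true_effect * t_obs + x + rng.normal(size=n)
naive = y_obs[t_obs == 1].mean() - y_obs[t_obs == 0].mean()

# Covariate-adjusted non-experimental estimate (OLS on treatment and x);
# adjustment recovers the effect because the confounder is observed.
X = np.column_stack([np.ones(n), t_obs, x])
coef, *_ = np.linalg.lstsq(X, y_obs, rcond=None)
adjusted = coef[1]

# Correspondence check: does the non-experiment replicate the benchmark?
print(f"benchmark={benchmark:.2f}  naive={naive:.2f}  adjusted={adjusted:.2f}")
```

In practice, correspondence between benchmark and non-experimental estimates is assessed with formal criteria (e.g., equivalence-style tests) rather than eyeballing point estimates, but the sketch shows the core comparison a design replication is built around.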
