Blog Posts and News Stories

New Article Published on the Processes Involved with Scaling-Up or Abandoning an Innovation

Our study of scaling up an innovation that challenges conventional approaches to research has been published in the Peabody Journal of Education and is now available online at Taylor & Francis.

The article, “School Processes That Can Drive Scaling-Up of an Innovation or Contribute to Its Abandonment,” examines the school-level processes that predict the growth or attrition of a school’s team implementing an innovation. We looked for factors that help explain the school-level success or failure of Reading Apprenticeship, a high school academic literacy framework developed by WestEd’s Strategic Literacy Initiative (SLI). The work was funded by an i3 validation grant for which we served as independent evaluators. SLI had an innovative strategy for scaling up, built around school-based, cross-disciplinary teacher teams, and brought the framework to 274 schools across five states. This strategy follows research literature that views scale-up as increasing local ownership and depth of commitment. In this study, we show that factors work both for and against teachers and schools joining and staying in an innovation. Given the wide variation in teacher uptake, we were able to identify processes present in the initial year that predicted gains and losses of participants.

Clicking on this link will allow you to read the abstract (and the full article if you subscribe to the journal). If you don’t already subscribe but would like to read the article, send us an email and we will share a link that grants you a free download of the article.

2017-10-20

Empirical Education Publication Productivity


Empirical Education’s research group, led by Chief Scientist Andrew Jaciw, has been busy publishing articles that address key concerns of educators and researchers.

Our article describing the efficacy trial of the Math in Focus program, which was accepted by JREE earlier this year, arrived in print at our Palo Alto office a couple of weeks ago. If you subscribe to JREE, it’s the very first article in the current issue (volume 9, number 4). If you don’t subscribe, we have a copy in our lobby for anyone who would like to stop by and check it out.

Another article the analysis team has been working on is called “An Empirical Study of Design Parameters for Assessing Differential Impacts for Students in Group Randomized Trials.” It has recently been accepted for publication in Evaluation Review, in an issue that should be in print any day now. The paper grows out of our work on many cluster randomized trials and our interest in the differential impacts of programs. We believe that the question of “what works” has limited meaning without systematic exploration of “for whom” and “under what conditions.” The common perception is that these latter concerns are secondary and that our designs have too little power to assess them. We challenge these notions and provide guidelines for addressing these questions.
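
To give a concrete sense of what assessing a differential impact involves, here is a minimal sketch in Python. It is not taken from the article, and the sample sizes, variable names, and effect sizes are illustrative assumptions only: it simulates a small cluster randomized trial and estimates a treatment-by-subgroup interaction (the differential impact) with a mixed-effects model.

# Minimal sketch (not from the article): simulate a cluster randomized trial
# and estimate a differential impact as a treatment-by-subgroup interaction.
# All names and numeric values are illustrative assumptions.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

n_clusters, n_per_cluster = 40, 25
clusters = np.repeat(np.arange(n_clusters), n_per_cluster)

# Randomize treatment at the cluster level, as in a group randomized trial
treat = np.repeat(rng.permutation([0, 1] * (n_clusters // 2)), n_per_cluster)
subgroup = rng.binomial(1, 0.5, size=clusters.size)          # hypothetical student subgroup
cluster_effect = rng.normal(0, 0.3, n_clusters)[clusters]    # random cluster intercepts

# Outcome with a 0.15 SD main impact and a 0.05 SD differential impact
y = 0.15 * treat + 0.05 * treat * subgroup + cluster_effect + rng.normal(0, 1, clusters.size)

df = pd.DataFrame({"y": y, "treat": treat, "subgroup": subgroup, "cluster": clusters})

# Random-intercept model; the treat:subgroup coefficient estimates "for whom"
result = smf.mixedlm("y ~ treat * subgroup", df, groups=df["cluster"]).fit()
print(result.summary())

Repeating a simulation like this over many draws is one simple way to gauge how much power a given design has to detect a differential impact of a given size.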

In another issue of Evaluation Review, we published two companion articles:

Assessing the Accuracy of Generalized Inferences From Comparison Group Studies Using a Within-Study Comparison Approach: The Methodology


Applications of a Within-Study Comparison Approach for Evaluating Bias in Generalized Causal Inferences from Comparison Groups Studies

This work further extends our interest in issues of external validity and equips researchers with a strategy for testing the limits of generalizations from randomized trials. Written for a technical audience, the articles extend an approach commonly used to assess selection bias in estimates from non-experimental studies so that it can also be used to examine bias in generalized inferences from experiments and non-experiments.

It’s always exciting for our team to share the findings from our experiments, as well as what we learn during analysis that can help the evaluation community provide more productive evidence for educators. Much of our work is done in partnership with other organizations, and if you’re interested in partnering with us on this kind of work, please email us.

2016-11-18

Math in Focus Paper Published in JREE

Chief Scientist Andrew Jaciw’s paper, entitled “Assessing Impacts of Math in Focus, a ‘Singapore Math’ Program,” was accepted by the Journal of Research on Educational Effectiveness. The paper reports the results of an RCT conducted in Clark County (Las Vegas, NV) by a team that included Whitney Hegseth, Li Lin, Megan Toby, Denis Newman, Boya Ma, and Jenna Zacamy. From the abstract (available online here):

Twenty-two grade-level teams across twelve schools were randomized to the program or business as usual. Measures included indicators of fidelity to treatment, and student mathematics learning. Impacts on mathematics achievement ranged between .11 and .15 standard deviation units, with no differential impact based on level of pretest [or] minority status.

2016-03-30