
Empirical Education Publication Productivity


Empirical Education’s research group, led by Chief Scientist Andrew Jaciw, has been busy publishing articles that address key concerns of educators and researchers.

Our article describing the efficacy trial of the Math in Focus program, accepted by JREE earlier this year, arrived in print at our Palo Alto office a couple of weeks ago. If you subscribe to JREE, it’s the very first article in the current issue (volume 9, number 4). If you don’t subscribe, we have a copy in our lobby for anyone who would like to stop by and check it out.

Another article from the analysis team, “An Empirical Study of Design Parameters for Assessing Differential Impacts for Students in Group Randomized Trials,” has recently been accepted for publication in Evaluation Review and should appear in an issue that will be printed any day now. The paper grows out of our work on many cluster randomized trials and our interest in the differential impacts of programs. We believe that the question of “what works” has limited meaning without systematic exploration of “for whom” and “under what conditions”. The common perception is that these latter concerns are secondary and that our designs have too little power to assess them. We challenge these notions and provide guidelines for addressing these questions.

In another issue of Evaluation Review, we published two companion articles:

Assessing the Accuracy of Generalized Inferences From Comparison Group Studies Using a Within-Study Comparison Approach: The Methodology


Applications of a Within-Study Comparison Approach for Evaluating Bias in Generalized Causal Inferences From Comparison Group Studies

This work further extends our interest in issues of external validity and equips researchers with a strategy for testing the limits of generalizations from randomized trials. Written for a technical audience, it extends an approach commonly used to assess levels of selection bias in estimates from non-experimental studies to the examination of bias in generalized inferences from experiments and non-experiments.

It’s always exciting for our team to share the findings from our experiments, as well as the things we learn during the analysis that can help the evaluation community produce more useful evidence for educators. Much of our work is done in partnership with other organizations, and if you’re interested in partnering with us on this kind of work, please email us.

2016-11-18

Math in Focus Paper Published in JREE

Chief Scientist Andrew Jaciw’s paper entitled Assessing Impacts of Math in Focus, a “Singapore Math” Program, was accepted by the Journal of Research on Educational Effectiveness. The paper reports the results of an RCT conducted in Clark County (Las Vegas, NV) by a team that included Whitney Hegseth, Li Lin, Megan Toby, Denis Newman, Boya Ma, and Jenna Zacamy. From the abstract (available online here):

Twenty-two grade-level teams across twelve schools were randomized to the program or business as usual. Measures included indicators of fidelity to treatment, and student mathematics learning. Impacts on mathematics achievement ranged between .11 and .15 standard deviation units, with no differential impact based on level of pretest [or] minority status.

2016-03-30