Blog Posts and News Stories

EIR 2023 Proposals Have Been Reviewed and Awards Granted

While children everywhere are excited about winter break and presents in their stockings, some of us in the education space look forward to December for other reasons. That’s right, the Department of Education just announced the EIR grant winners from the summer 2023 proposal submissions. We want to congratulate all our friends who were among the winners.

One of those winning teams was made up of The MLK Sr Community Resources Center, Connect with Kids Network, Morehouse and Spelman Colleges, New York City Public Schools, The Urban Assembly, Atlanta Public Schools, and Empirical Education. With the early-phase EIR development grant funding, we will evaluate the Sankofa Chronicles: SEL Curriculum from American Diasporas.

The word sankofa comes from the Twi language spoken by the Akan people of Ghana. The word is often associated with an Akan proverb, “Se wo were fi na wosankofa a yenkyi.” Translated into English, this proverb reminds us, “It is not wrong to go back for that which you have forgotten.” Guided by the philosophy of sankofa, this five-year grant will support the creation of a culturally responsive, multimedia, social emotional learning (SEL) curriculum for high school students.

Participating students will be introduced to SEL concepts through short films that tell emotional and compelling stories of diverse diasporas within students’ local communities. These stories will be paired with an SEL curriculum that seeks to foster not only SEL skills (e.g., self-awareness, responsible decision making) but also empathy, cultural appreciation, and critical thinking.

Our part in the project will begin with a randomized control trial (RCT) of the curriculum in the 2025–2026 school year and culminate in an impact report following the RCT. We will continue to support the program through the remainder of the five-year grant with an implementation study and a focus on scaling up the program.

Check back for updates on this exciting project!

2023-12-07

Rock Island-Milan School District, Connect with Kids, and Empirical Education Win an EIR Grant

The Rock Island-Milan School District #41 (RIMSD), in partnership with CWK Network, Inc. (Connect with Kids) and Empirical Education, just announced that they have been awarded an EIR grant to develop and evaluate a program called How Are The Children? (HATC). This project-based social emotional curriculum is designed to foster students’ social emotional competence, increase student engagement, and ameliorate the long-term social emotional impacts of the COVID-19 pandemic.

We will be conducting an independent evaluation of the effectiveness of HATC through a randomized control trial and formative evaluation. Our findings will inform the improvement of the program and foster the expansion of the curriculum into other schools and districts.

For more details on this grant and the project, see the press announcement and our EIR grant proposal.

2023-03-14

We Won a SEED Grant in 2022 with Georgia State University

Empirical Education began serving as a program evaluator of the teacher residency program, Collaboration and Reflection to Enhance Atlanta Teacher Effectiveness (CREATE), in 2015 under a subcontract with Atlanta Neighborhood Charter Schools (ANCS) as part of their Investing in Innovation (i3) Development grant. In 2018, we extended this work with CREATE and Georgia State University through the U.S. Department of Education’s Supporting Effective Educator Development (SEED) Grant Program. In 2020, we were awarded additional SEED grants to further extend our work with CREATE.

In October 2022, we were notified that this important work will receive continued funding through SEED. CREATE has proposed the following goals for this continued funding.

  • Goal 1: Recruit, support, and retain compassionate, skilled, anti-racist educators via residency
  • Goal 2: Design and enact transformative learning opportunities for experienced educators, teacher educators, and local stakeholders
  • Goal 3: Sustain effective and financially viable models for educator recruitment, support, and retention
  • Goal 4: Ensure all research efforts are designed to benefit partner organizations

Empirical remains deeply committed to designing and executing a rigorous and independent evaluation that will inform partner organizations, local stakeholders, and a national audience of the potential impact and replicability of a multifaceted program that centers equity and wellness for educators and students. With this new grant, we are also committed to integrating more mixed-methods approaches to better align our evaluation with CREATE’s antiracist mission, and to contributing to recent conversations about what it means to conduct educational effectiveness work with an equity and social justice orientation.

Using a quasi-experimental design and mixed-methods process evaluation, we aim to understand the impact of CREATE on teachers’ equitable and effective classroom practices, student achievement, and teacher retention. We will also explore key mediating impacts, such as teacher well-being and self-compassion, and conduct cost-effectiveness and cost-benefit analyses. Importantly, we want to explore the costs and benefits CREATE offers to local stakeholders, centering this work in the Atlanta community. This funding allows us to extend our evaluation through CREATE’s 10th cohort of residents, and to continue exploring the impact of CREATE on Cooperating Teachers and experienced educators in Atlanta Public Schools.
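For readers unfamiliar with the distinction between those last two analyses: cost-effectiveness expresses cost per unit of outcome, while cost-benefit monetizes the outcome and nets it against cost. A minimal sketch in Python, in which every name and number is an invented placeholder rather than a figure from the CREATE evaluation:

    # Minimal sketch of the two analyses named above; every number is an
    # invented placeholder, not a figure from the CREATE evaluation.
    cost_per_teacher = 4000.0     # hypothetical program cost per teacher served
    retention_gain = 0.25         # hypothetical gain in retention probability
    replacement_cost = 20000.0    # hypothetical district cost to replace a teacher

    # Cost-effectiveness: cost per unit of outcome (here, per teacher retained).
    cost_per_retained_teacher = cost_per_teacher / retention_gain

    # Cost-benefit: monetize the outcome and compare against the cost.
    net_benefit = retention_gain * replacement_cost - cost_per_teacher

    print(f"Cost per additional retained teacher: ${cost_per_retained_teacher:,.0f}")
    print(f"Net benefit per teacher served: ${net_benefit:,.0f}")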

2023-02-06

We Won Two SEED Grants in 2020

Empirical Education began conducting the evaluation of Collaboration and Reflection to Enhance Atlanta Teacher Effectiveness (CREATE) in 2015 under a subcontract with Atlanta Neighborhood Charter Schools (ANCS) as part of their Investing in Innovation (i3) Development grant. Then, in 2018, we extended this work with CREATE and Georgia State University through the Supporting Effective Educator Development (SEED) Grant Program. And now, in 2020, we were just notified that BOTH proposals we submitted to the SEED competition to further extend our work with CREATE were awarded grants!

One of the SEED grants is an extension to the one we received in 2018 that will allow us to continue the project for two additional years (through years 4 and 5).

The other SEED award will fund new work with CREATE and Georgia State University, adding additional cohorts of CREATE residents and conducting a quasi-experiment measuring the effectiveness of CREATE for Cooperating Teachers (that is, the mentor teachers in whose classrooms residents are placed). The study will examine impacts on teacher effectiveness, teacher retention, and student achievement, as well as other mediating outcomes.

2020-10-28

Empirical to Evaluate Two More Winning i3 Projects

The U.S. Department of Education has announced the highest-rated applicants for the 2014 Investing in Innovation (i3) competition. Of the 434 submissions received by ED, we are excited that both of the proposals for which we developed evaluation plans were among the 26 winners. Both proposals were THE highest rated in their respective categories!

In one, we’ll be partnering with WestEd to evaluate the Making Sense of Science and Literacy program. Written as a validation proposal, this 5-year project aims to strengthen teachers’ content knowledge, transform classroom practices, and boost student achievement.

The other highest-rated application was a development proposal submitted by the Atlanta Neighborhood Charter Schools. In this 5-year project, we will assess the impact of its 3-year residency model on the effectiveness of early-career teachers.

Both projects were bid under the “priority” for teacher effectiveness. We have a long-standing partnership with WestEd on i3 evaluations and Regional Lab projects. This is our first project with Atlanta Neighborhood Charter Schools, and it builds on our educator effectiveness work and our ongoing partnerships with charter schools, including our evaluation of an i3 Development effort by Aspire Public Schools.

For more information on our evaluation services and our work on i3 projects, please visit our i3 page and/or contact us.

2014-11-07

Empirical Education Presents Initial Results from i3 Validation Grant Evaluation

Our director of research operations, Jenna Zacamy, joined Cheri Fancsali from IMPAQ International and Cyndy Greenleaf from the Strategic Literacy Initiative (SLI) at WestEd at the Literacy Research Association (LRA) conference in Dallas, TX on December 4. Together, they conducted a symposium, which was the first formal presentation of findings from the Investing in Innovation (i3) Validation grant, Reading Apprenticeship Improving Secondary Education (RAISE). WestEd won the grant in 2010, with Empirical Education and IMPAQ serving as the evaluators.

There are two ongoing evaluations: the first includes a randomized control trial (RCT) of over 40 schools in California and Pennsylvania investigating the impact of Reading Apprenticeship on teacher instructional practices and student achievement; the second is a formative evaluation spanning four states and 150+ schools investigating how the school systems build capacity to implement and disseminate Reading Apprenticeship and sustain these efforts. The symposium’s discussant, P. David Pearson (UC Berkeley), praised the design and effort of both studies, stating that he has “never seen such thoughtful and extensive evaluations.”

Preliminary findings from the RCT show that Reading Apprenticeship teachers provide students more opportunities to practice metacognitive strategies and foster and support more student collaboration. Findings from the second year of the formative evaluation suggest high levels of buy-in and commitment from both school administrators and teachers, but also identify competing initiatives and priorities as a primary challenge to sustainability. Initial findings of our five-year, multi-state study of RAISE are promising, but reflect the real-world complexity of scaling up and evaluating literacy initiatives across several contexts. Final results from both studies will be available in 2015.

View the information presented at LRA here and here.

2013-12-19

Empirical Starts on a 3rd Investing in Innovation (i3) Evaluation

This week was the kickoff meeting in Oakland, CA for a multi-year evaluation of WestEd’s iRAISE project, a grant to develop an online training system for their Reading Apprenticeship framework. iRAISE stands for Internet-based Reading Apprenticeship Improving Science Education. Being developed by WestEd’s Strategic Literacy Initiative (SLI), a prominent R&D group in this domain, iRAISE will provide a 65-hour online version of their conventional face-to-face professional development for high school science teachers. We are also contracted for the evaluation of the validation-level i3 grant to WestEd for scaling up Reading Apprenticeship, a project that received the third-highest score in that year’s i3 competition. Additionally, Empirical is conducting the evaluation of Aspire Public Schools’ development grant, awarded in 2011; in that case we are evaluating their teacher effectiveness technology tools.

Further information on our capabilities working with i3 grants is located here.

2013-03-22

Importance is Important for Rules of Evidence Proposed for ED Grant Programs

The U.S. Department of Education recently proposed new rules for including serious evaluations as part of its grant programs. The approach is modeled on how evaluations are used in the Investing in Innovation (i3) program, where the proposal must show there is some evidence that the proposed innovation has a chance of working and scaling, and must include an evaluation that will add to a growing body of evidence about the innovation. We like this approach because it treats previous research as a hypothesis that the innovation may work in the new context. Each new grant is an opportunity to try the innovation in a new context, with improved approaches that warrant another check on effectiveness. But the proposed rules had some weaknesses, which were pointed out in the public comments available online. We hope ED heeds these suggestions.

Mark Schneiderman, representing the Software and Information Industry Association (SIIA), recommends that outcomes used in effectiveness studies should not be limited to achievement scores.

SIIA notes that grant program resources could appropriately address a range of purposes from instructional to administrative, from assessment to professional development, and from data warehousing to systems productivity. The measures could therefore include such outcomes as student test scores, teacher retention rates, changes in classroom practice or efficiency, availability and use of data or other student/teacher/school outcomes, and cost effectiveness and efficiency that can be observed and measured. Many of these outcome measures can also be viewed as intermediate outcomes—changes in practice that, as demonstrated by other research, are likely to affect other final outcomes.

He also points out that quality of implementation and the nature of the comparison group can be the deciding factors in whether or not a program is found to be effective.

SIIA notes that in education there is seldom a pure control condition such as can be achieved in a medical trial with a placebo or sugar pill. Evaluations of education products and services resemble comparative effectiveness trials in which a new medication is tested against a currently approved one to determine whether it is significantly better. The same product may therefore prove effective in one district that currently has a weak program but relatively less effective in another where a strong program is in place. As a result, significant effects can often be difficult to discern.

This point gets to the heart of the contextual issues in any experimental evaluation. Without understanding the local conditions of the experiment, the size of the impact in any other context cannot be anticipated. Some experimentalists would argue that a massive multi-site trial would allow averaging across many contextual variations. But such “on average” results won’t necessarily help the decision-maker working in specific local conditions. Thus, taking previous results as a rough indication that an innovation is worth trying is the first step before conducting the grant-funded evaluation of a new variation of the innovation under new conditions.
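As a rough sketch of that argument, the toy Python example below (site names and effect sizes entirely invented) shows how a positive average can mask sites where an innovation adds nothing over a strong incumbent program:

    # Hypothetical site-level impacts in standard deviation (SD) units.
    # Sites with weak incumbent programs show large gains; sites that
    # already run strong programs show little or none.
    site_effects = {
        "site_A_weak_incumbent": 0.30,
        "site_B_weak_incumbent": 0.25,
        "site_C_strong_incumbent": 0.00,
        "site_D_strong_incumbent": -0.05,
    }

    average = sum(site_effects.values()) / len(site_effects)
    print(f"Average impact across sites: {average:+.2f} SD")  # +0.12 SD
    for site, effect in site_effects.items():
        print(f"  {site}: {effect:+.2f} SD")

A decision-maker at a “strong incumbent” site gains little from the +0.12 SD average; the site-level context is what matters.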

Jon Baron, writing for the Coalition for Evidence-Based Policy, expresses a fundamental concern about what counts as evidence. Jon, who is a former Chair of the National Board for Education Sciences and has been a prominent advocate for basing policy on rigorous research, suggests that

“the definition of ‘strong evidence of effectiveness’ in §77.1 incorporate the Investing in Innovation Fund’s (i3) requirement for effects that are ‘substantial and important’ and not just statistically significant.”

He cites examples where researchers have reported statistically significant results, which were based on trivial outcomes or had impacts so small as to have no practical value. Including “substantial and important” as additional criteria also captures the SIIA’s point that it is not sufficient to consider the internal validity of the study—policy makers must consider whether the measure used is an important one or whether the treatment-control contrast allows for detecting a substantial impact.

Addressing the substance and importance of the results gets us appropriately into questions of external validity and leads us to questions about subgroup impact, where, for example, an innovation has a positive impact “on average” and works well for high-scoring students but provides no value for low-scoring students. We would argue that a positive average impact is not the most important part of the picture if the end result is an increase in a policy-relevant achievement gap. Should ED be providing grants for innovations where there has been a substantial indication that a gap is worsened? Probably yes, but only if the proposed development is aimed at fixing the malfunctioning innovation and if the program evaluation can address this differential impact.
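A small worked example, with invented numbers, makes the arithmetic of that concern concrete: a respectable average impact can coexist with a widening gap.

    # Hypothetical subgroup impacts in SD units; both numbers are invented.
    effect_high_scorers = 0.20   # innovation helps high-scoring students
    effect_low_scorers = 0.00    # ...and does nothing for low-scoring students
    share_high = 0.5             # assume the two groups are equally sized

    average_impact = (share_high * effect_high_scorers
                      + (1 - share_high) * effect_low_scorers)
    gap_change = effect_high_scorers - effect_low_scorers

    print(f"Average impact: {average_impact:+.2f} SD")      # +0.10, looks fine
    print(f"Achievement gap change: {gap_change:+.2f} SD")  # gap widens by 0.20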

2013-03-17

Study of Alabama STEM Initiative Finds Positive Impacts

On February 21, 2012, the U.S. Department of Education released the final report of an experiment that Empirical Education has been working on for the last six years. The report, titled Evaluation of the Effectiveness of the Alabama Math, Science, and Technology Initiative (AMSTI), is now available on the Institute of Education Sciences website. The Alabama State Department of Education held a press conference to announce the findings, attended by Superintendent of Education Bice and AMSTI staff, along with educators, students, and the study’s co-principal investigator, Denis Newman, CEO of Empirical Education.

AMSTI was developed by the state of Alabama and introduced in 2002 with the goal of improving mathematics and science achievement in the state’s K-12 schools. Empirical Education was primarily responsible for conducting the study, including the design, data collection, analysis, and reporting, under its subcontract with the Regional Education Lab, Southeast (the study was initiated through a research grant to Empirical). Researchers from the Academy for Educational Development, Abt Associates, and ANALYTICA made important contributions to design, analysis, and data collection.

The findings show that after one year, students in the 41 AMSTI schools experienced an impact on mathematics achievement equivalent to 28 days of additional student progress over students receiving conventional mathematics instruction. After one year, the study found no difference in science achievement. It also found that AMSTI had an impact on teachers’ active learning classroom practices in math and science that, according to the theory of action posited by AMSTI, should have an impact on achievement. Further exploratory analysis found effects on student achievement in both mathematics and science after two years. The study also explored reading achievement, where it found significant differences between the AMSTI and control groups after one year. Exploration of differential effects across student demographic categories found consistent results for gender, socio-economic status, and pretest achievement level in math and science. For reading, however, the breakdown by student ethnicity suggests a differential benefit.
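The “days of additional progress” metric is a translation of a standardized effect size into instructional time. A hedged sketch of the conversion in Python, with illustrative inputs chosen only to show the arithmetic (not the report’s actual effect size or growth norms):

    # Convert a standardized effect size into "days of additional progress".
    # All inputs are illustrative assumptions, not figures from the report.
    effect_size = 0.05        # impact in student SD units (hypothetical)
    annual_growth = 0.32      # typical one-year growth in SD units (hypothetical)
    school_year_days = 180    # instructional days in a school year

    extra_days = effect_size / annual_growth * school_year_days
    print(f"Roughly {extra_days:.0f} extra days of progress")  # ~28 days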

Just about everybody at Empirical worked on this project at one point or another. Besides the three of us (Newman, Jaciw and Zacamy) who are listed among the authors, we want to acknowledge past and current employees whose efforts made the project possible: Jessica Cabalo, Ruthie Chang, Zach Chin, Huan Cung, Dan Ho, Akiko Lipton, Boya Ma, Robin Means, Gloria Miller, Bob Smith, Laurel Sterling, Qingfeng Zhao, Xiaohui Zheng, and Margit Zsolnay.

With the solid cooperation of the state’s Department of Education and the AMSTI team, approximately 780 teachers and 30,000 upper-elementary and middle school students in 82 schools from five regions in Alabama participated in the study. The schools were randomized into one of two categories: 1) those that received AMSTI starting the first year, or 2) those that received “business as usual” the first year and began participation in AMSTI the second year. With only a one-year delay before the control group entered treatment, the two-year impact was estimated using statistical techniques developed by, and with the assistance of, our colleagues at Abt Associates. The Academy for Educational Development assisted with data collection and analysis of training and program implementation.

Findings of the AMSTI study will also be presented at the Society for Research on Educational Effectiveness (SREE) Spring Conference taking place in Washington, D.C. from March 8-10, 2012. Join Denis Newman, Andrew Jaciw, and Boya Ma on Friday, March 9, 2012 from 3:00pm-4:30pm, when they will present findings of their study titled, “Locating Differential Effectiveness of a STEM Initiative through Exploration of Moderators.” A symposium on the study, including the major study collaborators, will be presented at the annual conference of the American Educational Research Association (AERA) on April 15, 2012 from 2:15pm-3:45pm at the Marriott Pinnacle / Pinnacle III in Vancouver, Canada. This session will be chaired by Ludy van Broekhuizen (director of REL-SE) and will include presentations by Steve Ricks (director of AMSTI); Jean Scott (SERVE Center at UNCG); Denis Newman, Andrew Jaciw, Boya Ma, and Jenna Zacamy (Empirical Education); Steve Bell (Abt Associates); and Laura Gould (formerly of AED). Sean Reardon (Stanford) will serve as the discussant. A synopsis of the study will also be included in the Common Guidelines for Education Research and Development.

2012-02-21

Empirical is participating in recently awarded five-year REL contracts

The Institute of Education Sciences (IES) at the U.S. Department of Education recently announced the recipients of five-year contracts for each of the 10 Regional Education Laboratories (RELs). We are excited to be part of four strong teams of practitioners and researchers that received the awards.

The original request for proposals in May 2011 called for the new RELs to work closely with alliances of state and local education agencies and other practitioner organizations to build local capacity for research. Considering the close ties between this agenda and Empirical’s core mission, we joined the proposal efforts and are now part of winning teams in the West (led by WestEd), Northwest (led by Education Northwest), Midwest (led by the American Institutes for Research (AIR)), and Southwest (led by SEDL). The REL Southwest is currently under a stop-work order while ED addresses a dispute concerning its review process. Empirical Education’s history in conducting Randomized Control Trials (RCTs) and in providing technical assistance to education agencies provides a strong foundation for the next five years.

2012-02-16