Blog Posts and News Stories

Sure, the edtech product is proven to work, but will it work in my district?

It’s a scenario that’s not uncommon in district administrators’ offices. They’ve received sales pitches and demos for a slew of new education technology (edtech) products, each accompanied by “evidence” of its general benefits for teachers and students. But underlying the administrator’s decision is a question that often goes unanswered: Will this work in our district?

In the conventional approach to research advocated, for example, by the U.S. Department of Education and the Every Student Succeeds Act (ESSA), the finding that is reported and used in product reviews is the overall average impact across any and all subgroups of students, teachers, or schools in the study sample. In our own research, we have repeatedly seen that whom a product works for, and under what conditions, can be more important than its average impact. There are products that are effective on average but don’t work for an important subgroup of students, and, conversely, products that work for some students despite an unimpressive average. Some examples:

  • A math product, while found to be effective overall, was effective for white students but ineffective for minority students. This difference would be relevant to any district wanting to close (rather than further widen) an achievement gap.
  • A product that did well on average performed very well in elementary grades but poorly in middle school. This has obvious relevance for a district, as well as for the provider, which may want to adjust its marketing target.
  • A teacher PD product greatly benefited uncertified teachers but didn’t help the veteran teachers do any better than their peers using the conventional textbook. This product may be useful for new teachers but a poor choice for others.

As a research organization, we have been looking at ways to answer these kinds of questions about products efficiently. Especially now, with the evidence requirements built into ESSA, school leaders can ask the edtech salesperson: “Does your product have the evidence that ESSA calls for?” They may well hear an affirmative answer supported by an executive summary of a recent study. But there’s a fundamental problem with what ESSA is asking for: ESSA doesn’t ask for evidence that the product is likely to work in your specific district. This is not the fault of ESSA’s drafters. The problem is built into the conventional design of research on “what works”. The U.S. Department of Education’s What Works Clearinghouse (WWC) bases its evidence rating only on an average; if there are different results for different subgroups of students, that difference is not part of the rating. Since ESSA adopts the WWC approach, that’s the law of the land. Hence, your district’s most pressing question is left unanswered: Will this work for a district like mine?
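
To make this concrete, here is a minimal sketch, with invented effect sizes and group sizes (not data from any real study), showing how a healthy average impact can coexist with a subgroup for which a product fails:

```python
# Hypothetical illustration only: the effect sizes and group sizes below
# are invented to show how an average can mask subgroup differences.

subgroups = {
    "elementary": {"effect": 0.30, "n": 800},    # effect size in SD units
    "middle school": {"effect": -0.10, "n": 200},
}

total_n = sum(g["n"] for g in subgroups.values())
average = sum(g["effect"] * g["n"] for g in subgroups.values()) / total_n

print(f"Average impact: {average:+.2f} SD")  # +0.22 SD: looks effective overall
for name, g in subgroups.items():
    print(f"  {name}: {g['effect']:+.2f} SD (n={g['n']})")
# A district that is mostly middle schools would be misled by the average alone.
```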

Recently, the Software & Information Industry Association (SIIA), the primary trade association of the software industry, released a set of research guidelines explaining to its member companies the importance of working with districts to conduct research that will meet the ESSA standards. As the lead author of this report, I can say that our goal was to foster an improved dialog between schools and providers about the evidence that should be available to support buying these products. As an addendum to the guidelines, here are three suggestions aimed at arming educators with ways to look at the evidence and questions to ask the edtech salesperson:

  1. It is better to have some information than no information. The fact that research found the product worked somewhere gives you a working hypothesis that it could be a better-than-average bet to try out in your district. In this respect, you can treat the WWC and newer sites such as Evidence for ESSA as screening tools—they will point you to valid studies of the product you’re interested in. But you should treat previous research as a working hypothesis rather than proof.
  2. Look at where the research evidence was collected. You’ll want to know whether the research sites and populations in the study resemble your local conditions. The WWC has gone to considerable effort to code the research by the population in the study and provides a search tool so you can find studies conducted in districts like yours. And if you download and read the original report, it may tell you whether the product is likely to reduce or widen an achievement gap of concern.
  3. Make a deal with the salesperson. In exchange for your help in organizing a pilot and allowing them to analyze your data, you get the product for a year at a steep discount and a good ongoing price if you decide to implement the product at full scale. While you’re unlikely to get results from a pilot (e.g., based on spring testing) in time to support a decision, you can at least lower your cost for the materials, and you’ll help provide a neighboring district (with similar populations and conditions) with useful evidence to support a strong working hypothesis as to whether it is likely to work for them as well.
2017-10-15

National Forum to Advance Rural Education 2017


We are participating in two discussions at the National Forum to Advance Rural Education, organized by Battelle for Kids, on Thursday, October 12, 2017.

THURSDAY, OCTOBER 12 | 1:15–2:15pm
Quality Teachers in Rural Schools: Lessons Learned in Oklahoma
Join a discussion with the Regional Educational Laboratory Southwest (REL Southwest) and practitioners in the Oklahoma Rural Schools Research Alliance about their research focused on two areas of high need in rural schools: teacher recruitment and retention, and professional development. This informal discussion with the researchers and Oklahoma practitioners will focus on how you can use the information from these studies in your own state and school district.
Presenters:
Pia Peltola (REL Southwest, American Institutes for Research)
Susan Pinson (Oklahoma State Department of Education)
Kathren Stehno (Office of Educational Quality & Accountability)
Megan Toby (Empirical Education)
Haidee Williams (REL Southwest, American Institutes for Research)

Rosa Ailbouni Room, Third Floor

THURSDAY, OCTOBER 12 | 2:30–3:00pm
Recruiting and Retaining Quality Teachers in Oklahoma
Learn about research conducted in partnership with the Regional Educational Laboratory Southwest (REL Southwest) and practitioners in the Oklahoma Rural Schools Research Alliance. The research identified teacher, district, and community characteristics that are predictors of successful teacher recruitment and retention in rural Oklahoma, which can inform future policy and practice. Join the researchers and alliance members who guided the research and discover how you can use the information in your school district.
Presenters:
Kathren Stehno (Office of Educational Quality & Accountability)
Megan Toby (Empirical Education)
Haidee Williams (REL Southwest, American Institutes for Research)

Great Hall Meeting Room 2, First Floor


2017-10-04

Join Our Webinar: Measuring Ed Tech Impact in the ESSA Era

Tuesday, November 7, 2017 | 2:00–3:00pm PT

Our CEO, Denis Newman, will be collaborating with Andrew Coulson (Chief Strategist, MIND Research Institute) and Bridget Foster (Senior VP and Managing Director, SIIA) to bring you an informative webinar next month!

This free webinar (co-hosted by edWeb.net and MCH Strategic Data) will introduce you to a new approach to evidence about which edtech products really work in K-12 schools. ESSA has changed the game when it comes to what counts as evidence. This webinar builds on the Education Technology Industry Network’s (ETIN) recent publication of the Guidelines for EdTech Impact Research, which explains the new ground rules.

The presentation will explore how we can improve the conversation between edtech developers and vendors (providers), and the school district decision makers who are buying and/or piloting the products (buyers). ESSA has provided a more user-friendly definition of evidence, which facilitates the conversation.

  • Many buyers are asking providers if there’s reason to think their product is likely to work in a district like theirs.
  • For providers, the new ESSA rules let them start with simple studies to show that their product is promising, without having to invest in expensive trials to prove it will work everywhere.

The presentation brings together two experts: Andrew Coulson, a developer who has conducted research on his company’s products and is concerned with improving the efficacy of edtech, and Denis Newman, a researcher who is the lead author of the ETIN Guidelines. The presentation will be moderated by Bridget Foster, a long-time educator who now directs the ETIN at SIIA. This edWebinar will be of interest to edtech developers, school and district administrators, education policy makers, association leaders, and any educator interested in the evidence of efficacy in edtech.

If you would like to attend, click here to register.

2017-09-28

We Are Participating in the Upcoming REL Webinar on Teacher Mobility

Join Regional Educational Laboratories Midwest and Southwest for a free webinar on October 4 to learn how states can address teacher demand and mobility trends. As a partner in REL Southwest, we will be reporting on our work on teacher recruitment and retention in rural Oklahoma.

Teachers and administrators change schools for a variety of reasons. Mobility can be positive if an educator moves to a position that is a better fit, but it can also have serious implications for states. Mobility may harm schools that serve high-need populations, and it can also create additional recruitment and hiring costs for districts.

This webinar focuses on research addressing the teacher pipeline and the mobility of teachers between schools and districts. Presenters will discuss two published REL Midwest research studies on teacher mobility trends and strategies for estimating teacher supply and demand. Following each presentation, leaders from the Oklahoma State Department of Education and the Minnesota Department of Education will respond to the presentations and share state initiatives to meet teacher staffing needs. Presenters will also briefly highlight two upcoming REL Southwest studies related to teacher supply and demand that are expected to be released later this year.

The studies that will be discussed are:
- An examination of the movement of educators within and across three Midwest Region states (REL Midwest, AIR)
- Strategies for estimating teacher supply and demand using student and teacher data (REL Midwest, AIR)
- Indicators of successful teacher recruitment and retention in Oklahoma rural schools (REL Southwest, Empirical Education)
- Teacher mobility in Texas: Trends and associations with student, teacher, and school characteristics (REL Southwest, AIR, Empirical Education)

This webinar is designed for state education staff, administrators in schools and districts with significant American Indian populations, American Indian community leaders, research alliance and community of practice members, and education researchers. If you cannot attend the live event, register at the link below to be notified when a recording of the webinar is available online.

Exploring Educator Movement Between Districts
October 4, 2017
10:00–11:30 a.m. PT

The Regional Educational Laboratories (RELs) build the capacity of educators to use data and research to improve student outcomes. Each REL responds to needs identified in its region and makes learning opportunities and other resources available to educators throughout the United States. The REL program is a part of the Institute of Education Sciences (IES) in the U.S. Department of Education. To receive regular updates on REL work, including events and reports, follow IES on Facebook and Twitter.

You can register for this event on the REL website.

2017-09-21

Partnering with SRI and CAST on an RCT

Empirical Education and CAST are excited to announce a new partnership under an Investing in Innovation (i3) grant.

We’ll evaluate the Enhanced Units program, which originated as a development proposal written by SRI and CAST. The project aims to integrate content enhancement routines with learning and collaboration strategies: enhancements intended to improve student content learning, higher-order reasoning, and collaboration.

We will conduct the experiment in up to three school districts in California and Virginia—working with high school science and social studies teachers and their students. This is our first project with CAST, and it builds on our extensive experience conducting large-scale, rigorous, experimental impact studies, as well as formative and process evaluations.

For more information on our evaluation services and our work on i3 projects, please visit our i3/EIR page or contact us.

2017-07-27

ETIN Releases Guidelines for Research on Educational Technologies in K-12 Schools

The press release (below) was originally published on the SIIA website. It has since inspired stories in the Huffington Post, edscoop, and EdWeek’s Market Brief.



ETIN Releases Guidelines for Research on Educational Technologies in K-12 Schools

Changes in education technology and policy spur updated approach to industry research

Washington, DC (July 25, 2017) – The Education Technology Industry Network, a division of The Software & Information Industry Association, released an important new report today: “Guidelines for Conducting and Reporting EdTech Impact Research in U.S. K-12 Schools.” Authored by Dr. Denis Newman and the research team at Empirical Education Inc., the Guidelines provide 16 best practice standards of research for publishers and developers of educational technologies.

The Guidelines are a response to changing research methods and policies, driven by the accelerating pace of technology development and by the passage of the Every Student Succeeds Act (ESSA), which has challenged the static notion of evidence defined in NCLB. Recognizing the need for consensus among edtech providers, customers in the K-12 school market, and policy makers at all levels, SIIA is making these Guidelines freely available.

“SIIA members recognize that changes in technology and policy have made evidence of impact an increasingly critical differentiator in the marketplace,” said Bridget Foster, senior VP and managing director of ETIN. “The Guidelines show how research can be conducted and reported within a short timeframe and still contribute to continuous product improvement.”

“The Guidelines for research on edtech products is consistent with our approach to efficacy: that evidence of impact can lead to product improvement,” said Amar Kumar, senior vice president of Efficacy & Research at Pearson. “We appreciate ETIN’s leadership and Empirical Education’s efforts in putting together this clear presentation of how to use rigorous and relevant research to drive growth in the market.”

The Guidelines draw on over a decade of experience conducting research in the context of the U.S. Department of Education’s Institute of Education Sciences and its Investing in Innovation program.

“The current technology and policy environment provides an opportunity to transform how research is done,” said Dr. Newman, CEO of Empirical Education Inc. and lead author of the Guidelines. “Our goal in developing the new guidelines was to clarify current requirements in a way that will help edtech companies provide school districts with the evidence they need to consistently quantify the value of software tools. My thanks go to SIIA and the highly esteemed panel of reviewers whose contribution helped us provide the roadmap for the change that is needed.”

“In light of the ESSA evidence standards and the larger movement toward evidence-based reform, publishers and software developers are increasingly being called upon to show evidence that their products make a difference with children,” said Guidelines peer reviewer Dr. Robert Slavin, director of the Center for Research and Reform in Education, Johns Hopkins University. “The ETIN Guidelines provide practical, sensible guidance to those who are ready to meet these demands.”

ETIN’s goal is to improve the market for edtech products by advocating for greater transparency in reporting research findings. For that reason, it is actively working with government, policy organizations, foundations, and universities to gain the needed consensus for change.

“As digital instructional materials flood the market place, state and local leaders need access to evidence-based research regarding the effectiveness of products and services. This guide is a great step in supporting both the public and private sector to help ensure students and teachers have access to the most effective resources for learning,” stated Christine Fox, Deputy Executive Director, SETDA.

The Guidelines can be downloaded here: https://www.empiricaleducation.com/research-guidelines.

2017-07-25

SIIA ETIN EIS Conference Presentations 2017


We are playing a major role in the Education Impact Symposium (EIS), organized by the Education Technology Industry Network (ETIN), a division of The Software & Information Industry Association (SIIA).

  1. ETIN is releasing a set of edtech research guidelines authored this year by our CEO, Denis Newman
  2. Denis is speaking on two panels this year

The edtech research guidelines that Denis authored, which ETIN is releasing on Tuesday, July 25, are called “Guidelines for Conducting and Reporting EdTech Impact Research in U.S. K-12 Schools” and can be downloaded from this webpage. The Guidelines are a much-needed response to a rapidly changing environment of cloud-based technology and to the important policy changes brought about by the Every Student Succeeds Act (ESSA).

The panels Denis will be presenting on are both on Tuesday, July 25, 2017.

12:30 - 1:15pm
ETIN’s New Guidelines for Product Research in the ESSA Era
With the recent release of ETIN’s updated Guidelines for EdTech Impact Research, developers and publishers can ride the wave of change from NCLB’s sluggish concept of “scientifically based” research to ESSA’s dynamic view of “evidence” for continuous improvement. The Guidelines are being made publicly available at the Symposium, with a discussion and Q&A led by the lead author and some of the contributing reviewers.
Moderator:
Myron Cizdyn, Chief Executive Officer, The BLPS Group
Panelists:
Malvika Bhagwat, Research & Efficacy, Newsela
Amar Kumar, Sr. Vice President, Pearson
Denis Newman, CEO, Empirical Education Inc.
John Richards, President, Consulting Services for Education

2:30 - 3:30pm
The Many Faces of Impact
Key stakeholders in the edtech community will each review, TED Talk style, what they are doing to increase the impact of digital products, programs, and services. Our lineup of presenters includes:
- K-12 and HE content providers using impact data to better understand their customers, improve their products, and support their marketing and sales teams
- an investor seeking both impact on disadvantaged populations and a financial return, in order to make funding decisions for portfolio companies
- an education organization helping institutions decide what research is useful to them and how to grapple with new ESSA requirements
- a researcher working with product developers to produce evidence of the impact of their digital products

After the presenters have finished, we’ll have time for your questions on these multidimensional aspects of IMPACT and how technology can help.
Moderator:
Karen Billings, Principal, BillingsConnects
Panelists:
Jennifer Carolan, General Partner, Reach Capital
Christopher Cummings, VP, Institutional Product and Solution Design, Cengage
Melissa Greene, Director, Strategic Partnerships, SETDA
Denis Newman, CEO, Empirical Education Inc.
Kari Stubbs, PhD, Vice President, Learning & Innovation, BrainPOP

Jennifer Carolan, Denis Newman, and Chris Cummings on a panel at ETIN EIS

If you get a chance to check out the Guidelines before EIS, Denis would love to hear your thoughts about them at the conference.

2017-07-21

Determining the Impact of MSS on Science Achievement

Empirical Education is conducting an evaluation of Making Sense of SCIENCE (MSS) under an Investing in Innovation (i3) five-year validation grant awarded in 2014. MSS is a teacher professional learning approach that focuses on science understanding, classroom practice, literacy support, and pedagogical reasoning. The primary purpose of the evaluation is to assess the impact of MSS on teachers’ science content knowledge, on student science achievement, and on students’ attitudes toward science. The evaluation takes place in 66 schools across two geographic regions—Wisconsin and the Central Valley of California. Participating Local Educational Agencies (LEAs) include: Milwaukee Public Schools (WI), Racine Unified School District (WI), Lodi Unified School District (CA), Manteca Unified School District (CA), Turlock Unified School District (CA), Stockton Unified School District (CA), Sylvan Unified School District (CA), and the San Joaquin County Office of Education (CA).

Using a randomized controlled trial (RCT) design, in 2015-16 we randomly assigned the schools (32 in Wisconsin and 34 in California) either to receive the MSS intervention or to continue with business-as-usual district professional learning and science instruction. Professional learning activities and program implementation take place during the 2016-17 and 2017-18 school years, with delayed treatment planned in 2018-19 and 2019-20 for the schools randomized to control.
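
For readers curious about the mechanics, school-level random assignment blocked by region takes only a few lines. This is a generic sketch with placeholder school names and an arbitrary seed, not the procedure actually used in the study:

```python
import random

# Generic sketch of school-level random assignment, blocked by region.
# School names and the seed are placeholders, not the study's actual details.
random.seed(42)

regions = {
    "Wisconsin": [f"WI school {i}" for i in range(1, 33)],   # 32 schools
    "California": [f"CA school {i}" for i in range(1, 35)],  # 34 schools
}

assignment = {}
for region, schools in regions.items():
    shuffled = schools[:]          # copy so the original list is untouched
    random.shuffle(shuffled)
    half = len(shuffled) // 2      # split each region as evenly as possible
    for school in shuffled[:half]:
        assignment[school] = "MSS intervention"
    for school in shuffled[half:]:
        assignment[school] = "business-as-usual control"

n_tx = sum(v == "MSS intervention" for v in assignment.values())
print(f"{n_tx} intervention schools, {len(assignment) - n_tx} control schools")
```

Blocking by region guarantees that each region contributes schools to both conditions, rather than leaving the regional balance to chance.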

Confirmatory impacts on student achievement and teacher content knowledge will be assessed in 2018. Confirmatory research questions include:

What is the impact of MSS at the school level, after two years of full implementation, on science achievement in Earth and physical science among 4th and 5th grade students in intervention schools, compared to 4th and 5th grade students in control schools receiving business-as-usual science instruction?

What is the impact of MSS on science achievement among low-achieving students in intervention elementary schools with two years of exposure to MSS (in grades 4-5), compared to low-achieving students in control elementary schools with business-as-usual instruction for two years (in grades 4-5)?

What is the impact of MSS on teachers’ science content knowledge in Earth and physical science, compared to teachers in the business-as-usual control schools, after two full years of implementation?

Additional exploratory analyses are currently being conducted and will continue through 2018. Exploratory research questions examine the impact of MSS on students’ ability to communicate science ideas in writing, as well as non-academic outcomes, such as confidence and engagement in learning science. We will also explore several teacher-level outcomes, including teachers’ pedagogical science content knowledge and changes in classroom instructional practices. The evaluation also includes measures of fidelity of implementation.

We plan to publish the final results of this study in fall 2019. Please check back to read the research summary and report.

2017-06-19

Determining the Impact of CREATE on Math and ELA Achievement

Empirical Education is conducting the evaluation of Collaboration and Reflection to Enhance Atlanta Teacher Effectiveness (CREATE) under an Investing in Innovation (i3) development grant awarded in 2014. The CREATE evaluation takes place in schools throughout the state of Georgia.

Approximately 40 residents from the Georgia State University (GSU) College of Education (COE) are participating in the CREATE teacher residency program. Using a quasi-experimental design, outcomes for these teachers and their students will be compared to those from a matched comparison group of close to 100 teachers who simultaneously enrolled in GSU COE but did not participate in CREATE. Implementation for cohort 1 started in 2015, and cohort 2 started in 2016. Confirmatory outcomes will be assessed in years 2 and 3 of both cohorts (2017-2019).
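
The published reports will describe the actual matching procedure; purely as an illustration of the general technique, here is one common way to form a matched comparison group (nearest-neighbor matching without replacement on standardized baseline covariates), using randomly generated stand-in data:

```python
import numpy as np

# Illustration of nearest-neighbor matching without replacement.
# The covariates here are random stand-ins (e.g., prior achievement,
# teaching experience); the actual CREATE matching variables and
# procedure are documented in the study's reports.
rng = np.random.default_rng(0)
create = rng.normal(size=(40, 2))       # 40 CREATE residents, 2 covariates
candidates = rng.normal(size=(100, 2))  # 100 candidate comparison teachers

# Standardize so no single covariate dominates the distance metric
pool = np.vstack([create, candidates])
z = (pool - pool.mean(axis=0)) / pool.std(axis=0)
create_z, candidates_z = z[:40], z[40:]

matches = []
available = set(range(len(candidates_z)))
for i, row in enumerate(create_z):
    dists = np.linalg.norm(candidates_z - row, axis=1)
    j = min(available, key=lambda k: dists[k])  # closest unused candidate
    matches.append((i, j))
    available.remove(j)                         # match without replacement

print(f"Formed {len(matches)} matched pairs")
```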

Confirmatory research questions we will be answering include:

What is the impact of one year of exposure of students to a novice teacher in their second year of residency in the CREATE program, compared to the business-as-usual GSU teacher credential program, on the mathematics and ELA achievement of students in grades 4-8, as measured by the Georgia Milestones Assessment System?

What is the impact of CREATE on the quality of instructional strategies used by teachers, as measured by Teacher Assessment of Performance Standards (TAPS) scores, at the end of the third year of residency, relative to the business-as-usual condition?

What is the impact of CREATE on the quality of the learning environment created by teachers, as measured by TAPS scores, at the end of the third year of residency, relative to the business-as-usual condition?

Exploratory research questions will address additional teacher-level outcomes including retention, effectiveness, satisfaction, collaboration, and levels of stress in relationships with students and colleagues.

We plan to publish the results of this study in fall 2019. Please visit the CREATE webpage to read the research report.

2017-06-06

Academic Researchers Struggle with Research that is Too Expensive and Takes Too Long

I was in DC for an interesting meeting a couple of weeks ago. The “EdTech Efficacy Research Academic Symposium” was very much an academic symposium.

The get-together was sponsored by the Jefferson Education Accelerator (out of the University of Virginia’s school of education) and Digital Promise (an organization that invents ways for school districts to make interesting use of edtech products and concepts). About 32% of the approximately 260 invited attendees were from universities or research organizations that conduct academic-style research. About 16% represented funding or investment organizations and agencies, and another 20% were from companies that produce edtech (often backed by those same funders). Just 6% were school practitioners, and, as would be expected at a DC event, about 26% were from associations and the media.

I represented a research organization with a lot of experience evaluating commercial edtech products. While in the midst of writing research guidelines for the software industry, i.e., for the Software & Information Industry Association (SIIA), I felt a bit like an anthropologist among the predominantly academic crowd, listening to the language and trying to discern the thinking patterns of professors and researchers, both federally and foundation funded. A fundamental belief among this crowd is that real efficacy research is expensive (in the millions of dollars) and slow (a minimum of several years for a research report). A few voices said the cost could be lowered, especially for a school-district-initiated pilot, but the going rate—according to discussions at the meeting—for a simple study starts at $250,000. Given a recent estimate of 4,000 edtech products (and assuming that new products and versions of existing products are being released at an accelerating rate), the annual cost of evaluating all edtech products would be around $1 billion—an amount unlikely to be supported in the current school funding climate.
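
Here is that back-of-the-envelope estimate in a form you can adjust if you question the inputs (both figures come from discussions at the meeting, not from a formal survey):

```python
# Back-of-the-envelope cost of evaluating every edtech product once a year.
# Both inputs are rough figures quoted at the meeting, not survey data.
products = 4_000          # estimated number of edtech products on the market
cost_per_study = 250_000  # quoted floor for a simple efficacy study ($)

annual_cost = products * cost_per_study
print(f"${annual_cost:,} per year")  # $1,000,000,000 per year: about $1 billion
```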

Does efficacy research need to be that expensive and slow, given schools’ widespread data collection, widely available datasets, and powerful computing capabilities? Academic research is expensive for several reasons. There is little incentive for research providers to lower costs: federal agencies offer large contracts that attract the large research organizations with experience and high overhead rates, and other funders are willing to pay top dollar for the prestige of such organizations. University grant writers aim to support a whole scientific research program, which means funding grad students and conducting unique studies that will be attractive to journals. In conventional practice, each study is a custom product; automating repeatable processes is not part of the culture. Indeed, there is an odd culture clash between the academic researchers and the edtech companies needing their services.

Empirical Education is now working with Reach Capital and its portfolio companies to develop an approach for edtech companies and their investors to get low-cost evidence of efficacy. We are also setting down our recommendations in the form of guidelines for edtech companies seeking usable evidence. The document is expected to be released at SIIA’s Education Impact Symposium in July.

2017-05-30