Blog Posts and News Stories

Where's Denis?

It’s been a busy month for Empirical CEO Denis Newman, who has been conspicuously absent from our Palo Alto office as he jet-sets around the country to spread the good word of rigorous evidence in education research.

His first stop was Washington, DC and the conference of the Society for Research on Educational Effectiveness (SREE). This was an opportunity to get together with collaborators, as well as plot proposal writing, blog postings, webinars, and revisions to our research guidelines for edtech impact studies. Andrew Jaciw, Empirical’s Chief Scientist, kept up the company’s methodological reputation with a paper presentation on “Leveraging Fidelity Data to Make Sense of Impact Results.” For Denis, a highlight was dinner with Peg Griffin, a longtime friend and his co-author on The Construction Zone. Then it was on to Austin, TX, for a very different kind of meeting—more of a festival, really.

At this year’s SXSWEDU, Denis was one of three speakers on the panel, “Can Evidence Even Keep Up with Edtech?” The problem presented by the panel was that edtech, as a rapidly moving field, seems to be outpacing the rate of research that stakeholders may want to use to evaluate these products. How, then, could education stakeholders make informed decisions about whether to use edtech products?

According to Denis, the most important thing is for a district to have enough information to know whether a given edtech product may or may not work for that district’s unique population and context. Researchers, therefore, may need to adapt their methods both to differentiate a product’s impact across subgroups and to meet the faster timelines of edtech product development. Empirical’s own solution to this quandary, Evidence as a Service™, offers quick-turnaround research reports that can examine impact and outcomes for specific student subgroups, with methodology that is flexible yet rigorous enough to meet ESSA standards.
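To make the subgroup point concrete, here is a minimal, hypothetical sketch of how differential impact can be estimated; the column names and data below are invented for illustration and do not represent Empirical’s actual models or any district’s data.

```python
# Hypothetical sketch (not Empirical's actual code): a treatment-by-subgroup
# interaction term in an OLS regression estimates how much a product's impact
# differs for one student subgroup. All column names and data are invented.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "treated": rng.integers(0, 2, n),      # 1 = used the edtech product
    "ell": rng.integers(0, 2, n),          # 1 = English learner subgroup
    "pretest": rng.normal(50, 10, n),      # prior achievement score
})
# Simulated outcome: the product helps by ~3 points overall, ~2 points less for ELL students.
df["posttest"] = (df["pretest"] + 3 * df["treated"]
                  - 2 * df["treated"] * df["ell"]
                  + rng.normal(0, 5, n))

# The treated:ell coefficient is the estimated difference in impact for the subgroup.
model = smf.ols("posttest ~ treated * ell + pretest", data=df).fit()
print(model.summary().tables[1])
```

The interaction coefficient is what tells a district how much the average effect shifts for the subgroup it cares about.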

Denis praised the panel, stating, “In the festival’s spirit of invention, our moderator, Mitch Weisberg, masterfully engaged the audience from the beginning to pose the questions for the panel. Great questions, too. I got to cover all of my prepared talking points!”

You can read more coverage of our SXSWEDU panel on EdSurge.

After the panel, a string of meetings and parties kept the energy high and continued to show the growing interest in efficacy. The ISTE meetup was particularly relevant to this theme. The concern raised by ISTE leadership and its members, who are school-based technology users, was that traditional research doesn’t tell practitioners whether a product is likely to work in their school, given its resources and student demographics. Users are faced with hundreds of choices in any product category and have little information for narrowing the choice down to a few products worth piloting.

Following SXSWEDU, it was back to DC for the Consortium for School Networking (CoSN) conference. Denis participated in the annual Feedback Forum hosted by CoSN and the Software & Information Industry Association (SIIA), where SIIA—representing edtech developers—looked for feedback from the CIOs and other school district leaders. This year, SIIA was looking for feedback that would help the Empirical team improve the edtech research guidelines, which are sponsored by SIIA’s Education Technology Industry Network (ETIN). Linda Winter moderated and ran the session like a focus group, asking questions such as:

  • What data do you need from products to gauge engagement?
  • How can the relationship of engagement and achievement indicate that a product is working?
  • What is the role of pilots in measuring success?
  • And before a pilot decision is made, what do CoSN members need to know about edtech products to decide if they are likely to work?

The CoSN members were brutally honest, pointing out that as the leaders responsible for the infrastructure, they were concerned with implementability, bandwidth requirements, and standards such as single sign-on. Whether the software improved learning was secondary; if teachers couldn’t get a program to work, it hardly mattered how effective it might be in other districts.

Now, Denis is preparing for the rest of the spring conference season. Next stop will be New York City and the American Education Research Association (AERA) conference, which attracts over 20,000 researchers annually. The Empirical team will be presenting four studies, as well as co-hosting a cocktail reception with AERA’s school research division. Then, it’s back on the plane for ASU-GSV in San Diego.

Find more information about Evidence as a Service and the edtech research guidelines.

2018-03-26

Conference Season has Arrived

Springtime marks the start of “conference season,” and Empirical Education has been busy preparing for and attending various meetings and events. We are participating in five conferences (CoSN, SIIA, SREE, NCES-MIS, and AERA), and we hope to see some familiar faces in our travels. If you will be attending any of the following meetings, please give us a call. We’d love to schedule a time to speak with you.

CoSN

The Empirical team headed to the 2010 Consortium for School Networking conference in Washington, DC, held at the Omni Shoreham Hotel from February 28 to March 3, 2010. We were joined by Eric Lehew, Executive Director of Learning Support Services at Poway Unified School District, who co-presented a poster with us titled “Turning Existing Data into Research” (Monday, March 1 from 1:00pm to 2:00pm). As exhibitors, we also hosted a 15-minute vendor demonstration, “Building Local Capacity: Using Your Own Data Systems to Easily Measure Program Effectiveness,” to launch our MeasureResults tool.

SIIA

The Software & Information Industry Association held its 2010 Ed Tech Government Forum in Washington, DC on March 3–4. The focus this year was “Education Funding & Programs in a (Post) Stimulus World,” and the speakers included Secretary of Education Arne Duncan and West Virginia Superintendent of Schools Steven Paine.

SREE

Just as the SIIA Forum came to a close, the Society for Research on Educational Effectiveness held its annual conference, Research Into Practice, on March 4–6, where our Chief Scientist, Andrew Jaciw, and Research Scientist, Xiaohui Zheng, presented their poster on estimating long-term program impacts when the control group joins treatment in the short term. Dr. Jaciw was also named on a paper presentation with Rob Olsen of Abt Associates.

Thursday, March 4, 2010
3:30pm–5:00pm: Session 2E (Forum), Research Methodology: Examining State Assessments
Chair: Jane Hannaway, The Urban Institute
“Using State- or Study-Administered Achievement Tests in Impact Evaluations”
Rob Olsen and Fatih Unlu, Abt Associates; Andrew Jaciw, Empirical Education

Friday, March 5, 2010
5:00pm–7:00pm: Poster Session, Research Methodology
“Estimating Long-Term Program Impacts When the Control Group Joins Treatment in the Short-Term: A Theoretical and Empirical Study of the Tradeoffs Between Extra- and Quasi-Experimental Estimates”
Andrew Jaciw, Boya Ma, and Qingfeng Zhao, Empirical Education

NCES-MIS

The 23rd Annual Management Information Systems (MIS) Conference was held in Phoenix, Arizona March 3-5. Co-sponsored by the Arizona Department of Education and the U.S. Department of Education’s National Center for Education Statistics (NCES), the MIS Conference brings together the people who work with information collection, management, transmittal, and reporting in school districts and state education agencies. The majority of the sessions focused on data use, data standards, statewide data systems, and data quality. For more information, refer to the program highlights.

AERA

We will have a strong showing at the American Educational Research Association annual conference in Denver, Colorado from Friday, April 30 through Tuesday, May 4. Please come talk to us at our poster and paper sessions. View our AERA presentation schedule to find out which of our presentations you would like to attend. And we hope to see you at our customary stylish reception Sunday evening, May 2 from 6 to 8:30—mark your calendars!

IES

We will be presenting at the IES Research Conference in National Harbor, MD from June 28-30. View our poster here.

2010-03-12

MeasureResults® to be Launched at CoSN 2010

Empirical Education will launch its web-based educational research solution, MeasureResults, on March 1 at the Consortium for School Networking conference in Washington, DC. MeasureResults is a suite of online tools that makes rigorous research designs and statistical processes accessible to school systems and educational publishers who want to evaluate the effectiveness of products and services aimed at improving student performance.

“MeasureResults will change the way that school districts and product developers conduct rigorous evaluations,” said Denis Newman, Empirical Education President. “Instead of hiring outside evaluators or onsite research experts or statisticians, MeasureResults allows school district personnel to design a study, collect data, and review reports in our user-friendly online platform.”

MeasureResults grew out of a federally funded research project to develop a low-cost method for schools to conduct their own research. The product was developed for commercial distribution under a Small Business Innovation Research grant from the U.S. Department of Education. By moving the educational research processes online, MeasureResults makes school-run evaluations more efficient and less expensive.

2010-02-23

Poway Completes Study from MeasureResults Pilot

The results are in for Poway Unified School District’s first research study using our MeasureResults online tool. PUSD was interested in measuring the impact of CompassLearning’s Odyssey Reading program in the middle grades. Using an “interrupted time series” design with a comparison group, they found that both 7th and 8th grade students averaged 1 to 2 points higher than expected on the NWEA MAP Literacy assessment. PUSD plans to continue their evaluation of CompassLearning Odyssey in different subject areas and grade levels. Join us in D.C. at this year’s CoSN conference on March 1, 2010 as Eric Lehew, Executive Director of Learning Support Services at PUSD, presents findings and reflections on the process of using MeasureResults to conduct research at the local district level.

Click here to download a copy of the PUSD achievement report.
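For readers curious about the mechanics, the comparative interrupted time series logic can be sketched in a few lines: project each group’s pre-intervention trend forward, measure how far the post-intervention scores deviate from that projection, and compare the program group’s deviation with the comparison group’s. The snippet below is a minimal illustration with invented numbers and a hypothetical helper function, not the actual PUSD analysis.

```python
# Hypothetical sketch of a comparative interrupted time series; the scores
# below are invented and the helper function is for illustration only.
import numpy as np

def deviation_from_trend(pre_scores, post_scores):
    """Fit a linear trend to pre-intervention means and return the average
    post-intervention deviation from that projected trend."""
    t_pre = np.arange(len(pre_scores))
    slope, intercept = np.polyfit(t_pre, pre_scores, 1)
    t_post = np.arange(len(pre_scores), len(pre_scores) + len(post_scores))
    projected = intercept + slope * t_post
    return float(np.mean(np.asarray(post_scores) - projected))

# Mean scale scores for four pre-intervention terms and two post terms (invented).
program_dev = deviation_from_trend([210, 212, 213, 215], [219, 221])
comparison_dev = deviation_from_trend([211, 212, 214, 215], [216, 217])

# The difference in deviations is the impact estimate, in scale-score points.
print(round(program_dev - comparison_dev, 1))
```

A full analysis would also model student-level covariates and report statistical uncertainty; the sketch only conveys the basic comparison.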

2010-02-12