An interesting US Department of Education resource is the What Works Clearinghouse website (http://ies.ed.gov/ncee/wwc/), established in the early 2000s as a repository for valid research studies on effective educational practices. The site is intended as a “resource for informed decision making” and “identifies studies that provide credible and reliable evidence of the effectiveness of a given practice, program, or policy.”
In a time when the word rigor is thrown around by school administrators and edtech companies alike, it is safe to say that the WWC’s standards for vetting research studies are indeed rigorous. There is a “fact check” section to counter the idea that “the WWC never finds evidence of positive effects” in their research reviews. They do; they simply consider only research that involves “high-quality evidence” as determined by their exacting standards for research design. Of the studies that clear that bar, about 70% demonstrate some positive effect. The WWC looks at three dimensions of a study to determine validity: the methodology employed, the data collected, and the statistics used. Many studies are deemed “not valid” due to research design issues, narrow interpretations of data, or other flaws.
Their site includes a feature that allows users to search the research results database by keyword. A search for “DreamBox” (software used by students in BCPS elementary schools) brings up ten of the case studies posted by DreamBox Software on their website in support of their product, all of which were rejected by the WWC because they do “not use a comparison group design or a single-case design.” The first study listed on the BCPS STAT bibliography of research website can also be found in the WWC database; it was rejected because “it uses a quasi-experimental design in which the analytic intervention and comparison groups are not shown to be equivalent.” Unfortunately, the other supporting research items on the STAT site have not been vetted by the WWC.
Overall, it’s a great resource for browsing research studies (it’s searchable by topics in education, including Education Technology) and for checking which ones do not make the cut.
Also, a key study cited by the STAT FAQ as evidence that the program’s approach is “beneficial to students” has been reviewed by the National Education Policy Center, which determined that “broad conclusions about the efficacy of technology-based personalized learning, however, are not warranted by the research.”
The study, “Continued Progress: Promising Evidence on Personalized Learning,” published by the Rand Corporation and funded by the Gates Foundation, was criticized on several fronts: “Limitations include a sample of treatment schools that is unrepresentative of the general population of schools, the lack of a threshold in the study for what qualified as implementing “personalized learning” in the treatment schools, and the reality that disruptive strategies such as competency-based progression, which require the largest departures from current practice, were rarely implemented in the studied school.”
Below are links to the BCPS annotated bibliography and STAT FAQ page:
(Note: no research from early-childhood educators included in either link)
Below is an excerpt from a parent letter that also references the RAND study (you can read the whole letter in Parents Do Their Homework: An Engineer and a Professor on STAT):
BCPS points to a RAND study published in November 2015 as evidence for the efficacy of “personalized learning”. (See: the “Continued Progress” download at http://www.rand.org/pubs/research_reports/RR1365.html)
First of all, this study was not published until just recently, so it could not have been part of Dance’s original decision to launch a program like STAT. More importantly, this RAND study was not performed with “regular” randomly-selected schools: it was performed with schools that were funded by the Bill and Melinda Gates Foundation. And some of the schools were completely new.
From page 3 of the RAND study:
“All of the schools received funding from the Gates Foundation, either directly or through intermediary organizations, to implement personalized learning practices as part of at least one of the following three foundation-supported initiatives: Next Generation Learning Challenges (NGLC), Charter School Growth Fund’s Next Generation School Investments, and the Gates Foundation’s Personalized Learning Pilots.”
The “Methods and Limitations” section of the RAND report enlightens us as to why the results of this report are probably not applicable to a school system like BCPS:
From page 6:
“Despite the increased interest in personalized learning, the field lacks evidence about its effectiveness. This study is designed to address this need using the most rigorous method that can be applied to the foundation-funded set of schools.”
Also from page 6: “In particular, given the implementation design for the portfolio of personalized learning schools in the study, it was not possible to create randomly assigned treatment and control groups; nor did we have access to data from neighboring schools that might have matched the personalized learning schools.”
And again from page 6: “As new schools, they lack a history of data from before they began implementing personalized learning, which would have enabled other analytic methods for determining achievement effects.”
Page 14 reveals that these schools were not operated like normal schools:
“Most schools had extended school days or school years, and the extra time was used primarily for additional instruction or to provide individualized support.” Also, nowhere in the report do they discuss the possible effect of class size on achievement. Where is the class-size data? Oddly, these funded schools also spent more time taking their achievement tests, as noted on page 41.
There were a handful of funded district schools involved in the study, but page 13 tells us the effects of personalized learning on those district schools were not very impressive: “Although two of the district schools produced significant positive results, this was offset by negative results in three other district schools…”
So maybe the only line in this entire RAND study that is actually relevant to BCPS is that one little fact we already read about in its “Methods and Limitations” section located on page 6: “Despite the increased interest in personalized learning, the field lacks evidence about its effectiveness.”