More on JHU Researchers and Services; JHU/BCPS and JHU/EIA Connections

This is a follow-up to the earlier JHU/Education Industry Association blog post from April 10.  The point of both posts?  JHU is being paid to evaluate the efficacy of STAT, yet it represents ed-tech vendors rather than looking out for the interests of BCPS students. The evaluation is neither rigorous nor independent.

In fact, reading Dr. Morrison’s full CV (included in the earlier post) reveals a connection to the DreamBox Learning program that BCPS uses to teach elementary school math:

Co-Principal Investigator (2015 – 2016). Efficacy Study of DreamBox Learning Math. DreamBox. Ross, S. M., Principal Investigator.

Morrison, J. R., Ross, S. M., Reilly, J. M., & Cheung, A. C. K. (2016). Retrospective Study of DreamBox Learning Math. Report to DreamBox.

National blogger Peter Greene of Curmudgucation writes about EIA and JHU working together to better market ed-tech products:  Naked Education Profiteering

This JHU press release from March 2012 announces the JHU-EIA partnership:

“The Johns Hopkins University School of Education and the Education Industry Association today announced a partnership building on their individual strengths in educational instruction and reform.”

“Together, the School of Education and EIA, a trade association representing private providers of education services, will create a center for education innovation and entrepreneurship; facilitate relationships between EIA member companies and the School of Education; integrate for-profit programs, products and concepts more deeply into the education sector; and create joint research and education programs.”

“We strongly believe that our school must develop new programs and partnerships with all components of the education sector in order to achieve our vision of realigning our profession and advancing education reform nationwide,” said David W. Andrews, dean of the School of Education. “Forming this strategic partnership with EIA will help the for-profit and not-for-profit education sectors learn from each other, and better enable us to work together for the betterment of all aspects of education.”

Dr. Steven Ross of the JHU School of Education’s Center for Research and Reform in Education (CRRE), the lead STAT evaluator, wrote this article for EIA:  Demonstrating Product Effectiveness:  Is Rigorous Evidence Worth the Rigors?  Here are some highlights from Dr. Ross’s article:

“Because providers strongly believe in what they do, most feel confident that a rigorous evaluation study would present their products in a positive light. The challenge is how to commission and fund such studies. Is striving for the ostensible gold standard– a “randomized controlled trial” (RCT) with a large number of schools, student-level test scores, and all the other trimmings really needed? Such studies are usually quite expensive (think six figures!) to fund. Trying to obtain a federal grant (e.g., “Investing in Innovation” or “i3”) can involve extensive proposal preparations, with steep odds of being selected, and even for the lucky winners, a long wait until the results can be released.”

“My recommendation is to pursue such opportunities where the fit is good and the chances for competing solidly seem strong. But keep in mind that gold-standard studies may actually be “fool’s gold” for many providers. Unless a product is fully developed and delivered in high dosage to students (not as a learning supplement or a support for teachers), it’s quite difficult to show measureable student gains given all the noise (confounding) of so many other classroom, student, and teacher variables. And, as promised above, it seems instructive to take heed of what the district stakeholders said about rigorous evidence in interviews: They rarely read research journals or check out (or even know about) the What Works Clearinghouse (WWC) for research reviews. However, they very much value that a credible third-party evaluator conducted a systematic study of the product being sold. They value evidence of student achievement gains, but with the caveat that the study conditions and the schools involved may be quite different from their own.”

“In our evaluation work with providers, we try to fit the study to the application goals and maturity of the particular product … All of these studies offer the providers potentially useful formative evaluation feedback for program improvement as well as findings from a reasonably rigorous independent study to support and differentiate their products.”

JHU CRRE’s Dr. Ross and Dr. Morrison presented the STAT year-end report at the 7/14/15 BOE meeting (minutes 2:06 to 2:38). Here are the report and evaluation from the BCPS website:

STAT Year-End Evaluation (2014-15)

STAT Year-End Report (2014-15)

Video Highlights from 7/14/15 meeting:

Dr. Ross: “Over time, year two, year three … if things work as they should, you’re gonna be seeing significant improvement in students’ mastery of P21 skills … years 3, years 4 there should be increases in MAP, increases in PARCC …”

BOE commentary at the meeting (paraphrased): the data presented by JHU was a “little lethargic,” and, considering the investment in personnel, training, and curriculum built around the digital devices, the BOE expected “to see Dr. Morrison’s bar charts move in the right direction.”

As reported by EdSurge on April 7, 2016, the assets of EIA are being taken over by the Education Technology Industry Network (ETIN), the education division of the Software & Information Industry Association (SIIA).  The article talks of the above-noted JHU-EIA partnership created in 2012.

“Other assets that Billings’ team will inherit from EIA include a partnership with John Hopkins University to support a “joint center for education innovation and entrepreneurship.” EIA has also worked with Digital Promise to publish reports on barriers to technology procurement in K-12 districts.”

The above-mentioned EIA-Digital Promise partnership also involves JHU, which wrote a study for them, Fostering Market Efficiency in K-12 Ed-tech Procurement. A key finding is that there “are no readily accessible sources of ‘rigorous’ evidence on the effectiveness of the vast majority of ed-tech products. As a result, school districts largely depend on recommendations from peers and from their own teachers and principals who have familiarity with the products.”
