Johns Hopkins University’s Mid-Year STAT Evaluation

STAT Year Three Mid-Year Evaluation Presentation by the Johns Hopkins Center for Research and Reform in Education (CRRE), which is working under a 2014-19 $711,000 contract with the Baltimore County Public Schools system.

STAT Year Three, Mid-Year Evaluation Report by various CRRE researchers, some of whom also conduct reports for edtech industry companies and organizations. That includes an $80,000 study for DreamBox Learning Math, which is also a BCPS vendor with a nearly $1.2 million contract, set to be expanded. “Co-Principal Investigator (2015 – 2016). Efficacy study of DreamBox Learning Math. DreamBox ($80,000).”

February 16, 2017 Baltimore County Board of Education Curriculum and Instruction Committee meeting at which the evaluation was presented.

During the meeting, which was live streamed (see link above), BOE Member Ann Miller posted updates on her Facebook page (NOTE: LH = Lighthouse, the schools where STAT is piloted):

Gilliss: What is the explanation for performance changes? A: Results are not statistically significant. LH schools were slightly more economically disadvantaged. Learning curve. Not enough data to show but encouraging on PARCC compared to state.

Gilliss: Should we expect continued growth as we go forward? A: For PARCC we are looking at LH schools in Y2. I would expect performance to be still low for a new program. These results aren’t saying STAT was effective in achievement, but do say STAT didn’t interfere with achievement in Y2.

BCPS’ New Grading Policy: Part of the Big Personalized Learning Plan

Bottom Line Up Front (but read to the end for important background information):

Due to an outcry from students and parents (and we hope teachers and administrators behind the scenes), the recently revamped BCPS Grading and Reporting Procedures were amended as of 11/1/16.  Here are the changes and a related article: Towson Flyer: Baltimore County schools amending new grading policy

Grading Policy Amendment

These amendments were published directly after the 10/31 forum on the new policy held by BCPS Community Superintendents.  Principals and certain parents were invited to attend and provide feedback.

The Rest of the Story

As noted in our one blog post for the month of June (it was the summer!), the BCPS grading policy underwent a major revision effective 7/1/16.

In early August, schools reached out to parents to explain that in 2014 (when STAT was implemented), a grading and reporting committee made up of parents, teachers, and administrators:

“… reviewed grading and reporting practices from across the state and the nation. Based on the information gathered, the committee determined the policy needed to be rewritten to reflect more current research-based practices to better align your child’s grades with his/her achievement of grade-level standards. To that end, the new Board of Education Policy 5210 Grading and Reporting was approved in June of 2015 for full implementation beginning August, 2016.”

New Policy and Rule 5210

“… all student grades will align to identified course or grade-level standards and be based on a “body of evidence.” A body of evidence is simply the information a teacher collects to determine a student’s level of performance. In addition to making sure grades are based on evidence aligned to standards, (BCPS) wants to ensure that the purpose for assigning grades is clear and consistent across all schools. To do this, BCPS established that the primary purpose for determining marking period grades is to accurately communicate a student’s level of achievement in relation to the course expectations at a given point in time.”

(NOTE: This is key to Mastery-Based Education and computer-delivered curriculum)

“The school system commits to providing equitable, accurate, specific, and timely information regarding student progress towards course expectations, which includes feedback to you and your child in order to guide next steps and indicate growth areas. To promote alignment to research-based practices and stakeholder input, the committee oversaw the creation of a procedures manual, which is broken down into six guiding practices:

  1. Grading practices must be supportive of student learning.
  2. Marking-period grades will be based solely on achievement of course grade-level standards.
  3. Students will have multiple opportunities to demonstrate proficiency.
  4. Grades will be based on a body of evidence aligned to standards.
  5. A consistent grading scale will be used to score assignments and assessments.
  6. Accommodations and modifications will be provided for exceptional learners.”

This sounds somewhat reasonable and child-centered in theory, except for the fact that ASCD is all over this Research & Rationale, which makes them suspect:

https://www.bcps.org/academics/grading/researchRationale.html

The Sun wrote an article about it, as did the Towson Flyer. Dr. Dance felt obliged to write an op-ed in the Sun. BCPS devoted a website page to it; highlights included a video and a MythBusters List.

The New BCPS Grading and Reporting Policy is Explained

As the school year rolled out, unprepared teachers, parents, and students began to realize what was going on and were not happy.  One parent started a petition to rescind the new grading procedures.  Another parent wrote a must-read op-ed about it: Towson Flyer: What’s Behind BCPS’ New Grading Policy?

National ed-blogger Emily Talmage has written about grading policies like BCPS’:  Is Your District Changing its Grading Policy? Here’s the Real Reason Why.

Take the time to read the Towson Flyer op-ed and Talmage’s piece; you’ll understand why the BCPS Grading and Reporting Policy had to change to enable “anytime, anywhere learning.”

Also read this from iNACOL, the International Association for K-12 Online Learning.  iNACOL has a baby named CompetencyWorks, which offered a detailed report on grading changes needed for Competency-based Education (STAT).

“Any school that has begun the journey toward competency education, breaking free of the limitations of the time-based system, will eventually come face-to-face with grading policies and practices. Along with the excitement of creating a new grading system that ignites a dynamic culture of learning will come opportunities to engage students, families and the community in creating a shared vision about the purpose of school. Challenging the traditional system of grading practices, rooted firmly in the American culture with its exhilarating A+ to the dreadful F, will prompt questions, fears, and misconceptions. There are likely to be lively community meetings and even a letter or two in the local newspaper. There will also be the mutual delight when a competency-based grading system is put into place that allows students and teachers to work together toward a shared vision of learning. Most importantly, there will be cause to celebrate as students make progress toward proficiency.”

Mutual delight?

Advice to BCPS Parents from “Wrench in the Gears” and Why iNACOL Loves ESSA

Recent days have seen an uptick in conversations about online Competency-based Education or CBE, the scary wave of educational transformation rapidly sweeping over the country.  BCPS students, teachers, and parents are at the front edge of this wave with STAT. 

Here is a post by a parent of a public school student who advocates for doing much more than just opting out of end-of-the-year tests.

From Wrench in the Gears (A Skeptical Parent’s Thoughts on Digital Curriculum):  Stop! Don’t opt out. Read this first.

National education expert Diane Ravitch recently linked to the blog.

One of the main “benefits” of our 1:1 initiative, according to Dr. Dance, is that it would allow children to be assessed anytime, anywhere. We’re spending millions on contracts to use and sometimes develop computer-based assessments at the end of every unit.

If you have any doubts about whether the No Child Left Behind (NCLB) replacement, the Every Student Succeeds Act (ESSA), is ripe for computer-based personalized learning assessments, consider that iNACOL, the International Association for K-12 Online Learning, a major trade group, and its partners love ESSA. Review the slides from this recent webinar hosted by the iNACOL president, iNACOL’s VP for Federal and State Policy, and KnowledgeWorks’ Senior Director of National Policy and you’ll begin to understand why.

During a keynote presentation at iNACOL’s annual meeting, our own Superintendent said:

“The other conversion was this whole idea around the assessment conversion.  There’s a lot of talk around the country about that right now.  Let’s get away from this idea of paper and pencil, you know, multiple-choice assessments.  How do we assess our students without even stopping class, space and time to do that?  Great teachers do this all the time with formative assessments.  But, we also know, in order to personalize learning for young people, we should be able to assess students at any moment, to figure out what level they’re on, what standards they’ve mastered, so they can move along the continuum as [sic] appropriately.”

Watch here. Go to minute 33.

Read, share these links, ask questions, and follow the suggestions from “Wrench in the Gears” that already apply to those of us in BCPS:

~ If your school offers a device for home use, decline to sign the waiver for it and/or pay the fee.

What happens if you don’t sign the waiver for middle and high school? BCPS needs to make that clear. We also have elementary students using a 1:1 (that means their own) device at school in first grade! Many parents are totally unaware of how much time students are spending with it, or what they are doing. Turns out, BCPS leadership doesn’t know how much time students are spending on it either (at approximately 1:00, we hear that there’s “very limited research” on safe screentime in an educational context)!

~ Refuse to allow your child’s behavioral or social-emotional data to be entered into third-party applications (e.g. Class Dojo).

Ask questions about all the third-party applications being used in BCPS.  Class Dojo tracks behavior.  Check out whether Common Sense Media’s privacy evaluation team has rated the applications. Subscribe to the Parent Coalition for Student Privacy’s blog and check out their back-to-school advice.

~ Refuse in-class social networking programs (e.g. EdModo).

We’re curious about how this is being used in BCPS classrooms and what other social networking software is used.  In general, parents should be very cautious about introducing social media to children – BCPS’ own advice for parents says so.  Parents should have a say about when and how their children are introduced to social networking for school.

~ Set a screentime maximum per day/per week for your child.

Research has shown that when children spend more than a half-hour per day on the computer, learning outcomes are worse. The evaluation of STAT thus far has NO data on learning outcomes. Read the JHU STAT reports here. Ask for homework alternatives that do not require use of a computer. Ask for textbooks so that reading can be done without more time on the computer.

~ Opt young children out of in-school screentime altogether and request paper and pencil assignments and reading from print books (not e-books).

Parents Across America (PAA), a grassroots, non-partisan organization, has a number of useful links. Here are some questions to ask your school.

~ Begin educating parents about the difference between “personalized learning” modules that rely on mining PII (personally-identifiable information) to function properly and technology that empowers children to create and share their own content.

DreamBox and iReady, so-called “personalized learning” software, are being used in BCPS. Neither empowers children to create their own content. See this link on iReady, and this one; this link concerns DreamBox. Look in BCPS One. Ask your kids. Ask your teachers and principals. What else are they using? Log in at home with your child if you can and check it out – if you don’t have access to a computer at home, ask your school to show you the programs in action. You have a right to know what your child is doing at school.

~ Insist that school budgets prioritize human instruction and that hybrid/blended learning not be used as a backdoor way to increase class size or push online classes.

The County Auditor’s report of 2015 notes that class sizes have increased with the implementation of STAT. STAT teachers used to be classroom teachers; they no longer are, instead focusing on professional development. Hybrid and blended learning have a host of definitions, but here are some examples of how they are playing out so far for kids as young as first grade in BCPS.

http://lighthouse.bcps.org/reflections/february-26th-2016

http://lighthouse.bcps.org/reflections/flipped-learning-to-differentiate

As Dr. Dance says:

“Most of the nation’s classrooms have about 30 students in them. How can a teacher personalize and customize unless you leverage technology? In BCPS we have a five-year journey to go 1:1 in grades K-12 to where every single kid has a device.”

But wait.  Respected education policy center NEPC at the University of Colorado says:

“Smaller classes are particularly effective at raising achievement levels of low-income and minority children.”

STAT: Year Two Mid-Year Evaluation Report

“There are three kinds of lies: lies, damn lies, and statistics.”

–Mark Twain

The Year Two Mid-Year Evaluation Report on the Baltimore County Public Schools STAT (Students and Teachers Accessing Tomorrow) initiative was recently released by the Johns Hopkins School of Education’s Center for Research and Reform in Education, and the resulting 69-page document would not disappoint Mr. Twain.

The report, as one might expect from JHU, is clearly written and reasonably thorough, with data parsed, presented, and charted as needed. The report opens by explaining that its purpose is to evaluate “implementations and outcomes” of the STAT program, “…relating to the goals of improving student achievement and preparing globally competitive students” (page 3). However, the very next paragraph clarifies that no, the report “does not examine the achievement of student outcome goals” (3) but rather presents information on the level of professional development offered and a host of “measureable outcomes” (3) from classroom observations. The report offers nothing about pedagogical effectiveness, the thing that actually improves student achievement. It leaves many larger questions unanswered, providing instead a scrim of meaningless data to stand in as proof of effectiveness for a pedagogically dubious program.

The information on professional development in the STAT program was gathered through teacher surveys, and the results are obvious: there have been additional and broader professional development opportunities provided to teachers in Lighthouse Schools, and a majority of teachers have taken advantage of those opportunities. It would be foolhardy to roll out a multi-million-dollar initiative like STAT without some kind of training, and the report finds that yes, there has been training offered, in large, small, and one-on-one settings. However, the stickier questions are not even asked: What kind of professional development was completed? How effective was it for classroom practice? What were the goals? How were they met?

In the section on measurable outcomes, the results of several classroom observations provided data on classrooms, teacher practice, digital content, student engagement, and P21 skills. Now, just because something can be measured does not make it a valuable metric. Take this example from the classroom environment findings: a “majority of classrooms observed in fall 2015 were physically arranged to support collaborative learning, displayed materials to support independent thinking to some extent, and had materials referencing the general subject or content area being taught” (25). What is described here is basically a standard classroom; this is expected practice in K-12 environments, as no caring teacher anywhere ever left a drab room of blank walls when working with children. This so-called “measurable outcome” tells nothing about STAT effectiveness; it’s a good bet that a majority of classrooms were that way before the program even existed. What was interesting in this section, however, was the finding that “students may be less likely to move around the room…considering the availability of information and resources accessed through devices” (25). This is certainly not a positive finding, though proponents of the “just ask Siri” school of research might disagree. What is implied here is that students do not move around much, as they supposedly can get what they need from the screen in front of them. This is not school; this is training for dystopia.

In examining teacher practice, the report found that “nearly all classroom teachers exhibited coaching behavior with students at least occasionally” (28). This is also a measurement of little meaning, as nearly all teachers who work with students in general spend some amount of time in coaching behaviors, teaching behaviors, and other required classroom roles.  Maybe a few might hide behind their desk all day, or perhaps even under it, but these metrics were not included.

Perhaps the most useless metric in the entire report is the one involving digital content. The information was provided by Engrade, the McGraw-Hill property that created the software platform on which BCPS One sits; it is clear they have been logging a copious amount of student data, as they regurgitated some of it for the report to state the obvious: teachers and students in Lighthouse Schools are accessing digital content more frequently. The creation of teacher tiles (program links) for BCPS One increased; teachers in Lighthouse Schools are almost certainly required to be using the platform, so it is little surprise that they have been.  What is surprising is the equating of “student engagement” with “increased student tile views within BCPS one” (47).  Essentially, there have been more teacher and student clicks (of a mouse or browsing button), which tells absolutely nothing about the quality of material that is being clicked upon. Maybe it’s whack-a-mole. Maybe it’s spam. But hey, there’s a lot of clicking going on, so it must be good.

Measuring teacher and student clicks and passing it off as a useful metric is absurd. Clicks tell nothing about quality of materials used or quality of learning outcomes; this is a prime example of being data rich yet content and context poor.

The final section of the evaluation examined the use of P21 skills, which include “problem solving, project-based approaches to instruction, inquiry-based approaches to instruction, and learning that incorporates authentic/real world contexts.” It is important to stop for a second here and note that these ideas do not need to be branded with the Partnership for 21st Century Learning “P21” moniker. These ideas are not new to the 21st century and stretch back to the truly innovative theories of John Dewey and genuine progressive educational thought (which should not be confused with modern “progressive” education that advocates high-stakes testing and computer-driven personalized learning). The Partnership for 21st Century Learning, a lobbying group for educational technology business interests, has glommed onto these ideas with the hope that they will lend some credibility to their organization. They don’t.

It is interesting to note, however, that the STAT report found that “P21 skills were least frequently observed overall” (42) out of all the metrics examined; perhaps the classroom focus has relied too much on technology and devices, crowding out more pedagogically effective methods such as student collaboration, problem-based learning, and other more engaging practices.

It is also worth noting that for an APA-style document, the year two midyear STAT report does not present a single reference or citation. Perhaps this is by design or request, or perhaps it is because the ideas that underpin the STAT initiative have a poor or nonexistent research base. The report presented a whopping three sentences of recommendations for improvement of the program, to include a focus on professional development “specific to desired teaching and learning activities that are less frequently practiced” (48) and a clarification of the role of the STAT teacher.

This issue of clarification was raised by a section in the report that noted a theme of lack of trust of the STAT teacher from some teacher survey responses. A few survey responses were quoted, which included: “the STAT teacher at our school has become evaluative and administrative in nature. It’s very clear that things shared/things seen in classrooms are shared with administration and hold weight”; “many teachers are concerned as to whether STAT teachers are going back to administrators and telling them about problems in the classroom. Are they judges or mentors?”; “she does not keep confidentiality about what we are working on…I am NOT going to ask for help because it is reported to the principal and spoken about later as a weakness” (22). These comments speak volumes about what is left unsaid by the STAT report—the BCPS administration does not operate the program on a principle of support but rather on one of threat and expected compliance.

Baltimore County Public Schools STAT Evaluation Summary on the BCPS site – you can see the links to the reports at the bottom of the page.

As requested by a reader in the comments, you may also be interested in the complicated relationship between those who wrote the STAT Evaluation Summary and those who pay those same evaluators – otherwise known as a conflict of interest:

Johns Hopkins University: Certification for Sale

More on JHU Researchers and Services; JHU/BCPS and JHU/EIA Connections

Are BCPS Students iReady for English-Language Computer Learning?

A retired teacher, Anne Groth, recently took a look at iReady, which is reportedly up for consideration for roll-out in elementary and middle schools next year, pending approval by the Board of Education.  https://teachingafter60.wordpress.com/2016/04/26/are-you-ready-for-iready/#comment-26

She notes that the website claims: “Research proves that i-Ready can deliver transformational results for all students.” Read about it here.

WOW.  That’s quite a claim. 

Here’s the “independent” evaluation of this program, conducted by a for-profit company and posted on the iReady website. The report was not peer reviewed. http://www.casamples.com/downloads/ReadyNYEfficacyStudy512.pdf

The problems with this “research” study are many. First, any research study worth reading includes a limitations section – that is where independent researchers make clear to the reader that they recognize the limitations of their research, what those limitations mean for the conclusions and implications of the research, and ways of combating those limitations in future research.

In this report, there is no limitations section anywhere. Instead, these researchers claim: “In summary, the study demonstrated unequivocally (emphasis ours) that the use of the Ready program resulted in statistically higher performance on the New York State Tests.”

These researchers don’t understand confounding or causality. You cannot compare schools that chose to use a new program to schools that did not, and claim anything about causality “unequivocally.” These folks would be sent back to Biostatistics and Epidemiology 101.

More on confounding variables: http://www.psychologyinaction.org/2011/10/30/what-is-a-confounding-variable/

This is more marketing than research.

Here’s what another blogger explained about educational research (in a recent DreamBox post).

“An interesting US Department of Education resource is the What Works Clearinghouse website (http://ies.ed.gov/ncee/wwc/) established in the early 2000s as a repository for valid research studies on effective educational practices. The site is intended as a “resource for informed decision making” and “identifies studies that provide credible and reliable evidence of the effectiveness of a given practice, program, or policy.” In a time when the word rigor is thrown around by school administrators and edtech companies alike, it is safe to say that the WWC’s standards for vetting research studies are indeed rigorous. There is a “fact check” section to counter the idea that “the WWC never finds evidence of positive effects” in their research reviews. They do…they just only consider research that involves “high-quality evidence” as determined by their very high standards for research design; once a research study is accepted, they find about 70% of them demonstrate some positive effect. The WWC looks at three dimensions of a study to determine validity: methodology employed, data collected, and statistics used. Many studies are deemed ‘not valid’ due to research design issues, narrow interpretations of data, or other flaws.”

There’s nothing in the WWC on iReady or Curriculum Associates.

These reviews of iReady are worth a look – written largely by digitally savvy students, who don’t seem too enthusiastic. http://curriculum-associates.pissedconsumer.com/i-ready-not-20150930708513.html

One more thing – this program is another one that children will do on computers, with headphones, as a solitary activity. While a child is using the program, there’s no interaction with the teacher or other students. There’s nothing collaborative about this. This isn’t building a mechanical flower for Earth Day.

iReady means time spent on a computer, by yourself, with headphones on, learning, and being tested, via animated characters. As Anne Groth points out, learning to write is inherently collaborative: “The communication between teacher and student, among students, and at home when writing is shared with parents is just something a computer program cannot offer.”  

How much time will students be asked to spend on iReady per week in first grade?  In fourth?  In sixth?

If it is a lot of time, that’s worrisome, as we wonder: what educational practice is being decreased to find the time?

If it isn’t much time, is it worth the cost per student? We would like to know: what do our teachers really want in the classroom?

More on JHU Researchers and Services; JHU/BCPS and JHU/EIA Connections

This is a follow-up to the other JHU/Education Industry Association blog post from April 10. The point of both posts? JHU is being paid to evaluate the efficacy of STAT, yet it represents ed-tech vendors rather than looking out for the interests of BCPS students. The evaluation is not rigorous or independent.

In fact, in reading Dr. Morrison’s full CV in the earlier post, one sees the connection to the DreamBox Learning program being used by BCPS to teach elementary school math:

Co-Principal Investigator (2015 – 2016). Efficacy Study of DreamBox Learning Math. DreamBox. Ross, S. M., Principal Investigator.

Morrison, J. R., Ross, S. M., Reilly, J. M., & Cheung, A. C. K. (2016). Retrospective Study of DreamBox Learning Math. Report to DreamBox.

National Blogger Peter Greene of Curmudgucation writes about EIA and JHU working together to better market ed-tech products:  Naked Education Profiteering

This JHU Press Release of March 2012 announces the JHU-EIA Partnership:

“The Johns Hopkins University School of Education and the Education Industry Association today announced a partnership building on their individual strengths in educational instruction and reform.”

“Together, the School of Education and EIA, a trade association representing private providers of education services, will create a center for education innovation and entrepreneurship; facilitate relationships between EIA member companies and the School of Education; integrate for-profit programs, products and concepts more deeply into the education sector; and create joint research and education programs.”

“We strongly believe that our school must develop new programs and partnerships with all components of the education sector in order to achieve our vision of realigning our profession and advancing education reform nationwide,” said David W. Andrews, dean of the School of Education. “Forming this strategic partnership with EIA will help the for-profit and not-for-profit education sectors learn from each other, and better enable us to work together for the betterment of all aspects of education.”

JHU School of Education, Center for Research and Reform in Education’s (CRRE) Dr. Steven Ross (the main STAT evaluator) wrote this article for EIA:  Demonstrating Product Effectiveness:  Is Rigorous Evidence Worth the Rigors?  Here are some highlights from Dr. Ross’ article:

“Because providers strongly believe in what they do, most feel confident that a rigorous evaluation study would present their products in a positive light. The challenge is how to commission and fund such studies. Is striving for the ostensible gold standard– a “randomized controlled trial” (RCT) with a large number of schools, student-level test scores, and all the other trimmings really needed? Such studies are usually quite expensive (think six figures!) to fund. Trying to obtain a federal grant (e.g., “Investing in Innovation” or “i3”) can involve extensive proposal preparations, with steep odds of being selected, and even for the lucky winners, a long wait until the results can be released.”

“My recommendation is to pursue such opportunities where the fit is good and the chances for competing solidly seem strong. But keep in mind that gold-standard studies may actually be “fool’s gold” for many providers. Unless a product is fully developed and delivered in high dosage to students (not as a learning supplement or a support for teachers), it’s quite difficult to show measureable student gains given all the noise (confounding) of so many other classroom, student, and teacher variables. And, as promised above, it seems instructive to take heed of what the district stakeholders said about rigorous evidence in interviews: They rarely read research journals or check out (or even know about) the What Works Clearinghouse (WWC) for research reviews. However, they very much value that a credible third-party evaluator conducted a systematic study of the product being sold. They value evidence of student achievement gains, but with the caveat that the study conditions and the schools involved may be quite different from their own.”

“In our evaluation work with providers, we try to fit the study to the application goals and maturity of the particular product … All of these studies offer the providers potentially useful formative evaluation feedback for program improvement as well as findings from a reasonably rigorous independent study to support and differentiate their products.”

JHU CRRE’s Dr. Ross and Dr. Morrison presented the STAT year-end report at the 7/14/15 BOE meeting (minutes 2:06 to 2:38). Here are the report and evaluation from the BCPS website:

STAT Year-End Evaluation (2014-15)

STAT Year-End Report (2014-15)

Video Highlights from 7/14/15 meeting:

Dr. Ross: “Over time, year two, year three … if things work as they should, you’re gonna be seeing significant improvement in students’ mastery of P21 skills … years 3, years 4 there should be increases in MAP, increases in PARCC …”

BOE Commentary at meeting (paraphrased): the data presented by JHU was a “little lethargic” and, considering the investment in personnel, training, and curriculum based around the digital devices, the BOE expected “to see Dr. Morrison’s bar charts move in the right direction.”

As reported by EdSurge on April 7, 2016, the assets of EIA are being taken over by the Education Technology Industry Network (ETIN), the education division of the Software & Information Industry Association (SIIA).  The article talks of the above-noted JHU-EIA partnership created in 2012.

“Other assets that Billings’ team will inherit from EIA include a partnership with Johns Hopkins University to support a “joint center for education innovation and entrepreneurship.” EIA has also worked with Digital Promise to publish reports on barriers to technology procurement in K-12 districts.”

The above-mentioned EIA-Digital Promise partnership includes JHU, which wrote a study for them, Fostering Market Efficiency in K-12 Ed-tech Procurement. A key finding is that there “are no readily accessible sources of “rigorous” evidence on the effectiveness of the vast majority of ed-tech products. As a result, school districts largely depend on recommendations from peers and from their own teachers and principals who have familiarity with the products.”

Johns Hopkins University: Certification for Sale

NOTE: This information was found by way of a comment left by Dr. Laura H. Chapman, an educator and education researcher, on the STAT-us BCPS post of March 22.  You can see the comment at the bottom of that post.

DID YOU KNOW that an ed-tech vendor can pay the Johns Hopkins School of Education to certify the efficacy of a product or service?

The Education Industry Association (EIA), which has the strategic goal to “support the role of the private sector in public education” and works to expand business opportunities for education entrepreneurs in PreK-12 markets, has partnered with JHU to offer certifications.

EIA notes that the “vibrant” PreK-12 education industry is “poised for explosive growth … in fact, education is rapidly becoming a $1 trillion industry, second in size only to the healthcare industry, and represents 10 percent of America’s GNP. Federal, state and local expenditures on education exceed $750 billion.”

THIS IS THE EIA HOMEPAGE:

NEW MARKETING STRATEGY!

EIA members can now certify
their services through
Johns Hopkins University!


Johns Hopkins University School of Education is now offering Program Design Reviews for EIA Members!
Dear EIA Members and Potential Members:

“Strong entrepreneurial education companies are constantly seeking new ways to market and promote their products and services. Proving the efficacy of your product or service is the single best way to attract new customers, making the “procurement process” much simpler.”

“EIA is now offering an amazing opportunity for its current members and for those wishing to join the Association. Beginning immediately, for a very small investment, EIA members can utilize the services of the Johns Hopkins School of Education (JHU). The team at JHU is offering program design reviews at an extremely discounted rate exclusively for EIA members. There are multiple levels of review your company can participate in, based on your budget and desired review level.”

“I know this might seem a bit intimidating, but, trust me; it is well worth your time and investment. Can you imagine walking into a Superintendent’s office armed with a positive outcome report by none other than the Johns Hopkins School of Education?! Do you think your competitors will have this feather in their cap? The answer is a resounding NO!”

“Picture your new marketing campaign that features your positive outcome with the Johns Hopkins School of Education! And most importantly, imagine what you will learn about your own product or service and the best ways to continually improve in order to produce the best educational outcomes for your students. You actually owe it to yourself, to your investors, and to your students to participate in this incredible opportunity to bring further legitimacy to your company.”

“As you work with the team at JHU, you will choose one of five levels of review: an Instructional Design Review, a Short-Cycle Evaluation Study, a Case Study, an Efficacy Study, or an Effectiveness Study. Choose the level you’re comfortable with; for even a small investment of a few thousand dollars, you can have the Johns Hopkins seal of approval attached to your company.”

“Instructional Design Review: This is the perfect package for many EIA companies. After successfully completing the review process, your company will be issued a Johns Hopkins University Certificate for Completion of a Successful Design Review. Again, imagine having that ammunition during your next district meeting! Using rubric assessments aligned with instructional design standards and best practices, your products and programs will be reviewed in domains that include the logic of your model, its theoretical framework, your use of evidence-based strategies, customer analyses, instructional objectives, pedagogy, and delivery/user support. $3,500-$5,000”

“THE FOLLOWING OPTIONS ARE ALSO AVAILABLE FOR LARGER, MORE ESTABLISHED EIA COMPANIES.

Short-Cycle Evaluation Study: These are quick-turnaround “pilots” of products (typically ed-tech based), which use observations, surveys, and interviews with teachers and students in a 10 to 15 week period to determine the potential effectiveness of a product for broader adoption in a school district or group of schools. Educational improvements, adapted to different types of learners, are directly informed by results. This represents a more significant investment, and is geared toward the medium to larger size company within EIA. $10,000-$13,000”

“Case Study: These are small mixed-methods descriptive studies, which are more intensive and rigorous than short-cycle studies. Similar to the latter, they employ observations, interviews, and surveys that focus on educational curricula programs, and services and how they are received and used by target consumers (e.g., teachers, students, parents, etc.). $15,000-$20,000”

“Efficacy Study: This is a medium-scale study that focuses on how programs and educational offerings operate and affect educational outcomes in try-outs in pilot schools or small treatment-control group comparisons. $20,000-$35,000″

“Effectiveness Study: This is a larger-scale “summative evaluation” study that focuses on the success of the program in improving outcomes in rigorous non-randomized (“quasi”) experimental studies or randomized controlled trials. $38,000-up.”

“Again, the first offering – the Instructional Design Review – is the perfect fit for many EIA companies. To get started you only need to do two things: be an EIA member at any level of membership (and if you’re not a member, NOW is the time to join) and then contact me directly to put you in touch with the Johns Hopkins School of Education.”

“The Dean of the JHU School of Education, David Andrews, along with his colleagues, will also be in attendance at this summer’s EDVentures conference in Orlando, July 15 – 17. I encourage you to register for this amazing conference immediately before we are sold out; to do so, please click here to register. I look forward to your future success!”

Jim Giovannini
EIA Executive Director
703-938-2429
Jim@educationindustry.org

JHU’s Dr. Steven Ross and Dr. Jennifer Morrison are evaluating the STAT program here in Baltimore County Public Schools.  Ross and Morrison are also presenting at the July 2016 EIA Conference (Demonstrating Product Effectiveness Through Third-Party Evaluations).

You can see Morrison’s CV here where it outlines her involvement in evaluating the STAT program*.  As the EIA website explains, for “even a small investment of a few thousand dollars, you can have the Johns Hopkins seal of approval.”

BCPS is paying $695,000 over 5 years for its STAT evaluation.

David Andrews, the Dean of the JHU School of Education, sits on the EIA Board of Directors, although according to January 2016 information from JHU, Andrews was set to leave Hopkins on April 1, 2016, to lead National University, the second-largest private nonprofit university in California.

*Please click on Morrison’s “Show complete CV” to show the most recent report completed: Morrison, J. R., Ross, S. M., Cheung, A. C. K., Reid, A. J., & Dusablon, T. (2016). Students and Teachers Accessing Tomorrow: Year two mid-year evaluation report. Report to Baltimore County Public Schools.

When will we get to see the latest report results, BCPS?