STAT Year Three Year-end Evaluation

Despite Dr. Dance’s departure, Interim Superintendent Ms. Verletta White is committed to STAT.  As she noted in her message to the BCPS community, First-Week Thoughts, July 6, 2017:

“I do want to make clear that we are not changing course or introducing new initiatives. Our schools are doing well, and technology is a key leverage tool for personalized learning. S.T.A.T. digital learning and Passport elementary world language instruction are just part of how we do business.”

The STAT digital learning initiative kicked off in BCPS in 2014.  Three years later, here are the Johns Hopkins University Center for Research and Reform in Education (CRRE) Year Three report and slide-show presentation for 2016-17.  Both were presented at the Board of Education’s August 8, 2017 meeting (click on the Meetings tab).  Video is available here; the STAT report begins about 1 hour, 11 minutes into the meeting.

The BOE Curriculum Committee will discuss STAT at its Thursday, September 14 meeting; the meeting begins at 4:30pm in Building E, Room 114.  Meetings are recorded and archived.

Report Highlights:

“An examination of MAP scores in Lighthouse and non-Lighthouse Grades 1 -3 showed some impact on student achievement. Lighthouse students in Grades 1-2 exhibited improvements in reading and mathematics scores across all three years of implementation and Grade 3 increased reading and mathematics scores in all but the present year. Further, all grades exceeded the national average mathematics and reading scores. Non-Lighthouse Grades 1-3 also exhibited improvements in reading scores across all three years and, similarly, Grades 1-2 increased mathematics scores. Grades 1-3 exceeded the national.”

“Principals and S.T.A.T. teachers perceived that enhanced teaching practices and stronger curricula were increasing mastery of CCSS. However, they were generally hesitant to attribute the MAP gains directly or solely to S.T.A.T. We agree with this assessment for several reasons. First, gains in achievement were not projected by the Logic Model this early in the implementation, although we cannot rule out more rapidly occurring impacts. Second, there are numerous programs and initiatives in BCPS, which could contribute to improved student achievement independently of S.T.A.T.”

Lighthouse (LH, pilot) Grade 3 MAP (RIT) Reading Scores:

Pre-program (2013-14): 188.52

Year 1 (2014-15): 194.14

Year 2 (2015-16): 198.37

Year 3 (2016-17): 197.49

Non-LH Grade 3 MAP (RIT) Reading Scores:

Pre-program (2014-15): 194.30

Year 1 (2015-16): 196.30

Year 2 (2016-17): 196.57

STAT-us BCPS NOTE:  While MAP gains are indicated in the report, a close reading shows them to be marginal. The RIT (Rasch Unit) score reflects a student’s academic knowledge, skills, and abilities. RIT scores range from 100 to 350. Additionally, MAP (a growth measure) and PARCC (a proficiency measure) are very different.  As noted on Page 134 of the BCPS FY18 Operating Budget, in FY2016, only 50.2% of third-graders were reading on grade level. Minimal growth in RIT scores is not closing major achievement gaps.
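To put "marginal" in concrete terms, the year-over-year changes can be computed directly from the RIT scores quoted above (a minimal sketch: the figures are those reported by CRRE; the variable names and the comparison itself are ours):

```python
# Year-over-year changes in the reported Grade 3 MAP (RIT) reading scores.
# Scores are copied from the figures quoted above.
lighthouse = {"2013-14": 188.52, "2014-15": 194.14, "2015-16": 198.37, "2016-17": 197.49}
non_lighthouse = {"2014-15": 194.30, "2015-16": 196.30, "2016-17": 196.57}

def deltas(scores):
    """Change in score from each year to the next."""
    years = list(scores)
    return {f"{a}->{b}": round(scores[b] - scores[a], 2) for a, b in zip(years, years[1:])}

print(deltas(lighthouse))      # {'2013-14->2014-15': 5.62, '2014-15->2015-16': 4.23, '2015-16->2016-17': -0.88}
print(deltas(non_lighthouse))  # {'2014-15->2015-16': 2.0, '2015-16->2016-17': 0.27}
```

On a scale that runs from 100 to 350, single-digit changes — including a Year 3 decline for the Lighthouse cohort — are the "gains" in question.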

Issues:

~ Off-task device use. “Teachers at all levels described the challenge of monitoring and managing device use during instructional hours, and their comments reflected those when asked to describe off-task/inappropriate use above.”

~ Technical issues. “Some of the technical issues expressed by middle and high school teachers centered on students’ lack of accountability with devices, such as returning to school with a depleted battery, forgetting the device at home, or breaking the devices. Other technical issues mentioned by teachers at all grade level included slow Internet or BCPSOne not functioning.”

~ Lack of support. Teachers at all levels conveyed feeling overwhelmed and not supported with technology integration. Some teachers described not having enough time for planning, as noted by a Lighthouse middle school teacher: “TIME!! More planning time is definitely needed!!!” Others mentioned the challenge of attempting to learn new approaches to instruction along with other initiatives. A Lighthouse elementary teacher described the struggle of “incorporating the new grading system at the same time as technology,” while another noted, “My greatest challenge is just not taking on too much at one time. Learning each new innovative ‘thing’ at a time rather than trying to do it all at once.” Others echoed this sentiment, as a Lighthouse middle school teacher described the challenge of “deciding which resources to use and which to pass on. There were plenty of resources available but it felt as though I was supposed to utilize as many as I could rather than focusing on/mastering a few. I eventually minimized the resources I utilized.”

Of great concern was JHU researcher Dr. Morrison’s statement, made during her presentation to the BOE, that the initiative was overwhelmingly supported by the community.  This claim was based on the 2017 Stakeholder Survey, which offered three vague and leading questions about personalized learning and technology.  These questions would mean little to community members altogether unaware of STAT, or to high school students and parents not even connected to it (in 2016-17, the initiative was in place in only four Lighthouse (pilot) high schools).

Personalized Learning:  Parents, school-based staff, and central office staff expressed the most agreement that making learning personalized for students helps teachers meet the academic needs of all students.

Access to Technology:  Agreement was high across students, parents, school-based staff, and central office staff that access to technology increases opportunities for making learning more personalized for students.

Teacher Use of Technology:  Students, parents, school administrators, and central office staff had similarly high levels of agreement that teachers can use technology to meet the academic needs of all students.


Johns Hopkins University’s Mid-Year STAT Evaluation

STAT Year Three Mid-Year Evaluation Presentation by the Johns Hopkins Center for Research and Reform in Education (CRRE), which is working under a 2014-19, $711,000 contract with the Baltimore County Public School system.

STAT Year Three, Mid-Year Evaluation Report by various CRRE researchers, some of whom also conduct studies for edtech industry companies and organizations.  That includes an $80,000 study for DreamBox Learning Math; DreamBox is also a BCPS vendor with a nearly $1.2 million contract, set to be expanded.  “Co-Principal Investigator (2015 – 2016). Efficacy study of DreamBox Learning Math. DreamBox ($80,000).”

February 16, 2017 Baltimore County Board of Education Curriculum and Instruction Committee meeting at which the evaluation was presented.

During the meeting, which was live streamed (see link above), BOE Member Ann Miller posted summaries of the Q&A on her Facebook page (NOTE:  LH = Lighthouse, the schools where STAT is piloted):

“Gilliss: What is the explanation for performance changes? A: Results are not statistically significant. LH schools were slightly more economically disadvantaged. Learning curve. Not enough data to show, but encouraging on PARCC compared to state.”

“Gilliss: Should we expect continued growth as we go forward? A: PARCC, we are looking at LH schools in Y2. I would expect performance to be still low for a new program. These results aren’t saying STAT was effective in achievement, but they do say STAT didn’t interfere with achievement in Y2.”

BCPS’ New Grading Policy: Part of the Big Personalized Learning Plan

Bottom Line Up Front (but read to the end for important background information):

Due to an outcry from students and parents (and we hope teachers and administrators behind the scenes), the recently revamped BCPS Grading and Reporting Procedures were amended as of 11/1/16.  Here are the changes and a related article: Towson Flyer: Baltimore County schools amending new grading policy

Grading Policy Amendment

These amendments were published directly after the 10/31 forum on the new policy held by BCPS Community Superintendents.  Principals and certain parents were invited to attend and provide feedback.

The Rest of the Story

As noted in our one blog post for the month of June (it was the summer!), the BCPS grading policy underwent a major revision effective 7/1/16.

In early August, schools reached out to parents to explain that in 2014 (when STAT was implemented), a grading and reporting committee made up of parents, teachers, and administrators:

” … reviewed grading and reporting practices from across the state and the nation. Based on the information gathered, the committee determined the policy needed to be rewritten to reflect more current research-based practices to better align your child’s grades with his/her achievement of grade-level standards. To that end, the new Board of Education Policy 5210 Grading and Reporting was approved in June of 2015 for full implementation beginning August, 2016.”

New Policy and Rule 5210

” … all student grades will align to identified course or grade-level standards and be based on a “body of evidence.” A body of evidence is simply the information a teacher collects to determine a student’s level of performance. In addition to making sure grades are based on evidence aligned to standards, (BCPS) wants to ensure that the purpose for assigning grades is clear and consistent across all schools. To do this, BCPS established that the primary purpose for determining marking period grades is to accurately communicate a student’s level of achievement in relation to the course expectations at a given point in time.”

(NOTE: This is key to Mastery-Based Education and computer-delivered curriculum)

“The school system commits to providing equitable, accurate, specific, and timely information regarding student progress towards course expectations which includes feedback to you and your child in order to guide next steps and indicate growth areas. To promote alignment to research-based practices and stakeholder input, the committee oversaw the creation of a procedures manual, which is broken down into six guiding practices:

  1. Grading practices must be supportive of student learning.
  2. Marking-period grades will be based solely on achievement of course grade-level standards.
  3. Students will have multiple opportunities to demonstrate proficiency.
  4. Grades will be based on a body of evidence aligned to standards.
  5. A consistent grading scale will be used to score assignments and assessments.
  6. Accommodations and modifications will be provided for exceptional learners.”

This sounds somewhat reasonable and child-centered in theory, except that ASCD is all over the accompanying Research & Rationale, which makes it suspect:

https://www.bcps.org/academics/grading/researchRationale.html

The Sun wrote an article about it, as did the Towson Flyer.  Dr. Dance felt obliged to write an op-ed in the Sun.  BCPS devoted a website page to it; highlights included a video and a MythBusters list.

The New BCPS Grading and Reporting Policy is Explained

As the school year rolled out, unprepared teachers, parents, and students began to realize what was going on and were not happy.  One parent started a petition to rescind the new grading procedures.  Another parent wrote a must-read op-ed about it: Towson Flyer: What’s Behind BCPS’ New Grading Policy?

National ed-blogger Emily Talmage has written about grading policies like BCPS’:  Is Your District Changing its Grading Policy? Here’s the Real Reason Why.

Take the time to read the Towson Flyer op-ed and Talmage’s piece; you’ll understand why the BCPS Grading and Reporting Policy had to change to enable “anytime, anywhere learning.”

Also read this from iNACOL, the International Association for K-12 Online Learning.  iNACOL has a baby named CompetencyWorks, which offered a detailed report on the grading changes needed for Competency-Based Education (STAT).

“Any school that has begun the journey toward competency education, breaking free of the limitations of the time-based system, will eventually come face-to-face with grading policies and practices. Along with the excitement of creating a new grading system that ignites a dynamic culture of learning will come opportunities to engage students, families and the community in creating a shared vision about the purpose of school. Challenging the traditional system of grading practices, rooted firmly in the American culture with its exhilarating A+ to the dreadful F, will prompt questions, fears, and misconceptions. There are likely to be lively community meetings and even a letter or two in the local newspaper. There will also be the mutual delight when a competency-based grading system is put into place that allows students and teachers to work together toward a shared vision of learning. Most importantly, there will be cause to celebrate as students make progress toward proficiency.”

Mutual delight?

Advice to BCPS Parents from “Wrench in the Gears” and Why iNACOL Loves ESSA

Recent days have seen an uptick in conversations about online Competency-based Education or CBE, the scary wave of educational transformation rapidly sweeping over the country.  BCPS students, teachers, and parents are at the front edge of this wave with STAT. 

Here is a post by a parent of a public school student who advocates for doing much more than just opting out of end-of-the-year tests.

From Wrench in the Gears (A Skeptical Parent’s Thoughts on Digital Curriculum):  Stop! Don’t opt out. Read this first.

National education expert Diane Ravitch recently linked to the blog.

One of the main “benefits” of our 1:1 initiative, according to Dr. Dance, is that it would allow children to be assessed anytime, anywhere. We’re spending millions on contracts to use and sometimes develop computer-based assessments at the end of every unit.

If you have any doubts about whether the No Child Left Behind (NCLB) replacement, the Every Student Succeeds Act (ESSA), is ripe for computer-based personalized learning assessments, iNACOL, the International Association for K-12 Online Learning, a major trade group, and its partners love ESSA.  Review the slides from this recent webinar hosted by the iNACOL president, iNACOL’s VP for Federal and State Policy, and KnowledgeWorks’ Senior Director of National Policy and you’ll begin to understand why.

During a keynote presentation at iNACOL’s annual meeting, our own Superintendent said:

“The other conversion was this whole idea around the assessment conversion.  There’s a lot of talk around the country about that right now.  Let’s get away from this idea of paper and pencil, you know, multiple-choice assessments.  How do we assess our students without even stopping class, space and time to do that?  Great teachers do this all the time with formative assessments.  But, we also know, in order to personalize learning for young people, we should be able to assess students at any moment, to figure out what level they’re on, what standards they’ve mastered, so they can move along the continuum as [sic] appropriately.”

Watch here. Go to minute 33.

Read, share these links, ask questions, and follow the suggestions from “Wrench in the Gears” that already apply to those of us in BCPS:

~ If your school offers a device for home use, decline to sign the waiver for it and/or pay the fee.

What happens if you don’t sign the waiver for middle and high school?  BCPS needs to make that clear.  We also have elementary students using a 1:1 (that means their own) device at school in first grade!   Many parents are totally unaware how much time students are spending with it, or what they are doing.  Turns out, BCPS leadership doesn’t know how much time students are spending on it either (at approximately 1:00, we hear that there’s “very limited research” on safe screentime in an educational context)!

~ Refuse to allow your child’s behavioral or social-emotional data to be entered into third-party applications. (e.g. Class Dojo)

Ask questions about all the third-party applications being used in BCPS.  Class Dojo tracks behavior.  Check out whether Common Sense Media’s privacy evaluation team has rated the applications. Subscribe to the Parent Coalition for Student Privacy’s blog and check out their back-to-school advice.

~ Refuse in-class social networking programs (e.g. EdModo).

We’re curious about how this is being used in BCPS classrooms and what other social networking software is used.  In general, parents should be very cautious about introducing social media to children – BCPS’ own advice for parents says so.  Parents should have a say about when and how their children are introduced to social networking for school.

~ Set a screentime maximum per day/per week for your child.

Research has shown that when children are spending more than a half-hour per day on the computer, learning outcomes are worse.  The evaluation of STAT thus far has NO data on learning outcomes.  Read the JHU STAT reports here. Ask for homework alternatives that do not require use of a computer.  Ask for textbooks so that reading can be done without more time on the computer.

~ Opt young children out of in-school screentime altogether and request paper and pencil assignments and reading from print books (not e-books).

Parents Across America (PAA), a grassroots, non-partisan organization, has a number of useful links.  Here are some questions to ask your school.

~ Begin educating parents about the difference between “personalized learning” modules that rely on mining PII (personally-identifiable information) to function properly and technology that empowers children to create and share their own content.

Dreambox and iReady, so-called “personalized learning” software, are being used in BCPS.  Neither empowers children to create their own content.  See this link on iReady, and this one; this link concerns Dreambox.  Look in BCPSone.  Ask your kids.  Ask your teachers and principals.  What else are they using?  Log in at home with your child if you can and check it out – if you don’t have access to a computer at home, ask your school to show you the programs in action.  You have a right to know what your child is doing at school.

~ Insist that school budgets prioritize human instruction and that hybrid/blended learning not be used as a backdoor way to increase class size or push online classes.

The County Auditor’s report of 2015 notes that class sizes have increased with the implementation of STAT.  STAT teachers used to be classroom teachers – they no longer are, instead focusing on professional development.  Hybrid and blended learning have a host of definitions, but here are some examples of how they are playing out so far for kids as young as first grade in BCPS.

http://lighthouse.bcps.org/reflections/february-26th-2016

http://lighthouse.bcps.org/reflections/flipped-learning-to-differentiate

As Dr. Dance says:

“Most of the nation’s classrooms have about 30 students in them. How can a teacher personalize and customize unless you leverage technology?  In BCPS we have a five-year journey to go 1:1 in grades K-12, to where every single kid has a device.”

But wait.  Respected education policy center NEPC at the University of Colorado says:

“Smaller classes are particularly effective at raising achievement levels of low-income and minority children.”

STAT: Year Two Mid-Year Evaluation Report

“There are three kinds of lies: lies, damn lies, and statistics.”

–Mark Twain

The Year Two Mid-Year Evaluation Report on the Baltimore County Public Schools STAT (Students and Teachers Accessing Tomorrow) initiative was recently released by the Johns Hopkins School of Education’s Center for Research and Reform in Education, and the resulting 69-page document would not disappoint Mr. Twain.

The report, as one might expect from JHU, is clearly written and reasonably thorough, with data parsed, presented, and charted as needed. The report opens by explaining that its purpose is to evaluate “implementations and outcomes” of the STAT program, “…relating to the goals of improving student achievement and preparing globally competitive students” (page 3). However, the very next paragraph clarifies that no, the report “does not examine the achievement of student outcome goals” (3) but rather presents information on the level of professional development offered and a host of “measureable outcomes” (3) from classroom observations.  The report offers nothing about pedagogical effectiveness, the thing that actually improves student achievement. It also leaves many larger questions unanswered, providing a scrim of meaningless data to stand in as proof of effectiveness for a pedagogically dubious program.

The information on professional development in the STAT program was gathered through teacher surveys, and the results are obvious: additional and broader professional development opportunities have been provided to teachers in Lighthouse Schools, and a majority of teachers have taken advantage of them. It would be foolhardy to roll out a multi-million-dollar initiative like STAT without some kind of training, and the report finds that yes, training has been offered, in large-group, small-group, and one-on-one settings. However, the stickier questions are not even asked: What kind of professional development was completed? How effective was it for classroom practice? What were the goals? How were they met?

In the section on measurable outcomes, the results of several classroom observations provided data on classrooms, teacher practice, digital content, student engagement, and P21 skills. Now, just because something can be measured does not make it a valuable metric. Take this example from the classroom environment findings: a “majority of classrooms observed in fall 2015 were physically arranged to support collaborative learning, displayed materials to support independent thinking to some extent, and had materials referencing the general subject or content area being taught” (25).  What is described here is basically a standard classroom; this is expected practice in K-12 environments, as no caring teacher anywhere ever left a drab room of blank walls when working with children.  This so-called “measurable outcome” tells nothing about STAT effectiveness; it’s a good bet that a majority of classrooms were that way before the program even existed. What was interesting in this section, however, was the finding that “students may be less likely to move around the room…considering the availability of information and resources accessed through devices” (25). This is certainly not a positive finding, though proponents of the “just ask Siri” school of research might disagree. What is implied here is that students do not move around much, as they supposedly can get what they need from the screen in front of them. This is not school, this is training for dystopia.

In examining teacher practice, the report found that “nearly all classroom teachers exhibited coaching behavior with students at least occasionally” (28). This is also a measurement of little meaning, as nearly all teachers who work with students in general spend some amount of time in coaching behaviors, teaching behaviors, and other required classroom roles.  Maybe a few might hide behind their desk all day, or perhaps even under it, but these metrics were not included.

Perhaps the most useless metric in the entire report is the one involving digital content. The information was provided by Engrade, the McGraw-Hill property that created the software platform on which BCPS One sits; it is clear they have been logging a copious amount of student data, as they regurgitated some of it for the report to state the obvious: teachers and students in Lighthouse Schools are accessing digital content more frequently. The creation of teacher tiles (program links) for BCPS One increased; teachers in Lighthouse Schools are almost certainly required to be using the platform, so it is little surprise that they have been.  What is surprising is the equating of “student engagement” with “increased student tile views within BCPS one” (47).  Essentially, there have been more teacher and student clicks (of a mouse or browsing button), which tells absolutely nothing about the quality of material that is being clicked upon. Maybe it’s whack-a-mole. Maybe it’s spam. But hey, there’s a lot of clicking going on, so it must be good.

Measuring teacher and student clicks and passing it off as a useful metric is absurd. Clicks tell nothing about quality of materials used or quality of learning outcomes; this is a prime example of being data rich yet content and context poor.

The final section of the evaluation examined the use of P21 skills, which include “problem solving, project-based approaches to instruction, inquiry-based approaches to instruction, and learning that incorporates authentic/real world contexts.” It is important to stop for a second here and note that these ideas do not need to be branded with the Partnership for 21st Century Learning “P21” moniker. These ideas are not new to the 21st century and stretch back to the truly innovative theories of John Dewey and genuine progressive educational thought (which should not be confused with modern “progressive” education that advocates high-stakes testing and computer-driven personalized learning). The Partnership for 21st Century Learning, a lobbying group for educational technology business interests, has glommed onto these ideas with the hope that they will lend some veracity to their organization. They don’t.

It is interesting to note, however, that the STAT report found that “P21 skills were least frequently observed overall” (42) out of all the metrics examined; perhaps the classroom focus has relied too much on technology and devices, crowding out more pedagogically effective methods such as student collaboration, problem-based learning, and other more engaging practices.

It is also worth noting that for an APA-style document, the year two midyear STAT report does not present a single reference or citation. Perhaps this is by design or request, or perhaps it is because the ideas that underpin the STAT initiative have a poor or nonexistent research base. The report presented a whopping three sentences of recommendations for improvement of the program, to include a focus on professional development “specific to desired teaching and learning activities that are less frequently practiced” (48) and a clarification of the role of the STAT teacher.

This issue of clarification was raised by a section in the report that noted a theme of lack of trust of the STAT teacher from some teacher survey responses. A few survey responses were quoted, which included: “the STAT teacher at our school has become evaluative and administrative in nature. It’s very clear that things shared/things seen in classrooms are shared with administration and hold weight”; “many teachers are concerned as to whether STAT teachers are going back to administrators and telling them about problems in the classroom. Are they judges or mentors?”; “she does not keep confidentiality about what we are working on…I am NOT going to ask for help because it is reported to the principal and spoken about later as a weakness” (22). These comments speak volumes about what is left unsaid by the STAT report—the BCPS administration does not operate the program on a principle of support but rather on one of threat and expected compliance.

Baltimore County Public Schools STAT Evaluation Summary on the BCPS site; you can see the links to the reports at the bottom of the page.

As requested by a reader in the comments, you may also be interested in the complicated relationship between those who wrote the STAT Evaluation Summary and those who pay these same evaluators, otherwise known as conflict of interest:

Johns Hopkins University: Certification for Sale

More on JHU Researchers and Services; JHU/BCPS and JHU/EIA Connections

Are BCPS Students iReady for English-Language Computer Learning?

A retired teacher, Anne Groth, recently took a look at iReady, which is reportedly up for consideration for roll-out in elementary and middle schools next year, pending approval by the Board of Education.  https://teachingafter60.wordpress.com/2016/04/26/are-you-ready-for-iready/#comment-26

She notes that the website claims: “Research proves that i-Ready can deliver transformational results for all students.”  Read about it here.

WOW.  That’s quite a claim. 

Here’s the “independent” evaluation of the program, conducted by a for-profit company and posted on the iReady website.  The report was not peer reviewed.  http://www.casamples.com/downloads/ReadyNYEfficacyStudy512.pdf

The problems with this “research” study are many.  First, any research study worth reading includes a limitations section – that is where independent researchers make clear to the reader that they recognize the limitations of their research, what those limitations mean for the conclusions and implications, and how they might be addressed in future research.

In this report, there is no limitations section anywhere.  Instead, the researchers claim, “In summary, the study demonstrated unequivocally (emphasis ours) that the use of the Ready program resulted in statistically higher performance on the New York State Tests.”

These researchers don’t understand confounding or causality.  You cannot compare schools that chose to use a new program with schools that did not, and claim anything about causality “unequivocally.”  These folks would be sent back to Biostatistics and Epidemiology 101.

For a primer on confounding variables, see: http://www.psychologyinaction.org/2011/10/30/what-is-a-confounding-variable/

This is more marketing than research.
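The confounding problem is easy to demonstrate with a toy simulation (illustrative numbers only – not BCPS or iReady data): if better-resourced schools are more likely to adopt a program, adopters will outscore non-adopters even when the program itself does nothing.

```python
import random

random.seed(1)  # fixed seed so the run is reproducible

adopters, others = [], []
for _ in range(500):  # 500 hypothetical schools
    resources = random.gauss(0, 1)                    # hidden confounder: school resources
    p_adopt = 1 / (1 + 2.71828 ** (-2 * resources))   # better-resourced schools self-select in
    score = 200 + 5 * resources + random.gauss(0, 3)  # true program effect: ZERO
    (adopters if random.random() < p_adopt else others).append(score)

mean = lambda xs: sum(xs) / len(xs)
gap = mean(adopters) - mean(others)
print(f"adopters: {mean(adopters):.1f}  non-adopters: {mean(others):.1f}  gap: {gap:.1f}")
```

The sizable gap this prints comes entirely from self-selection, not from the program.  A credible efficacy study would randomize assignment, or at least control for baseline differences – and would say so in a limitations section.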

Here’s what another blogger explained about educational research (Recent Dreambox post).

“An interesting US Department of Education resource is the What Works Clearinghouse website (http://ies.ed.gov/ncee/wwc/) established in the early 2000s as a repository for valid research studies on effective educational practices. The site is intended as a “resource for informed decision making” and “identifies studies that provide credible and reliable evidence of the effectiveness of a given practice, program, or policy.” In a time when the word rigor is thrown around by school administrators and edtech companies alike, it is safe to say that the WWC’s standards for vetting research studies are indeed rigorous. There is a “fact check” section to counter the idea that “the WWC never finds evidence of positive effects” in their research reviews. They do…they just only consider research that involves “high-quality evidence” as determined by their very high standards for research design; once a research study is accepted, they find about 70% of them demonstrate some positive effect. The WWC looks at three dimensions of a study to determine validity: methodology employed, data collected, and statistics used. Many studies are deemed ‘not valid’ due to research design issues, narrow interpretations of data, or other flaws.”

There’s nothing in the WWC on iReady, or Curriculum Associates. 

These reviews of iReady are worth a look – written largely by digitally savvy students, who don’t seem too enthusiastic.  http://curriculum-associates.pissedconsumer.com/i-ready-not-20150930708513.html

One more thing – this program is another one that children will do on computers, with headphones, as a solitary activity. While a child is actually using the program, there’s no interaction with the teacher or other students. There’s nothing collaborative about it.  This isn’t building a mechanical flower for Earth Day.

iReady means time spent alone at a computer, with headphones on, learning and being tested via animated characters. As Anne Groth points out, learning to write is inherently collaborative: “The communication between teacher and student, among students, and at home when writing is shared with parents is just something a computer program cannot offer.”

How much time will students be asked to spend on iReady per week in first grade?  In fourth?  In sixth?

If it is a lot of time, that’s worrisome: what educational practice is being cut back to make room for it?

If it isn’t much time, is it worth the per-student cost? We would also like to know: what do our teachers really want in the classroom?

More on JHU Researchers and Services; JHU/BCPS and JHU/EIA Connections

This is a follow-up to the other JHU/Education Industry Association blog post from April 10.  The point of both posts?  JHU is being paid to evaluate the efficacy of STAT, but it represents ed-tech vendors rather than looking out for the interests of BCPS students. The evaluation is neither rigorous nor independent.

In fact, reading Dr. Morrison’s full CV in the earlier post, one sees the connection to the DreamBox Learning program being used by BCPS to teach elementary school math:

Co-Principal Investigator (2015 – 2016). Efficacy Study of DreamBox Learning Math. DreamBox. Ross, S. M., Principal Investigator.

Morrison, J. R., Ross, S. M., Reilly, J. M., & Cheung, A. C. K. (2016). Retrospective Study of DreamBox Learning Math. Report to DreamBox.

National blogger Peter Greene of Curmudgucation writes about EIA and JHU working together to better market ed-tech products:  Naked Education Profiteering

This JHU Press Release of March 2012 announces the JHU-EIA Partnership:

“The Johns Hopkins University School of Education and the Education Industry Association today announced a partnership building on their individual strengths in educational instruction and reform.”

“Together, the School of Education and EIA, a trade association representing private providers of education services, will create a center for education innovation and entrepreneurship; facilitate relationships between EIA member companies and the School of Education; integrate for-profit programs, products and concepts more deeply into the education sector; and create joint research and education programs.”

“We strongly believe that our school must develop new programs and partnerships with all components of the education sector in order to achieve our vision of realigning our profession and advancing education reform nationwide,” said David W. Andrews, dean of the School of Education. “Forming this strategic partnership with EIA will help the for-profit and not-for-profit education sectors learn from each other, and better enable us to work together for the betterment of all aspects of education.”

JHU School of Education, Center for Research and Reform in Education’s (CRRE) Dr. Steven Ross (the main STAT evaluator) wrote this article for EIA:  Demonstrating Product Effectiveness:  Is Rigorous Evidence Worth the Rigors?  Here are some highlights from Dr. Ross’ article:

“Because providers strongly believe in what they do, most feel confident that a rigorous evaluation study would present their products in a positive light. The challenge is how to commission and fund such studies. Is striving for the ostensible gold standard– a “randomized controlled trial” (RCT) with a large number of schools, student-level test scores, and all the other trimmings really needed? Such studies are usually quite expensive (think six figures!) to fund. Trying to obtain a federal grant (e.g., “Investing in Innovation” or “i3”) can involve extensive proposal preparations, with steep odds of being selected, and even for the lucky winners, a long wait until the results can be released.”

“My recommendation is to pursue such opportunities where the fit is good and the chances for competing solidly seem strong. But keep in mind that gold-standard studies may actually be “fool’s gold” for many providers. Unless a product is fully developed and delivered in high dosage to students (not as a learning supplement or a support for teachers), it’s quite difficult to show measureable student gains given all the noise (confounding) of so many other classroom, student, and teacher variables. And, as promised above, it seems instructive to take heed of what the district stakeholders said about rigorous evidence in interviews: They rarely read research journals or check out (or even know about) the What Works Clearinghouse (WWC) for research reviews. However, they very much value that a credible third-party evaluator conducted a systematic study of the product being sold. They value evidence of student achievement gains, but with the caveat that the study conditions and the schools involved may be quite different from their own.”

“In our evaluation work with providers, we try to fit the study to the application goals and maturity of the particular product … All of these studies offer the providers potentially useful formative evaluation feedback for program improvement as well as findings from a reasonably rigorous independent study to support and differentiate their products.”

JHU CRRE’s Dr. Ross and Dr. Morrison presented the STAT year-end report at the 7/14/15 BOE meeting (minutes 2:06 to 2:38). Here are the report and evaluation from the BCPS website:

STAT Year-End Evaluation (2014-15)

STAT Year-End Report (2014-15)

Video Highlights from 7/14/15 meeting:

Dr. Ross: “Over time, year two, year three … if things work as they should, you’re gonna be seeing significant improvement in students’ mastery of P21 skills … years 3, years 4 there should be increases in MAP, increases in PARCC …”

BOE Commentary at meeting (paraphrased): the data presented by JHU was a “little lethargic” and, considering the investment in personnel, training, and curriculum based around the digital devices, the BOE expected “to see Dr. Morrison’s bar charts move in the right direction.”

As reported by EdSurge on April 7, 2016, the assets of EIA are being taken over by the Education Technology Industry Network (ETIN), the education division of the Software & Information Industry Association (SIIA).  The article discusses the JHU-EIA partnership, noted above, that was created in 2012.

“Other assets that Billings’ team will inherit from EIA include a partnership with John Hopkins University to support a “joint center for education innovation and entrepreneurship.” EIA has also worked with Digital Promise to publish reports on barriers to technology procurement in K-12 districts.”

The above-mentioned EIA-Digital Promise partnership includes JHU, which wrote a study for them, Fostering Market Efficiency in K-12 Ed-tech Procurement. A key finding is that there “are no readily accessible sources of ‘rigorous’ evidence on the effectiveness of the vast majority of ed-tech products. As a result, school districts largely depend on recommendations from peers and from their own teachers and principals who have familiarity with the products.”