S.T.A.T. Year Four Evaluation Report

At the October 23, 2018 Board of Education meeting, the program’s evaluator, Johns Hopkins University’s Center for Research and Reform in Education (CRRE), presented the S.T.A.T. Year Four evaluation. The report’s summary notes:

“The impacts of S.T.A.T. on student achievement remain encouraging but still indeterminate given the still relatively short duration of the initiative. Arguably, the primary goal of technology integration is to prepare students for using 21st century learning tools independently and skillfully to increase interest in learning and readiness for postsecondary and career success. Raising performance on standardized achievement tests is also a desirable goal, but one affected by many factors such as core curricula, supplementary educational programming (e.g., after-school, enrichment, and remedial support), school resources, and student characteristics. Importantly, most teachers and principals, particularly those in the most experienced cohorts, continue to hold positive perceptions of the initiative’s impact on CCSS mastery, while acknowledging that measurable impacts on student PARCC or MAP achievement are not yet clear.”

“As the initiative has expanded, so have certain challenges intrinsic to student-centered learning in general and classroom technology integration in particular. When students learn independently and collaboratively, opportunities for students to engage in off-task and disruptive behavior can increase relative to teacher-directed instruction. Recreational activity during class, such as playing games, surfing the Internet, and communicating with peers via cell phones or social media, may prove challenging for teachers, inexperienced in technology integration, to control.”

“Future improvement needs and recommendations include continuing to (a) expand professional development support for teachers on student-centered and P21 instructional practices; and (b) implement strategies to prevent and address student off-task behaviors while using devices, both laptops and cell phones. We also suggest the district revisits the policy allowing students to take devices home each day.”

Read the full report here.

Read the report’s addendum here.

Review the JHU presentation here.


STAT Year Three Year-end Evaluation: UPDATED

A review of the data and of recent Johns Hopkins University evaluations reveals that the slight, statistically insignificant academic gains found at BCPS schools cannot be attributed to STAT: as principals and outside experts indicated, other programs and efforts to raise achievement were put in place at the same time. The “Lighthouse” (pilot) school comparison figures BCPS provided for students assigned laptops 1:1 also appear to be “cherry picked.”

Do such lukewarm outcomes justify the exorbitant costs, or support expanding a digital initiative costing nearly $300 million for the first 6 years alone ($60 million a year), plus millions in digital curricula?

Despite Dr. Dance’s departure, Interim Superintendent Verletta White says she remains committed to STAT.  As she noted in her July 6, 2017 message to the BCPS community, First-Week Thoughts:

“I do want to make clear that we are not changing course or introducing new initiatives.  Our schools are doing well, and technology is a key leverage tool for personalized learning. S.T.A.T. digital learning and Passport elementary world language instruction are just part of how we do business.”

The STAT digital learning initiative kicked off in BCPS in 2014.  Three years later, here are the Johns Hopkins University Center for Research and Reform in Education (CRRE) Year Three report and slide-show presentation for 2016-17.  Both were presented at the Board of Education’s August 8, 2017 meeting (click on the Meetings tab).  Video available here.  The STAT report begins about 1 hour, 11 minutes into the meeting.

Highlights from the 8/8/17 Meeting:

  • JHU researchers: rare to see P21 skills integrated into instruction
  • JHU researchers: education research is biased; in reference to successful 1:1 initiatives, it’s the “survivors” that make it into research studies (and there aren’t many of them)

UPDATED TO INCLUDE INFORMATION FROM 9/14/17 CURRICULUM COMMITTEE MEETING AND 9/26/17 BOARD OF EDUCATION MEETING

The BOE Curriculum Committee discussed STAT at its September 14 meeting; the meeting was recorded and archived.

Comments made at the meeting:

  • BCPS is proud of the progress achieved but recognizes it has work to do.
  • The system is moving in the expected direction at the expected pace.

STAT was also discussed at the September 26, 2017 Board of Education meeting during a REPORT ON STUDENT ACHIEVEMENT – MULTIPLE MEASURES OF PERFORMANCE (https://www.boarddocs.com/mabe/bcps/Board.nsf/Public; Meetings tab, select 2017, select the 9/26/17 meeting; the presentation begins around Minute 1:39).

Moving beyond just MAP and PARCC, BCPS looked at a “constellation” of measures.  KEY POINT made at Minute 1:52 ~ Kindergarten readiness continues to drop; students are coming into the system less prepared (6 out of 10 students).  This is due in part to poverty.  There’s a strong relationship between poverty and student achievement.  We’re confronted with poverty, at higher levels in BCPS elementary and middle schools than in the state as a whole, and we’re looking to close achievement gaps over time.

STAT-us BCPS Comment:  BCPS knows that poverty is the most important indicator, yet hundreds (and hundreds) of millions are spent on devices (and everything that goes along with them) to close gaps (and a close review of the BCPS Multiple Measures presentation shows very weak results) instead of addressing poverty and its effects?   What about community schools with wrap-around services?   What about expanding feeding programs?   What about small class sizes, increased support staff, and mentoring programs?

A chart showed that PARCC scores were higher at 10 Lighthouse schools vs. non-LH schools and state schools.  This is misleading; 3 of the 10 LH schools (Fort Garrison, Mays Chapel, and Rodgers Forge) are in economically advantaged areas.  One school, Joppa View, is in a somewhat advantaged area.  Their scores (the proportion meeting and/or exceeding CCR) pulled up the group’s overall average, masking results at the LH schools in economically disadvantaged areas (Chase, Church Lane, Edmondson Heights, Halstead, Hawthorne, and Lansdowne); see the sketch after the report-card figures below.

Visit schools’ websites to view report cards:

http://www.bcps.org ~ Our Schools ~ School Directory ~ Elementary Schools ~ select school ~ gray box on right has MSDE Report Card for 2016

For example:

Fort Garrison ELA 3 ~ Meeting: 56.8; Exceeding: 25
Fort Garrison ELA 4 ~ Meeting: 36.8; Exceeding: 35.3
Fort Garrison ELA 5 ~ Meeting: 52.2; Exceeding: less than or equal to 5.0
Edmondson Heights ELA 3 ~ Meeting: 6.9; Exceeding: less than or equal to 5.0
Edmondson Heights ELA 4 ~ Meeting: 16.3; Exceeding: less than or equal to 5.0
Edmondson Heights ELA 5 ~ Meeting: 9.9; Exceeding: less than or equal to 5.0
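To see how much a few advantaged schools can pull up a group average, here is a minimal sketch in Python using the two schools’ Grade 3 ELA figures above. The equal weighting of the two schools is a hypothetical assumption for illustration; actual tested-student counts would shift the exact pooled number, but not the point.

```python
# Minimal sketch: pooling an advantaged and a disadvantaged school.
# Percentages are the Grade 3 ELA report-card figures quoted above;
# "Exceeding: less than or equal to 5.0" is taken as 5.0 (assumption).
fort_garrison = 56.8 + 25.0       # % meeting or exceeding
edmondson_heights = 6.9 + 5.0     # % meeting or exceeding

# Assumption (hypothetical): equal numbers of tested students.
pooled = (fort_garrison + edmondson_heights) / 2

print(f"Fort Garrison:     {fort_garrison:.1f}%")      # 81.8%
print(f"Edmondson Heights: {edmondson_heights:.1f}%")  # 11.9%
print(f"Pooled average:    {pooled:.1f}%")             # roughly 47%
# The pooled figure describes neither school: it hides an ~82% school
# and an ~12% school inside the same "Lighthouse" group average.
```

A pooled “Lighthouse average” near 47% looks respectable, yet it says nothing about the disadvantaged LH schools the chart was implicitly crediting.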

Johns Hopkins University Report Highlights:

“An examination of MAP scores in Lighthouse and non-Lighthouse Grades 1-3 showed some impact on student achievement.  Lighthouse students in Grades 1-2 exhibited improvements in reading and mathematics scores across all three years of implementation and Grade 3 increased reading and mathematics scores in all but the present year.  Further, all grades exceeded the national average mathematics and reading scores. Non-Lighthouse Grades 1-3 also exhibited improvements in reading scores across all three years and, similarly, Grades 1-2 increased mathematics scores. Grades 1-3 exceeded the national average.”

“Principals and S.T.A.T. teachers perceived that enhanced teaching practices and stronger curricula were increasing mastery of CCSS.  However, they were generally hesitant to attribute the MAP gains directly or solely to S.T.A.T.  We agree with this assessment for several reasons.  First, gains in achievement were not projected by the Logic Model this early in the implementation, although we cannot rule out more rapidly occurring impacts.  Second, there are numerous programs and initiatives in BCPS, which could contribute to improved student achievement independently of S.T.A.T.”

Lighthouse (LH, pilot) Grade 3 MAP (RIT) Reading Scores:

Pre-program (2013-14): 188.52
Year 1 (2014-15): 194.14
Year 2 (2015-16): 198.37
Year 3 (2016-17): 197.49

Non-LH Grade 3 MAP (RIT) Reading Scores:

Pre-program (2014-15): 194.30
Year 1 (2015-16): 196.30
Year 2 (2016-17): 196.57

STAT-us BCPS NOTE:  While MAP gains are indicated in the report, a close reading shows them to be marginal.  The RIT (Rasch Unit) score reflects a student’s academic knowledge, skills, and abilities.  RIT scores range from 100 to 350.  Additionally, MAP (a growth measure) and PARCC (a proficiency measure) are very different.  As noted on Page 134 of the BCPS FY18 Operating Budget, in FY2016, only 50.2% of third-graders were reading on grade level. Minimal growth in RIT scores is not closing major achievement gaps.
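To make “marginal” concrete, here is a minimal sketch in Python that computes the year-over-year changes in the Grade 3 RIT figures quoted above. The deltas are a handful of points on a scale that runs from roughly 100 to 350, and the Lighthouse score actually fell in Year 3.

```python
# Year-over-year changes in the Grade 3 MAP reading (RIT) scores
# reported above. RIT is roughly a 100-350 scale, so these deltas
# are small fractions of the scale's range.
lighthouse = {"2013-14": 188.52, "2014-15": 194.14,
              "2015-16": 198.37, "2016-17": 197.49}
non_lighthouse = {"2014-15": 194.30, "2015-16": 196.30,
                  "2016-17": 196.57}

def yearly_deltas(scores):
    """Map each consecutive pair of years to the score change."""
    years = list(scores)
    return {f"{a} -> {b}": round(scores[b] - scores[a], 2)
            for a, b in zip(years, years[1:])}

print("Lighthouse:    ", yearly_deltas(lighthouse))
# {'2013-14 -> 2014-15': 5.62, '2014-15 -> 2015-16': 4.23,
#  '2015-16 -> 2016-17': -0.88}   <- Year 3 declined
print("Non-Lighthouse:", yearly_deltas(non_lighthouse))
# {'2014-15 -> 2015-16': 2.0, '2015-16 -> 2016-17': 0.27}
```

Single-digit RIT movements, one of them negative, are thin evidence for a program of this cost.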

Issues:

~ Off-task device use. “Teachers at all levels described the challenge of monitoring and managing device use during instructional hours, and their comments reflected those when asked to describe off-task/inappropriate use above.”

~ Technical issues. “Some of the technical issues expressed by middle and high school teachers centered on students’ lack of accountability with devices, such as returning to school with a depleted battery, forgetting the device at home, or breaking the devices.  Other technical issues mentioned by teachers at all grade levels included slow Internet or BCPSOne not functioning.”

~ Lack of support. Teachers at all levels conveyed feeling overwhelmed and not supported with technology integration.  Some teachers described not having enough time for planning, as noted by a Lighthouse middle school teacher: “TIME!! More planning time is definitely needed!!!”  Others mentioned the challenge of attempting to learn new approaches to instruction along with other initiatives.  A Lighthouse elementary teacher described the struggle of “incorporating the new grading system at the same time as technology,” while another noted, “My greatest challenge is just not taking on too much at one time.  Learning each new innovative ‘thing’ at a time rather than trying to do it all at once.”  Others echoed this sentiment, as a Lighthouse middle school teacher described the challenge of “deciding which resources to use and which to pass on.  There were plenty of resources available but it felt as though I was supposed to utilize as many as I could rather than focusing on/mastering a few. I eventually minimized the resources I utilized.”

Of great concern was JHU researcher Dr. Morrison’s statement, made during her presentation to the BOE, that the initiative was overwhelmingly supported by the community.  This was based on the 2017 Stakeholder Survey, which offered three vague and leading questions regarding personalized learning and technology.  These questions would mean little to community members altogether unaware of STAT, or to high school students and parents not even connected to STAT (in 2016-17, the initiative was in place in only four Lighthouse (pilot) high schools).

Personalized Learning:  Parents, school-based staff, and central office staff expressed the most agreement that making learning personalized for students helps teachers meet the academic needs of all students.

Access to Technology:  Agreement was high across students, parents, school-based staff, and central office staff that access to technology increases opportunities for making learning more personalized for students.

Teacher Use of Technology:  Students, parents, school administrators, and central office staff had similarly high levels of agreement that teachers can use technology to meet the academic needs of all students.

Johns Hopkins University’s Mid-Year STAT Evaluation

STAT Year Three Mid-Year Evaluation Presentation by the Johns Hopkins Center for Research and Reform in Education (CRRE), which is working under a 2014-19, $711,000 contract with the Baltimore County Public Schools system.

STAT Year Three, Mid-Year Evaluation Report by various CRRE researchers, some of whom also conduct studies for ed-tech industry companies and organizations; that includes an $80,000 study for DreamBox Learning Math, which is also a BCPS vendor with a nearly $1.2 million contract that is set to be expanded. “Co-Principal Investigator (2015 – 2016). Efficacy study of DreamBox Learning Math. DreamBox ($80,000).”

February 16, 2017 Baltimore County Board of Education Curriculum and Instruction Committee meeting at which the evaluation was presented.

During the meeting, which was live streamed (see link above), BOE Member Ann Miller posted to her Facebook page (NOTE: LH = Lighthouse, the schools where STAT is piloted):

Gilliss: What is the explanation for the performance changes? A: The results are not statistically significant. LH schools were slightly more economically disadvantaged. Learning curve. Not enough data to show, but encouraging on PARCC compared to the state.

Gilliss: Should we expect continued growth as we go forward? A: For PARCC, we are looking at LH schools in Year 2. I would expect performance to still be low for a new program. These results aren’t saying STAT was effective in achievement, but they do say STAT didn’t interfere with achievement in Year 2.

BCPS’ New Grading Policy: Part of the Big Personalized Learning Plan

Bottom Line Up Front (but read to the end for important background information):

Due to an outcry from students and parents (and we hope teachers and administrators behind the scenes), the recently revamped BCPS Grading and Reporting Procedures were amended as of 11/1/16.  Here are the changes and a related article: Towson Flyer: Baltimore County schools amending new grading policy

Grading Policy Amendment

These amendments were published directly after the 10/31 forum on the new policy held by BCPS Community Superintendents.  Principals and certain parents were invited to attend and provide feedback.

The Rest of the Story

As noted in our one blog post for the month of June (it was the summer!), the BCPS grading policy underwent a major revision effective 7/1/16.

In early August, schools reached out to parents to explain that in 2014 (when STAT was implemented), a grading and reporting committee made up of parents, teachers, and administrators:

” … reviewed grading and reporting practices from across the state and the nation. Based on the information gathered, the committee determined the policy needed to be rewritten to reflect more current research-based practices to better align your child’s grades with his/her achievement of grade-level standards. To that end, the new Board of Education Policy 5210 Grading and Reporting was approved in June of 2015 for full implementation beginning August, 2016.”

New Policy and Rule 5210

” … all student grades will align to identified course or grade-level standards and be based on a “body of evidence.” A body of evidence is simply the information a teacher collects to determine a student’s level of performance. In addition to making sure grades are based on evidence aligned to standards, (BCPS) wants to ensure that the purpose for assigning grades is clear and consistent across all schools. To do this, BCPS established that the primary purpose for determining marking period grades is to accurately communicate a student’s level of achievement in relation to the course expectations at a given point in time.”

(NOTE: This is key to Mastery-Based Education and computer-delivered curriculum)

“The school system commits to providing equitable, accurate, specific, and timely information regarding student progress towards course expectations, which includes feedback to you and your child in order to guide next steps and indicate growth areas. To promote alignment to research-based practices and stakeholder input, the committee oversaw the creation of a procedures manual, which is broken down into six guiding practices:

  1. Grading practices must be supportive of student learning.
  2. Marking-period grades will be based solely on achievement of course grade-level standards.
  3. Students will have multiple opportunities to demonstrate proficiency.
  4. Grades will be based on a body of evidence aligned to standards.
  5. A consistent grading scale will be used to score assignments and assessments.
  6. Accommodations and modifications will be provided for exceptional learners.”

This sounds somewhat reasonable and child-centered in theory, except for the fact that ASCD is all over this Research & Rationale, which makes them suspect:

https://www.bcps.org/academics/grading/researchRationale.html

The Sun wrote an article about it, as did the Towson Flyer.  Dr. Dance felt obliged to write an op-ed in the Sun.  BCPS devoted a website page to it; highlights included a video and a MythBusters List.

The New BCPS Grading and Reporting Policy is Explained

As the school year rolled out, unprepared teachers, parents, and students began to realize what was going on and were not happy.  One parent started a petition to rescind the new grading procedures.  Another parent wrote a must-read op-ed about it: Towson Flyer: What’s Behind BCPS’ New Grading Policy?

National ed-blogger Emily Talmage has written about grading policies like BCPS’:  Is Your District Changing its Grading Policy? Here’s the Real Reason Why.

Take the time to read the Towson Flyer op-ed and Talmage’s piece; you’ll understand why the BCPS Grading and Reporting Policy had to change to enable “anytime, anywhere learning.”

Also read this from iNACOL, the International Association for K-12 Online Learning.  iNACOL has a baby named CompetencyWorks, which offered a detailed report on the grading changes needed for Competency-based Education (STAT).

“Any school that has begun the journey toward competency education, breaking free of the limitations of the time-based system, will eventually come face-to-face with grading policies and practices. Along with the excitement of creating a new grading system that ignites a dynamic culture of learning will come opportunities to engage students, families and the community in creating a shared vision about the purpose of school. Challenging the traditional system of grading practices, rooted firmly in the American culture with its exhilarating A+ to the dreadful F, will prompt questions, fears, and misconceptions. There are likely to be lively community meetings and even a letter or two in the local newspaper. There will also be the mutual delight when a competency-based grading system is put into place that allows students and teachers to work together toward a shared vision of learning. Most importantly, there will be cause to celebrate as students make progress toward proficiency.”

Mutual delight?

Advice to BCPS Parents from “Wrench in the Gears” and Why iNACOL Loves ESSA

Recent days have seen an uptick in conversations about online Competency-based Education or CBE, the scary wave of educational transformation rapidly sweeping over the country.  BCPS students, teachers, and parents are at the front edge of this wave with STAT. 

Here is a post by a parent of a public school student who advocates for doing much more than just opting out of end-of-the-year tests.

From Wrench in the Gears (A Skeptical Parent’s Thoughts on Digital Curriculum):  Stop! Don’t opt out. Read this first.

National education expert Diane Ravitch recently linked to the blog.

One of the main “benefits” of our 1:1 initiative, according to Dr. Dance, is that it would allow children to be assessed anytime, anywhere. We’re spending millions on contracts to use and sometimes develop computer-based assessments at the end of every unit.

If you have any doubts about whether the Every Student Succeeds Act (ESSA), the replacement for No Child Left Behind (NCLB), is ripe for computer-based personalized learning assessments, consider that iNACOL, the International Association for K-12 Online Learning, a major trade group, and its partners love ESSA.  Review the slides from this recent webinar hosted by the iNACOL president, iNACOL’s VP for Federal and State Policy, and KnowledgeWorks’ Senior Director of National Policy and you’ll begin to understand why.

During a keynote presentation at iNACOL’s annual meeting, our own Superintendent said:

“The other conversion was this whole idea around the assessment conversion.  There’s a lot of talk around the country about that right now.  Let’s get away from this idea of paper and pencil, you know, multiple-choice assessments.  How do we assess our students without even stopping class, space and time to do that?  Great teachers do this all the time with formative assessments.  But, we also know, in order to personalize learning for young people, we should be able to assess students at any moment, to figure out what level they’re on, what standards they’ve mastered, so they can move along the continuum as [sic] appropriately.”

Watch here. Go to minute 33.

Read, share these links, ask questions, and follow the suggestions from “Wrench in the Gears” that already apply to those of us in BCPS:

~ If your school offers a device for home use, decline to sign the waiver for it and/or pay the fee.

What happens if you don’t sign the waiver for middle and high school?  BCPS needs to make that clear.  We also have elementary students using a 1:1 (that means their own) device at school in first grade!   Many parents are totally unaware how much time students are spending with it, or what they are doing.  Turns out, BCPS leadership doesn’t know how much time students are spending on it either (at approximately 1:00, we hear that there’s “very limited research” on safe screentime in an educational context)!

~ Refuse to allow your child’s behavioral or social-emotional data to be entered into third-party applications. (e.g. Class Dojo)

Ask questions about all the third-party applications being used in BCPS.  Class Dojo tracks behavior.  Check out whether Common Sense Media’s privacy evaluation team has rated the applications. Subscribe to the Parent Coalition for Student Privacy’s blog and check out their back-to-school advice.

~ Refuse in-class social networking programs (e.g. EdModo).

We’re curious about how this is being used in BCPS classrooms and what other social networking software is used.  In general, parents should be very cautious about introducing social media to children – BCPS’ own advice for parents says so.  Parents should have a say about when and how their children are introduced to social networking for school.

~ Set a screentime maximum per day/per week for your child.

Research has shown that when children spend more than a half-hour per day on the computer, learning outcomes are worse.  The evaluation of STAT thus far has NO data on learning outcomes.  Read the JHU STAT reports here.  Ask for homework alternatives that do not require use of a computer.  Ask for textbooks so that reading can be done without more time on the computer.

~ Opt young children out of in-school screentime altogether and request paper and pencil assignments and reading from print books (not e-books).

Parents Across America (PAA), a grassroots, non-partisan organization, has a number of useful links.  Here are some questions to ask your school.

~ Begin educating parents about the difference between “personalized learning” modules that rely on mining PII (personally-identifiable information) to function properly and technology that empowers children to create and share their own content.

DreamBox and iReady, so-called “personalized learning” software, are being used in BCPS.  Neither empowers children to create their own content.  See this link on iReady, and this one; this link concerns DreamBox.  Look in BCPS One.  Ask your kids.  Ask your teachers and principals.  What else are they using?  Log in at home with your child if you can and check it out; if you don’t have access to a computer at home, ask your school to show you the programs in action.  You have a right to know what your child is doing at school.

~ Insist that school budgets prioritize human instruction and that hybrid/blended learning not be used as a backdoor way to increase class size or push online classes.

The County Auditor’s 2015 report notes that class sizes have increased with the implementation of STAT.  STAT teachers used to be classroom teachers; they no longer are, focusing instead on professional development.  Hybrid and blended learning have a host of definitions, but here are some examples of how they are playing out so far for kids as young as first grade in BCPS.

http://lighthouse.bcps.org/reflections/february-26th-2016

http://lighthouse.bcps.org/reflections/flipped-learning-to-differentiate

As Dr. Dance says:

“Most of the nation’s classrooms have about 30 students in them. How can a teacher personalize and customize unless you leverage technology?  In BCPS we have a five-year journey to go 1:1 in grades K-12 to where every single kid has a device.”

But wait.  Respected education policy center NEPC at the University of Colorado says:

“Smaller classes are particularly effective at raising achievement levels of low-income and minority children.”

STAT: Year Two Mid-Year Evaluation Report

“There are three kinds of lies: lies, damn lies, and statistics.”

–Mark Twain

The Year Two Mid-Year Evaluation Report on the Baltimore County Public Schools STAT (Students and Teachers Accessing Tomorrow) initiative was recently released by the Johns Hopkins School of Education’s Center for Research and Reform in Education, and the resulting 69-page document would not disappoint Mr. Twain.

The report, as one might expect from JHU, is clearly written and reasonably thorough, with data parsed, presented, and charted as needed. The report opens by explaining that its purpose is to evaluate “implementations and outcomes” of the STAT program, “…relating to the goals of improving student achievement and preparing globally competitive students” (page 3). However, the very next paragraph clarifies that no, the report “does not examine the achievement of student outcome goals” (3) but rather presents information on the level of professional development offered and a host of “measureable outcomes” (3) from classroom observations.  The report offers nothing about pedagogical effectiveness, the thing that actually improves student achievement, and it leaves many larger questions unanswered, providing a scrim of meaningless data to stand in as proof of effectiveness for a pedagogically dubious program.

The information on professional development in the STAT program was gathered through teacher surveys, and the results are obvious: there have been additional and broader professional development opportunities provided to teachers in Lighthouse Schools, and a majority of teachers have taken advantage of them. It would be foolhardy to roll out a multi-million-dollar initiative like STAT without some kind of training, and the report finds that yes, there has been training offered, in large, small, and one-on-one settings. However, the stickier questions are not even asked: What kind of professional development was completed? How effective was it for classroom practice? What were the goals? How were they met?

In the section on measurable outcomes, the results of several classroom observations provided data on classrooms, teacher practice, digital content, student engagement, and P21 skills. Now, just because something can be measured does not make it a valuable metric. Take this example from the classroom environment findings: a “majority of classrooms observed in fall 2015 were physically arranged to support collaborative learning, displayed materials to support independent thinking to some extent, and had materials referencing the general subject or content area being taught” (25).  What is described here is basically a standard classroom; this is expected practice in K-12 environments, as no caring teacher anywhere ever left a drab room of blank walls when working with children.  This so-called “measurable outcome” tells nothing about STAT effectiveness; it’s a good bet that a majority of classrooms were that way before the program even existed. What was interesting in this section, however, was the finding that “students may be less likely to move around the room…considering the availability of information and resources accessed through devices” (25). This is certainly not a positive finding, though proponents of the “just ask Siri” school of research might disagree. What is implied here is that students do not move around much, as they supposedly can get what they need from the screen in front of them. This is not school, this is training for dystopia.

In examining teacher practice, the report found that “nearly all classroom teachers exhibited coaching behavior with students at least occasionally” (28). This is also a measurement of little meaning, as nearly all teachers who work with students in general spend some amount of time in coaching behaviors, teaching behaviors, and other required classroom roles.  Maybe a few might hide behind their desk all day, or perhaps even under it, but these metrics were not included.

Perhaps the most useless metric in the entire report is the one involving digital content. The information was provided by Engrade, the McGraw-Hill property that created the software platform on which BCPS One sits; it is clear they have been logging a copious amount of student data, as they regurgitated some of it for the report to state the obvious: teachers and students in Lighthouse Schools are accessing digital content more frequently. The creation of teacher tiles (program links) for BCPS One increased; teachers in Lighthouse Schools are almost certainly required to be using the platform, so it is little surprise that they have been.  What is surprising is the equating of “student engagement” with “increased student tile views within BCPS one” (47).  Essentially, there have been more teacher and student clicks (of a mouse or browsing button), which tells absolutely nothing about the quality of material that is being clicked upon. Maybe it’s whack-a-mole. Maybe it’s spam. But hey, there’s a lot of clicking going on, so it must be good.

Measuring teacher and student clicks and passing it off as a useful metric is absurd. Clicks tell nothing about quality of materials used or quality of learning outcomes; this is a prime example of being data rich yet content and context poor.

The final section of the evaluation examined the use of P21 skills, which include “problem solving, project-based approaches to instruction, inquiry-based approaches to instruction, and learning that incorporates authentic/real world contexts.” It is important to stop for a second here and note that these ideas do not need to be branded with the Partnership for 21st Century Learning “P21” moniker. These ideas are not new to the 21st century; they stretch back to the truly innovative theories of John Dewey and genuine progressive educational thought (which should not be confused with modern “progressive” education that advocates high-stakes testing and computer-driven personalized learning). The Partnership for 21st Century Learning, a lobbying group for educational technology business interests, has glommed onto these ideas in the hope that they will lend some credibility to the organization. They don’t.

It is interesting to note, however, that the STAT report found that “P21 skills were least frequently observed overall” (42) out of all the metrics examined; perhaps the classroom focus has relied too much on technology and devices, crowding out more pedagogically effective methods such as student collaboration, problem-based learning, and other more engaging practices.

It is also worth noting that for an APA-style document, the year two midyear STAT report does not present a single reference or citation. Perhaps this is by design or request, or perhaps it is because the ideas that underpin the STAT initiative have a poor or nonexistent research base. The report presented a whopping three sentences of recommendations for improvement of the program, to include a focus on professional development “specific to desired teaching and learning activities that are less frequently practiced” (48) and a clarification of the role of the STAT teacher.

This issue of clarification was raised by a section of the report that noted a theme, in some teacher survey responses, of distrust of the STAT teacher. A few survey responses were quoted, including: “the STAT teacher at our school has become evaluative and administrative in nature. It’s very clear that things shared/things seen in classrooms are shared with administration and hold weight”; “many teachers are concerned as to whether STAT teachers are going back to administrators and telling them about problems in the classroom. Are they judges or mentors?”; “she does not keep confidentiality about what we are working on…I am NOT going to ask for help because it is reported to the principal and spoken about later as a weakness” (22). These comments speak volumes about what is left unsaid by the STAT report: the BCPS administration does not operate the program on a principle of support but rather on one of threat and expected compliance.

Baltimore County Public Schools STAT Evaluation Summary on the BCPS site; you can see the links to the reports at the bottom of the page.

As requested by a reader in the comments, you may also be interested in the complicated relationship between those who wrote the STAT Evaluation Summary and those who pay these same evaluators, otherwise known as a conflict of interest:

Johns Hopkins University: Certification for Sale

More on JHU Researchers and Services; JHU/BCPS and JHU/EIA Connections

Are BCPS Students iReady for English-Language Computer Learning?

A retired teacher, Anne Groth, recently took a look at iReady, which is reportedly up for consideration for roll-out in elementary and middle schools next year, pending approval by the Board of Education.  https://teachingafter60.wordpress.com/2016/04/26/are-you-ready-for-iready/#comment-26

She notes that the website claims: “Research proves that i-Ready can deliver transformational results for all students.”  Read about it here.

WOW.  That’s quite a claim. 

Here’s the “independent” evaluation of the program, conducted by a for-profit company and posted on the iReady website.  The report was not peer reviewed.  http://www.casamples.com/downloads/ReadyNYEfficacyStudy512.pdf

The problems with this “research” study are many.  First, any research study worth reading includes a limitations section; that is where independent researchers make clear to the reader that they recognize the limitations of their research, what those limitations mean for its conclusions and implications, and how future research might address them.

In this report, there is no limitations section anywhere.   Instead, these researchers claim “In summary, the study demonstrated unequivocally (emphasis ours) that the use of the Ready program resulted in statistically higher performance on the New York State Tests.”

These researchers don’t understand confounding, or causality.  You cannot compare schools that chose to use a new program with schools that did not, and claim anything about causality “unequivocally.”  These folks would be sent back to Biostatistics and Epidemiology 101.

For an explanation of confounding variables, see: http://www.psychologyinaction.org/2011/10/30/what-is-a-confounding-variable/

This is more marketing than research.
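For readers who want to see the confounding problem in action, here is a minimal sketch in Python. Every number in it is hypothetical, chosen only for illustration: the simulated program has exactly zero effect on scores, yet a naive adopters-versus-non-adopters comparison still shows a sizable “gain,” because one hidden factor drives both adoption and achievement.

```python
import random

random.seed(0)

# Hypothetical simulation (illustration only): the "program" has a
# true effect of exactly zero. A confounder (school resources)
# drives BOTH who adopts the program AND how students score.
adopter_scores, non_adopter_scores = [], []
for _ in range(10_000):
    resources = random.gauss(0, 1)                    # hidden confounder
    adopts = resources + random.gauss(0, 1) > 0       # better-resourced schools opt in more
    score = 50 + 10 * resources + random.gauss(0, 5)  # adoption itself adds nothing
    (adopter_scores if adopts else non_adopter_scores).append(score)

def mean(xs):
    return sum(xs) / len(xs)

print(f"Adopters:     {mean(adopter_scores):.1f}")
print(f"Non-adopters: {mean(non_adopter_scores):.1f}")
# Adopters come out roughly 11 points ahead even though the program
# did nothing: self-selection plus a confounder manufactured the gap.
```

This is exactly why a limitations section, and a design that controls for selection (randomization, or at least matched comparisons), must come before any claim of “unequivocal” effects.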

Here’s what another blogger explained about educational research (see the recent DreamBox post).

“An interesting US Department of Education resource is the What Works Clearinghouse website (http://ies.ed.gov/ncee/wwc/) established in the early 2000s as a repository for valid research studies on effective educational practices. The site is intended as a “resource for informed decision making” and “identifies studies that provide credible and reliable evidence of the effectiveness of a given practice, program, or policy.” In a time when the word rigor is thrown around by school administrators and edtech companies alike, it is safe to say that the WWC’s standards for vetting research studies are indeed rigorous. There is a “fact check” section to counter the idea that “the WWC never finds evidence of positive effects” in their research reviews. They do…they just only consider research that involves “high-quality evidence” as determined by their very high standards for research design; once a research study is accepted, they find about 70% of them demonstrate some positive effect. The WWC looks at three dimensions of a study to determine validity: methodology employed, data collected, and statistics used. Many studies are deemed ‘not valid’ due to research design issues, narrow interpretations of data, or other flaws.”

There’s nothing in the WWC on iReady, or Curriculum Associates. 

These reviews of iReady are worth a look, written largely by digitally savvy students who don’t seem too enthusiastic.  http://curriculum-associates.pissedconsumer.com/i-ready-not-20150930708513.html

One more thing: this program is another one that children will do on computers, with headphones, as a solitary activity. While actually using the program, a student has no interaction with the teacher or other students. There’s nothing collaborative about this.  This isn’t building a mechanical flower for Earth Day.

iReady means time spent on a computer, by yourself, with headphones on, learning, and being tested, via animated characters. As Anne Groth points out, learning to write is inherently collaborative: “The communication between teacher and student, among students, and at home when writing is shared with parents is just something a computer program cannot offer.”  

How much time will students be asked to spend on iReady per week in first grade?  In fourth?  In sixth?

If it is a lot of time, that’s worrisome, and it makes me wonder: what educational practice is being cut back to find the time?

If it isn’t much time, is it worth the cost per student?  We would like to know: what do our teachers really want in the classroom?

More on JHU Researchers and Services; JHU/BCPS and JHU/EIA Connections

This is a follow-up to the other JHU/Education Industry Association blog post from April 10.  The point of both posts?  JHU is being paid to evaluate the efficacy of STAT while also representing ed-tech vendors, rather than looking out for the interests of BCPS students.  The evaluation is neither rigorous nor independent.

In fact, in reading Dr. Morrison’s full CV in the earlier post, one sees the connection to the DreamBox Learning program being used by BCPS to teach elementary school math:

Co-Principal Investigator (2015 – 2016). Efficacy Study of DreamBox Learning Math. DreamBox. Ross, S. M., Principal Investigator.

Morrison, J. R., Ross, S. M., Reilly, J. M., & Cheung, A. C. K. (2016). Retrospective Study of DreamBox Learning Math. Report to DreamBox.

National blogger Peter Greene of Curmudgucation writes about EIA and JHU working together to better market ed-tech products:  Naked Education Profiteering

This JHU Press Release of March 2012 announces the JHU-EIA Partnership:

“The Johns Hopkins University School of Education and the Education Industry Association today announced a partnership building on their individual strengths in educational instruction and reform.”

“Together, the School of Education and EIA, a trade association representing private providers of education services, will create a center for education innovation and entrepreneurship; facilitate relationships between EIA member companies and the School of Education; integrate for-profit programs, products and concepts more deeply into the education sector; and create joint research and education programs.”

“We strongly believe that our school must develop new programs and partnerships with all components of the education sector in order to achieve our vision of realigning our profession and advancing education reform nationwide,” said David W. Andrews, dean of the School of Education. “Forming this strategic partnership with EIA will help the for-profit and not-for-profit education sectors learn from each other, and better enable us to work together for the betterment of all aspects of education.”

JHU School of Education, Center for Research and Reform in Education’s (CRRE) Dr. Steven Ross (the main STAT evaluator) wrote this article for EIA:  Demonstrating Product Effectiveness:  Is Rigorous Evidence Worth the Rigors?  Here are some highlights from Dr. Ross’ article:

“Because providers strongly believe in what they do, most feel confident that a rigorous evaluation study would present their products in a positive light. The challenge is how to commission and fund such studies. Is striving for the ostensible gold standard – a “randomized controlled trial” (RCT) with a large number of schools, student-level test scores, and all the other trimmings – really needed? Such studies are usually quite expensive (think six figures!) to fund. Trying to obtain a federal grant (e.g., “Investing in Innovation” or “i3”) can involve extensive proposal preparations, with steep odds of being selected, and even for the lucky winners, a long wait until the results can be released.”

“My recommendation is to pursue such opportunities where the fit is good and the chances for competing solidly seem strong. But keep in mind that gold-standard studies may actually be “fool’s gold” for many providers. Unless a product is fully developed and delivered in high dosage to students (not as a learning supplement or a support for teachers), it’s quite difficult to show measureable student gains given all the noise (confounding) of so many other classroom, student, and teacher variables. And, as promised above, it seems instructive to take heed of what the district stakeholders said about rigorous evidence in interviews: They rarely read research journals or check out (or even know about) the What Works Clearinghouse (WWC) for research reviews. However, they very much value that a credible third-party evaluator conducted a systematic study of the product being sold. They value evidence of student achievement gains, but with the caveat that the study conditions and the schools involved may be quite different from their own.”

“In our evaluation work with providers, we try to fit the study to the application goals and maturity of the particular product … All of these studies offer the providers potentially useful formative evaluation feedback for program improvement as well as findings from a reasonably rigorous independent study to support and differentiate their products.”

JHU CRRE’s Dr. Ross and Dr. Morrison presented the STAT year-end report at the 7/14/15 BOE meeting (minutes 2:06 to 2:38). Here are the report and evaluation from the BCPS website:

STAT Year-End Evaluation (2014-15)

STAT Year-End Report (2014-15)

Video Highlights from 7/14/15 meeting:

Dr. Ross: “Over time, year two, year three … if things work as they should, you’re gonna be seeing significant improvement in students’ mastery of P21 skills … years 3, years 4 there should be increases in MAP, increases in PARCC …”

BOE Commentary at meeting (paraphrased): the data presented by JHU was a “little lethargic” and, considering the investment in personnel, training, and curriculum based around the digital devices, the BOE expected “to see Dr. Morrison’s bar charts move in the right direction.”

As reported by EdSurge on April 7, 2016, the assets of EIA are being taken over by the Education Technology Industry Network (ETIN), the education division of the Software & Information Industry Association (SIIA).  The article talks of the above-noted JHU-EIA partnership created in 2012.

“Other assets that Billings’ team will inherit from EIA include a partnership with John Hopkins University to support a “joint center for education innovation and entrepreneurship.” EIA has also worked with Digital Promise to publish reports on barriers to technology procurement in K-12 districts.”

The above-mentioned EIA-Digital Promise partnership includes JHU, which wrote a study for them, Fostering Market Efficiency in K-12 Ed-tech Procurement. A key finding is that there “are no readily accessible sources of “rigorous” evidence on the effectiveness of the vast majority of ed-tech products. As a result, school districts largely depend on recommendations from peers and from their own teachers and principals who have familiarity with the products.”

Johns Hopkins University: Certification for Sale

NOTE: This information was found by way of a comment left by Dr. Laura H. Chapman, an educator and education researcher, on the STAT-us BCPS post of March 22.  You can see the comment at the bottom of that post.

DID YOU KNOW that an ed-tech vendor can pay the Johns Hopkins School of Education to certify the efficacy of a product or service?

The Education Industry Association (EIA), which has the strategic goal to “support the role of the private sector in public education” and works to expand business opportunities for education entrepreneurs in PreK-12 markets, has partnered with JHU to offer certifications.

EIA notes that the “vibrant” PreK-12 education industry is “poised for explosive growth … in fact, education is rapidly becoming a $1 trillion industry, second in size only to the healthcare industry, and represents 10 percent of America’s GNP. Federal, state and local expenditures on education exceed $750 billion.”

THIS IS THE EIA HOMEPAGE:

NEW MARKETING STRATEGY!

EIA members can now certify
their services through
Johns Hopkins University!


Johns Hopkins University School of Education is now offering Program Design Reviews for EIA Members!
Dear EIA Members and Potential Members:

“Strong entrepreneurial education companies are constantly seeking new ways to market and promote their products and services. Proving the efficacy of your product or service is the single best way to attract new customers, making the “procurement process” much simpler.”

“EIA is now offering an amazing opportunity for its current members and for those wishing to join the Association. Beginning immediately, for a very small investment, EIA members can utilize the services of the Johns Hopkins School of Education (JHU). The team at JHU is offering program design reviews at an extremely discounted rate exclusively for EIA members. There are multiple levels of review your company can participate in, based on your budget and desired review level.”

“I know this might seem a bit intimidating, but, trust me; it is well worth your time and investment. Can you imagine walking into a Superintendent’s office armed with a positive outcome report by none other than the Johns Hopkins School of Education?! Do you think your competitors will have this feather in their cap? The answer is a resounding NO!”

“Picture your new marketing campaign that features your positive outcome with the Johns Hopkins School of Education! And most importantly, imagine what you will learn about your own product or service and the best ways to continually improve in order to produce the best educational outcomes for your students. You actually owe it to yourself, to your investors, and to your students to participate in this incredible opportunity to bring further legitimacy to your company.”

“As you work with the team at JHU, you will choose one of five levels of review: an Instructional Design Review, a Short-Cycle Evaluation Study, a Case Study, an Efficacy Study, or an Effectiveness Study. Choose the level you’re comfortable with; for even a small investment of a few thousand dollars, you can have the Johns Hopkins seal of approval attached to your company.”

“Instructional Design Review: This is the perfect package for many EIA companies. After successfully completing the review process, your company will be issued a Johns Hopkins University Certificate for Completion of a Successful Design Review. Again, imagine having that ammunition during your next district meeting! Using rubric assessments aligned with instructional design standards and best practices, your products and programs will be reviewed in domains that include the logic of your model, its theoretical framework, your use of evidenced-based strategies, customer analyses, instructional objectives, pedagogy, and delivery/user support. $3,500-$5,000”

“THE FOLLOWING OPTIONS ARE ALSO AVAILABLE FOR LARGER, MORE ESTABLISHED EIA COMPANIES.

Short-Cycle Evaluation Study: These are quick-turnaround “pilots” of products (typically ed-tech based), which use observations, surveys, and interviews with teachers and students in a 10 to 15 week period to determine the potential effectiveness of a product for broader adoption in a school district or group of schools. Educational improvements, adapted to different types of learners, are directly informed by results. This represents a more significant investment, and is geared toward the medium to larger size company within EIA. $10,000-$13,000”

“Case Study: These are small mixed-methods descriptive studies, which are more intensive and rigorous than short-cycle studies. Similar to the latter, they employ observations, interviews, and surveys that focus on educational curricula, programs, and services and how they are received and used by target consumers (e.g., teachers, students, parents, etc.). $15,000-$20,000”

“Efficacy Study: This is a medium-scale study that focuses on how programs and educational offerings operate and affect educational outcomes in try-outs in pilot schools or small treatment-control group comparisons. $20,000-$35,000”

“Effectiveness Study: This is a larger-scale “summative evaluation” study that focuses on the success of the program in improving outcomes in rigorous non-randomized (“quasi”) experimental studies or randomized controlled trials. $38,000-up.”

“Again, the first offering – the Instructional Design Review – is the perfect fit for many EIA companies. To get started you only need to do two things: be an EIA member at any level of membership (and if you’re not a member, NOW is the time to join) and then contact me directly to put you in touch with the Johns Hopkins School of Education.”

“The Dean of the JHU School of Education, David Andrews, along with his colleagues, will also be in attendance at this summer’s EDVentures conference in Orlando, July 15 – 17. I encourage you to register for this amazing conference immediately before we are sold out; to do so, please click here to register. I look forward to your future success!”

Jim Giovannini
EIA Executive Director
703-938-2429
Jim@educationindustry.org

JHU’s Dr. Steven Ross and Dr. Jennifer Morrison are evaluating the STAT program here in Baltimore County Public Schools.  Ross and Morrison are also presenting at the July 2016 EIA Conference (Demonstrating Product Effectiveness Through Third-Party Evaluations).

You can see Morrison’s CV here, where it outlines her involvement in evaluating the STAT program*.  As the EIA website explains, for “even a small investment of a few thousand dollars, you can have the Johns Hopkins seal of approval.”

BCPS is paying $695,000 over 5 years for its STAT evaluation.

One of the members of the EIA Board of Directors is David Andrews, the Dean of the JHU School of Education, although according to January 2016 information from JHU, Andrews was to have left Hopkins on April 1, 2016 to lead National University, the second-largest private nonprofit university in California.

*Please click on Morrison’s “Show complete CV” to show the most recent report completed: Morrison, J. R., Ross, S. M., Cheung, A. C. K., Reid, A. J., & Dusablon, T. (2016) Students and Teachers Accessing Tomorrow: Year two mid-year evaluation report. Report to Baltimore County Public Schools.

When will we get to see the latest report results, BCPS?

Letter to County Council Regarding S.T.A.T.

Dear County Council Members,

I am writing on behalf of concerned BCPS parents regarding the newest rounds of BCPS policy involving STAT (especially the leasing of 1:1 devices, the amount of assessment and instructional time spent on devices, and data privacy).

Today, I was listening to a radio broadcast of the hearing involving Governor Rick Snyder from MI and the Flint MI water crisis. One thing is very clear: politicians chose to ignore the warnings of the community who knew something was wrong with their water, and these politicians put money over human health and well-being. While there was data proving problems with the new water source existed, the data were ignored. Meanwhile, decisions to switch water sources were made with NO data proving that switching the water sources was a good thing.

The flood of technology-driven policies being launched in Baltimore County schools is like lead-tainted water. BCPS is switching our water from one source to another (water being the parallel for learning). Certain parallels should be made clear to you:

  • We, the community, know there is something fundamentally wrong with the increased push toward technology-based instruction and assessments in lieu of human and collaborative interactions. Yet, our voices are being ignored.
  • There is no data to suggest that moving away from existing models of instruction and assessment toward (so-called) “personalized,” device-driven instruction is any better for children.
  • There is ample evidence suggesting that the switch toward more online providers for teaching and learning is driven by economics (saving money for the district and generating profits for the companies that lobbied for the policies), thus putting money over human health and well-being. The people directly involved with the education technology industry and its policies are quick to tell you that every child “needs” 21st century skills, that they “need” to be educated more and more via online methods. Yet they have NO evidence to show this is in fact “necessary.” So ask…WHY? It’s on YOU, the BCPS policy makers, to pause and ask yourselves this question.

Because here’s what we DO know. Online device-driven instruction leads to:

  • Increased risk of obesity from increased seat time.
  • Reduction of opportunities to engage with multiple learning styles: kinesthetic, social, verbal, environmental…all reduced to visual screen time.
  • Loss of socialization and development of social cuing.

“You can’t learn nonverbal emotional cues from a screen in the way you can learn it from face-to-face communication,” said Yalda Uhls, a senior researcher with UCLA’s Children’s Digital Media Center, in a news release. “If you’re not practicing face-to-face communication, you could be losing important social skills.”

http://www.ctvnews.ca/mobile/health/excessive-screen-time-may-hurt-a-child-s-ability-to-understand-emotions-study-1.1972211

Kids are spending more time than ever in front of screens, and it may be inhibiting their ability to recognize emotions, according to new research out of the University of California, Los Angeles.

http://www.npr.org/sections/ed/2014/08/28/343735856/kids-and-screen-time-what-does-the-research-say

  • Damage to eyes, hands/wrists, and neck.

“Children can develop pain in their fingers and wrists, narrowed blood vessels in their eyes (the long-term consequences of which are unknown), and neck and back pain from being slumped over their phones, tablets and computers.” http://mobile.nytimes.com/blogs/well/2015/07/06/screen-addiction-is-taking-a-toll-on-children/?referer=

  • Loss of data privacy: online platforms deliver to third-party organizations a record of every response and behavior your child makes in the learning process, every bit of it tracked, monitored, and managed. My child should not be an unwilling consumer forced to share private information simply because a private company (like Pearson or KIPP) has been made an LEA.
  • Increases ADHD-like symptoms. “Children who are heavy users of electronics may become adept at multitasking, but they can lose the ability to focus on what is most important, a trait critical to the deep thought and problem solving needed for many jobs and other endeavors later in life.” http://mobile.nytimes.com/blogs/well/2015/07/06/screen-addiction-is-taking-a-toll-on-children/?referer=
  • An adrenaline-driven mentality toward learning (like addiction). “As a practitioner, I observe that many of the children I see suffer from sensory overload, lack of restorative sleep, and a hyper-aroused nervous system, regardless of diagnosis—what I call electronic screen syndrome. These children are impulsive, moody, and can’t pay attention… excessive screen-time appears to impair brain structure and function. Much of the damage occurs in the brain’s frontal lobe, which undergoes massive changes from puberty until the mid-twenties.”

https://www.psychologytoday.com/blog/mental-wealth/201402/gray-matters-too-much-screen-time-damages-the-brain

So please, as you decide whether to vote to spend more monies on technology (simply because it seems like the “in” thing or the “cool” thing to do because, well, “everybody’s doing it”), consider this: Years from now, after learning has been destroyed for a generation of our children because of the lack of thought you put into the decisions you are making for them today, you may find yourselves taking the stand, like Rick Snyder. We, the community, will demand from you an account of your ignorance and negligence in the face of the facts, concerns, and plain common sense we are presenting to you today. If we learn anything from history, it’s how not to repeat the same mistakes. Don’t destroy a generation of our children for the sake of politics and profits. Be better than that. Hit the pause button and learn the facts before making decisions that will lead to irreparable harm for our children and our public schools.

Morna McDermott McNulty

BCPS parent and Professor of Education, Towson University