A Parent and Engineer Speaks:
I’m a parent of a BCPS student. I’m an engineer and I also mentor a middle school robotics team, so I’m not averse to kids using computers or learning as much as they can about science and technology. In fact, when I first heard that our students would soon be getting a laptop to use in school, I thought it sounded like a great idea, especially considering how heavy my son’s backpack was beginning to get. At first glance, the STAT program looked like a good idea. But the closer I looked, the more concerned I grew that we in BCPS are not the recipients of some grand, proven technology but are instead being used as guinea pigs for the digital education industry.
When Dr. Dance announced the launch of the STAT program in the Spring of 2013, I wanted to learn as much as I could about the educational technology they planned to use and this new way of teaching they call “student-centered learning”. I wanted to make sure that I could assist not only my own child with his classwork but also help his robotics team utilize this new educational approach as much as possible. I was naturally curious about the body of basic research that led Dr. Dance to make such a bold, not to mention pricey, decision. Whenever faced with an unknown, it’s my practice to look at the fundamentals of something and try to work my way up.
To my dismay, however, I found very little research that actually covered this new approach. On the internet I could easily find hundreds of articles discussing how promising this kind of approach might be but I could find no solid positive examples that really satisfied me. While there were lots of opinions about how awesome something like STAT might be, I could find nothing that linked increased test scores with this kind of system-wide laptop-centered learning.
My concerns were heightened when I read the following article in Education Week:
“While there is much on-going research on new technologies and their effects on teaching and learning, there is little rigorous, large-scale data that makes for solid research, education experts say. The vast majority of the studies available are funded by the very companies and institutions that have created and promoted the technology, raising questions of the research’s validity and objectivity. In addition, the kinds of studies that produce meaningful data often take several years to complete—a timeline that lags far behind the fast pace of emerging and evolving technologies.”
Also, in the same Education Week article: “For example, it is difficult to pinpoint empirical data to support the case for mobile learning in schools—a trend that educators have been exploring for several years now—let alone data to support even newer technologies such as tablet computers like the iPad. The studies that do look at the effects of mobile technologies on learning are often based on small samples of students involved in short-term pilots, not the kind of large-scale, ongoing samples of students that educators and policymakers would like to see.”
Instead of success stories, I found examples of district-wide failures. For example, there is the Education Achievement Authority (EAA) in Detroit, a situation that became so bad for students that the Michigan ACLU stepped in to investigate. http://www.metrotimes.com/detroit/the-eaa-exposed-an-investigative-report/Content?oid=2249513
And in Los Angeles, there was the 1.3 billion dollar iPad fiasco: http://www.wired.com/2015/05/los-angeles-edtech/
Then I came across the National Education Technology Plan 2010 (NETP 2010), issued by the Department of Education in November 2010. This report seemed to be a call to the nation for exactly the kind of technology that the STAT program is promised to be.
From page 78 of the NETP 2010, we read: “What we do not have is an integrated system that can perform all these functions dynamically while optimizing engagement and learning for all learners. Such an integrated system is essential for implementing the individualized, differentiated, and personalized learning called for in this plan.”
From page 80: “…we have yet to see highly effective systems that can be brought to scale.”
The NETP 2010 report implies that at the start of 2011, experts in the Department of Education knew of no viable technology capable of doing what the STAT program now claims to achieve. And yet barely two years later, Dr. Dance announced the STAT program. If such a program didn’t exist in 2011, how could Dr. Dance have derived the conviction needed to drive such an experimental program into our school system so aggressively?
Perhaps the NETP 2010 provided Dr. Dance with all the grit necessary to plow forward, as it calls for a radical, high-risk/high-return approach to educational experimentation involving rapid cycles of trial and error. From NETP 2010, pages 76-77: “… recruit and bring together the best minds and organizations to collaborate on high-risk/high-gain education R&D projects. It should aim for radical, orders-of-magnitude improvements by envisioning the impact of innovations and then working backward to identify the fundamental breakthroughs required to make them possible… Through the funding of rapid and iterative cycles of design and trial implementation in educational settings, the national center can demonstrate the feasibility and early-stage potential of innovative tools, content, and pedagogies that leverage knowledge, information, and technology advances at the cutting edge.”
In his transition report of November 2012, Dr. Dance made no mention of STAT. In fact, words such as “digital,” “computer,” or “laptop” appear nowhere in his entry plan.
And yet STAT was announced in the Spring of 2013. The White House named Dr. Dance a Connected Educator Champion of Change soon afterwards, in 2013. Do Dr. Dance’s connections with the White House have anything to do with imposing STAT upon our county? How could the STAT program burst onto the Dance floor fully formed in such little time?
The NETP 2010 plan calls for swift action. From page ix: “The National Education Technology Plan 2010 (NETP) calls for revolutionary transformation rather than evolutionary tinkering.”
Even the name of STAT seems to have derived its inspiration from the NETP 2010 report. From page xv: “The Time To Act Is Now. The NETP accepts that we do not have the luxury of time: We must act now and commit to fine-tuning and midcourse corrections as we go.”
Also, from page 3: “Above all, we must accept that we do not have the luxury of time. We must act now and commit to fine-tuning and midcourse corrections as we go. We must learn from other kinds of enterprises that have used technology to improve outcomes and increase productivity.”
So my concern is that STAT is being imposed upon our school system by outside interests, specifically the Department of Education and the many computer hardware and software companies that stand to benefit by digitally transforming education in the United States. And I worry that these outside interests are in a gold rush fever to try out their latest technologies and experimental software packages and to get them to market before anyone else. What these corporate interests require, however, is a large, diverse group of guinea pigs on which to run their countless experiments, shake out the bugs in their software, and optimize their algorithms using human test subjects. And I fear BCPS is now handing over to these corporate interests exactly what they demand: 111,000 guinea pigs otherwise known as our students.
This type of guinea pig scenario is one of the main reasons the Michigan ACLU decided to investigate the digital revolution that took place in Detroit’s EAA. The digital reformation there devolved into a situation in which “…teachers and students were, over the course of two school years, used as whetstones to hone a badly flawed product being pitched as cutting-edge technology.” http://www.metrotimes.com/detroit/the-eaa-exposed-an-investigative- report/Content?oid=2249513
We only need to look at the March 2011 Department of Education’s “Winning the Education Future: The Role of ARPA-ED” to see where such initiatives are probably coming from and where they are likely headed.
While ARPA-ED has not been formally funded, it has been repeatedly called for in the proposed federal budget and, if nothing else, its mere proposal reveals the social philosophy of the people who are operating the Department of Education.
In “Winning the Education Future,” we read the following:
From page 7: “The education sector currently suffers from the lack of directed development. Directed development is a means to fund transformational or game-changing technology that the private sector alone cannot or will not support because of high risk, uncertain returns, or extended time horizons for completion. Federal support for public-private partnerships that are high-risk and high-return can play an important role in education, as it has in other areas.”
From page 8: “The National Education Technology Plan 2010 called for ‘revolutionary change through technology’ and noted the power of a DARPA-style approach to research. In September of 2010, the President’s Council of Advisors on Science and Technology explicitly called for the creation of an ARPA-ED to help technology ‘play a transformative role in education.’” ARPA-ED is aimed at developing the following:
From page 2: “Digital tutors as effective as personal tutors. Researchers have long aspired to develop educational software that is as effective as a personal tutor, one of the grand challenges in the President’s innovation strategy. …
“Courses that improve the more students use them. Internet companies like Netflix and Amazon have devoted significant resources to develop tools that analyze consumer data to identify patterns, tailor results to users’ preferences, and provide a more individualized experience. Researchers are exploring whether similar techniques can be applied to education. …
“Educational software as compelling as the best video game …. The insights from great game designers can and should be applied to develop rich and compelling learning environments for students.”
“Digital tutors” and “Courses that improve the more students use them” are examples of adaptive learning systems, artificial intelligence software that models the student’s mind as the student interacts with the program. Software companies need lots and lots of kids to interact with these algorithms so the software will “learn” how best to teach. While I can imagine that these sort of programs might someday be made effective, I can also easily imagine it might take months or years for that software to “converge” on effective teaching methods. And during those months and years, our kids will be suffering with the countless glitches and experimental dead ends typical of software products that are being tested by start-up companies.
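As an engineer, I can sketch in a few lines of toy Python what the core loop of such an adaptive “digital tutor” looks like. To be clear, this is not any vendor’s actual algorithm; it is a generic epsilon-greedy bandit sketch with made-up strategy names and success rates, meant only to illustrate why this kind of software needs thousands of student interactions before its estimates settle:

```python
import random

class ToyAdaptiveTutor:
    """Toy sketch of an adaptive-learning loop (NOT a real product's algorithm).

    The tutor tracks an estimated success rate for each teaching strategy,
    mostly exploits the strategy that has worked best so far, and still
    explores the others occasionally (epsilon-greedy).
    """

    def __init__(self, strategies, epsilon=0.1, seed=0):
        self.strategies = list(strategies)
        self.epsilon = epsilon
        self.rng = random.Random(seed)
        self.attempts = {s: 0 for s in self.strategies}
        self.successes = {s: 0 for s in self.strategies}

    def choose(self):
        # Explore occasionally; otherwise exploit the best current estimate.
        if self.rng.random() < self.epsilon:
            return self.rng.choice(self.strategies)
        return max(self.strategies,
                   key=lambda s: (self.successes[s] / self.attempts[s]
                                  if self.attempts[s] else 0.0))

    def record(self, strategy, succeeded):
        # Update the estimate from one student's observed outcome.
        self.attempts[strategy] += 1
        if succeeded:
            self.successes[strategy] += 1


# Simulated students with hidden "true" response rates the tutor must discover.
true_rates = {"video": 0.55, "worked_example": 0.75, "drill": 0.40}
tutor = ToyAdaptiveTutor(true_rates, seed=42)
rng = random.Random(1)
for _ in range(5000):  # thousands of interactions before estimates stabilize
    s = tutor.choose()
    tutor.record(s, rng.random() < true_rates[s])

best = max(tutor.strategies, key=lambda s: tutor.attempts[s])
print(best)  # the strategy the tutor settled on most often
```

Even in this toy version, the tutor must burn thousands of trials, many of them deliberate “exploration” on the weaker strategies, before it converges; every one of those exploratory trials is a real child receiving a suboptimal lesson.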
What’s most distressing about this scenario is that the insertion of these types of digital tutors into our school system might require outstanding teachers to stand off to the side (as “guide on the side”) and not interact with the students at all. While I’m sure many excellent teachers feel they will simply “ride out” the STAT initiative and teach students the proven way they have always taught them, the introduction of these digital tutors would require the teacher to do next to nothing as the students remain plugged in and interacting solely with the software. If teachers were to interact with the students, then the digital tutor algorithms would not be able to properly model the student’s mind and, in effect, the teacher would introduce a variable that the software would have a hard time predicting.
Students taking ownership of their learning might just be a clever way of saying that the laptops will take over their teaching. With teachers relegated to “guides on the sides” rather than “sages on the stage”, the software stands alone as the student’s single source of instruction. Therefore the algorithm is free to do its work without outside interference from teachers, parents, or even other students.
It’s possible we already see this kind of situation with a math tutoring program called Dreambox. At Vincent Farms Elementary School, for example, parents were instructed to NOT help their children with their math homework but were, instead, instructed to have the child click on the Dreambox question button. Such an instruction to parents to provide zero assistance to their child is consistent with the kind of digital tutor system called for in the ARPA-ED proposal.
See the link labeled in tiny letters “Dreambox Parent Presentation” here: https://vincentfarmes.bcps.org/for_parents/dream_box_information
Dreambox has been around for years so I’m not saying that Dreambox is bad for kids but it is an example of the kind of technology our kids will be exposed to, and perhaps, unlike Dreambox, such software might show up in BCPS classrooms in its early infancy, to be tested on our kids.
Meanwhile we find ourselves in the middle of a student testing upheaval that makes “before” and “after” performance comparisons nearly impossible. And Dr. Dance has decided to not even look at test data until year 3 of his STAT program. “Regarding any assessment data points, our S.T.A.T. Evaluation logic model clearly states that quantitative measures like MAP and PARCC will not be used in an evaluative manner until Year 3.” https://www.bcps.org/academics/stat/STAT-Eval_FAQ.pdf
And yet, in the absence of any performance data, Dr. Dance has been willing to take his STAT program on the road and advertise it to the world as an unqualified success. Rarely a month goes by without Dr. Dance or one of his staff receiving some kind of award for digital innovation. The incoming stream of congratulations is seemingly never ending. Of course, a closer look at who is handing out the awards often exposes the fact that these “non-profits” are sponsored by corporations that will benefit from sales of educational hardware and software. It is this sort of tangled web of relationships that calls into question the validity and objectivity of all that is happening here in BCPS.
Until we have actual test data, we parents are asked to accept the observational data of an “independent” study of STAT being conducted by the Center for Research and Reform in Education (CRRE) at Johns Hopkins University. These data are highly qualitative and depend very heavily on brief classroom observations, which can be woefully subjective.
The person in charge of this “independent” evaluation of STAT just so happens to be a big fan of the ARPA-ED approach to reforming education: Dr. Robert E. Slavin, Director of the Center for Research and Reform in Education at Johns Hopkins University.
Dr. Robert E. Slavin had this to say about ARPA-ED: “ARPA-ED projects would be risky. Many would fail to come to fruition, or would be found in later evaluations to be ineffective. However, this is the nature of innovation, and if we want to find giant leaps forward, we also have to be ready for a few pratfalls, too.” http://www.huffingtonpost.com/robert-e-slavin/arpa-education_b_2957138.html
And in another article, Dr. Slavin had this to say: “Many groups might try out prototypes and many, perhaps most, might fail. But if just one or just a few programs succeeded in making the world’s most effective Algebra I course, the impact would be dramatic.” http://www.huffingtonpost.com/robert-e-slavin/education-innovation_b_855260.html
Dr. Slavin, again: “In education, ARPA-ED would emulate the structure of DARPA in trying to provide rapid, flexible support for experimentation and innovation, especially applications of cutting-edge technology to enduring educational problems. Like DARPA, ARPA-ED could seek to entice non-traditional bidders to apply. These might include technology companies, entertainment companies, or others willing and able to create and take to scale exciting and innovative applications. Think of Microsoft, Apple, or Disney creating algebra programs, science programs, or beginning reading programs using new or established technologies in new ways.” http://www.huffingtonpost.com/robert-e-slavin/arpa-education_b_2957138.html
Dr. Steven M. Ross, who works at the CRRE and is the principal investigator for the STAT evaluation, made it clear while presenting about STAT before the BCPS Board of Education in November 2014 that the CRRE researchers perceive our school system to be a giant laboratory.
“We’re ecstatic to be part of it, too. We couldn’t ask for a better laboratory. A real laboratory doing very important work.”
Video time approximately = 01:34:00 http://original.livestream.com/bcpslivetv/video?clipId=pla_af935b8e-d426-4584-892c-3e4df3038d6f
It concerns me when I hear people speak of our school system as a laboratory, and when people who are close to the independent evaluation of something like STAT see nothing unethical about running “risky” experiments on our children, experiments that might result in “pratfalls” whose failures are merely shrugged off as “the nature of innovation.”
Frankly, I’m uneasy about applying a DARPA-like approach to educational reforms. No doubt DARPA has cranked out some amazing technologies since it was founded in 1958, but it has also cranked out far, far more failures. There is a great deal of accepted risk associated with anything DARPA undertakes – that’s their fundamental philosophy, high risk/potentially high reward. But I don’t think it’s an acceptable philosophy to apply to our children. It’s the test pilot mentality – you salute them for their bravery while quietly questioning their sanity. But at least test pilots are aware of what they have signed up for – our county didn’t sign up for test pilot duty and neither did our kids.
Of course, some members of the BCPS Board of Education don’t seem bothered by the lack of objective data in evaluating STAT. Board Chairman Uhlfelder had this to say on 3 February 2015, when decisions were being made concerning the expansion of the STAT program: “I don’t have to wait for a study. I can’t imagine the study is not going to be anything but positive.”
At time approximately = 1:36:20. http://original.livestream.com/bcpslivetv/video?clipId=pla_ae4e8aa9-3004-42c3-8d3c-e1e93e6a1952&utm_source=lslibrary&utm_medium=ui-thumb
At about the same time the Department of Education was pushing for ARPA-ED, it was also unveiling another program called Digital Promise. In September 2011, Education Secretary Arne Duncan announced the initiative: “The center will receive start-up funding from the U.S. Department of Education as well as the Carnegie Corporation of New York and the William and Flora Hewlett Foundation, and will be overseen by a board of ed-tech leaders selected based on Congressional recommendations.”
Digital Promise has presented Dr. Dance with a 2014 Digital Innovation in Learning Award (“Walk the Walk”) and the 2015 Open Door Policy Award, which “shares what’s working and what’s not with other schools.” http://dilas.org/winners/walk-the-walk/ https://www.bcps.org/news/articles/article8050.html
On its profile page for Dr. Dance, Digital Promise applauds him because he “Redesigned Chesapeake High School as a national leader in STEM education, using virtual simulations and gaming to increase student engagement and attendance.”
According to Digital Promise, “Chesapeake High School is the district’s launch point for its Learning in Virtual Environments (LiVE) project, and the school launched one of the nation’s first Virtual Learning Environments (VLE) … In lieu of textbooks and lectures, the VLE uses simulation and gaming to teach rigorous standards and allows students to actively pursue their own education. … BCPS is looking to scale some of the successes at Chesapeake High School to 25 more high schools, including the development of a Virtual High School.”
Despite these awards, however, the historically low and declining SAT scores at Chesapeake High School perhaps paint an unflattering picture: http://reportcard.msde.maryland.gov/college_readiness/SAT/2015_SAT_031574.pdf
Unfortunately, this governmental applause for game-ifying education is not limited to the Digital Promise organization. In its most recent National Education Technology Plan (NETP 2016), the Department of Education called for the gaming industry to help solve the nation’s education problems. In the 100-page report, NETP 2016 mentions the word “game” at least 77 times. http://tech.ed.gov/files/2015/12/NETP16.pdf
But where is the evidence that game-ifying education can actually help? Or is this just a ploy by the computer gaming industry to get its share of America’s $650 billion education budget?
Another questionable aspect of the STAT program is its practice of so-called “personalized learning”, an educational technique that promises to move the teacher away from being “sage on the stage” to being merely the “guide on the side.” We are told this practice allows the students to “access and create content that best meets their needs.” https://www.bcps.org/academics/stat/STAT-Eval_FAQ.pdf
BCPS points to a RAND study published in November 2015 as evidence for the efficacy of “personalized learning”.
(See: the “Continued Progress” download at http://www.rand.org/pubs/research_reports/RR1365.html)
First of all, this study was published only recently, so it could not have been part of Dance’s original decision to launch a program like STAT. More importantly, this RAND study was not performed with “regular,” randomly selected schools: it was performed with schools that were funded by the Bill and Melinda Gates Foundation. And some of the schools were completely new.
From page 3 of the RAND study: “All of the schools received funding from the Gates Foundation, either directly or through intermediary organizations, to implement personalized learning practices as part of at least one of the following three foundation-supported initiatives: Next Generation Learning Challenges (NGLC), Charter School Growth Fund’s Next Generation School Investments, and the Gates Foundation’s Personalized Learning Pilots.”
The “Methods and Limitations” section of the RAND report enlightens us as to why the results of this report are probably not applicable to a school system like BCPS.
From page 6: “Despite the increased interest in personalized learning, the field lacks evidence about its effectiveness. This study is designed to address this need using the most rigorous method that can be applied to the foundation-funded set of schools.”
Also from page 6: “In particular, given the implementation design for the portfolio of personalized learning schools in the study, it was not possible to create randomly assigned treatment and control groups; nor did we have access to data from neighboring schools that might have matched the personalized learning schools.”
And again from page 6: “As new schools, they lack a history of data from before they began implementing personalized learning, which would have enabled other analytic methods for determining achievement effects.”
Page 14 reveals that these schools were not operated like normal schools: “Most schools had extended school days or school years, and the extra time was used primarily for additional instruction or to provide individualized support.” Also, nowhere in the report do they discuss the possible effect of class size on achievement; where is the class-size data? Oddly, these funded schools also spent more time taking their achievement tests, as noted on page 41.
There were a handful of funded district schools involved in the study, but page 13 tells us the effects of personalized learning on those district type schools were not very impressive. “Although two of the district schools produced significant positive results, this was offset by negative results in three other district schools…”
So maybe the only line in this entire RAND study that is actually relevant to BCPS is that one little fact we already read about in its “Methods and Limitations” section located on page 6: “Despite the increased interest in personalized learning, the field lacks evidence about its effectiveness.”
Maybe Dr. Steven M. Ross, the STAT evaluator from the Johns Hopkins CRRE, summarized it best when he said, “Student-centered learning is very hard to do on your own and we failed for 30 or 40 years to do that.” See the video from the 14 July 2015 BCPS Board of Education meeting, time about = 02:36:00 http://original.livestream.com/bcpslivetv/video?clipId=pla_51cf90c7-2189-4803-af77-28903c16e952&utm_source=lslibrary&utm_medium=ui-thumb
On a totally different note, I’d like to point out that student engagement, while always necessary for learning, is not a sufficient metric for evaluating educational software. By that metric alone, Grand Theft Auto, Call of Duty, and Duck Dynasty would be splendid educational resources. So I think it’s clear that we need metrics deeper than just student engagement.
Also, graduation rates should not be the primary metric for evaluating the efficacy of education initiatives. As was pointed out in a recent New York Times article, “…the number of students earning high school diplomas has risen to historic peaks, yet measures of academic readiness for college or jobs are much lower.” Apparently it is easy for school districts to manipulate graduation rates so the districts appear to be making progress even when true progress is lacking. http://www.nytimes.com/2015/12/27/us/as-graduation-rates-rise-experts-fear-standards-have-fallen.html
In closing, I’m sorry to say that in observing how STAT has been managed these past 18 months, my BS detector has been triggered more times than I care to count. I’m worried the STAT initiative is very similar to one of those “high-risk/high-gain education R&D projects” called for by the Department of Education’s ARPA-ED proposal. And I’m concerned that instead of being evolutionary, the STAT program will continue to be thrust upon our community by administrators and corporate interests operating out of a “radical” and “revolutionary” mindset. I’m concerned that our entire county has been toe-tagged as an easily accessible test bed for “high-risk/high-gain” experimental software aimed at creating “digital tutors,” adaptive teaching systems, and game-ified educational products “as compelling as video games.” I’m afraid that our school administrators, instead of keeping our children’s best interests at heart, have been carried away by the circus barking of snake-oil salesmen from Silicon Valley and their endless promises of digital panaceas delivered via laptop.
By itself, the mere fact that our student performance metrics are in a state of flux should be cause for pausing the expansion of the STAT program. We should pause expansion of STAT until we can properly evaluate its strengths and weaknesses with metrics that really matter. Otherwise, I’m afraid that we will not only be building the airplane while we fly it, but we will also be flying it completely blind.
Furthermore, I urge BCPS to adopt some kind of policy that prevents software companies from testing out their software on our children. Any software that is used to deliver significant amounts of instruction should have a substantial track record with clear proof of its efficacy. The STAT program should not be a pipeline between our children’s minds and the product developers who are trying to develop their software on the cheap.
We don’t want our children to suffer the same fate as the poor kids in Detroit’s EAA, who, as the Michigan ACLU investigators put it, were used as “whetstones to hone a badly flawed product being pitched as cutting-edge technology.”
A Professor’s Thoughts:
Baltimore County Public Schools is “transforming the state’s third-largest school system into a fully digital learning environment through a variety of initiatives collectively known as Students and Teachers Accessing Tomorrow, or S.T.A.T. The initiative this [first] year included extensive teacher training, a “Lighthouse Schools” pilot that provided 1:1 digital devices for students in Grades 1-3, and the BCPS One information portal for students, parents, and educators.” (http://www.bcps.org/news/articles/article7967.html, June 2015)
Many parents and teachers who believe there is a role for technology in education have significant concerns about this initiative.
Regarding the evidence behind S.T.A.T. and its ongoing evaluation:
There is limited data available that is relevant to this type of initiative, and much of what is available comes from for-profit companies or from non-profits funded by corporate interests. The leading voices in American education do NOT support this type of initiative.
The Organization for Economic Cooperation and Development’s report Students, Computers and Learning finds that up to about half an hour per day on a device at school is beneficial; more than that is more likely to be harmful than helpful.1 Children don’t need their own 1:1 devices for that amount of time; they can share. Further, “technology is of little help in bridging the skills divide between advantaged and disadvantaged students,” so Baltimore County Public Schools may in fact be harming children rather than addressing equity, by diverting attention and resources away from more effective strategies.1
With the freed-up resources, you can have smaller class sizes, more support for hungry or homeless children, and help for many other pressing needs.2 And you can in fact load individualized content for multiple children who share a device for that half hour or so per day.
Baltimore County Public Schools now cites a study by Pane et al. as justification for the claim that personalized learning is beneficial. The Pane report was published in November 2015, and 90% of the schools studied were charter schools.
Several of the most trusted voices in American education believe there is no trustworthy data to support an initiative like ours.
In fact, the National Education Policy Center, based at the University of Colorado, wrote a detailed critique of the Pane study, showing that it is impossible to use this study as a valid justification for Baltimore County Public Schools’ S.T.A.T. initiative.
“Broad conclusions about the efficacy of technology-based personalized learning, however, are not warranted by the research. Limitations include a sample of treatment schools that is unrepresentative of the general population of schools, the lack of a threshold in the study for what qualified as implementing “personalized learning” in the treatment schools, and the reality that disruptive strategies such as competency-based progression, which require the largest departures from current practice, were rarely implemented in the studied schools.”
Diane Ravitch, a nationally and internationally respected educator at New York University, recently wrote:
“The Baltimore County Public Schools are embarking on a risky gamble that will put all students online. At present, there is no research base to prove the value of this expensive venture. What we can predict is two nefarious consequences: 1) the computers will be used for “embedded assessment,” so that students are tested daily or continually without knowing it. Second, the students will be data mined continually, and their personally identifiable information will be available to third parties or subject to hacking.”
What are some of the health concerns?
Young minds and hearts need far more non-screen time than most children get in the 21st-century world, whether the screen content is educational or otherwise. Clearly there is a role for technology in education, but the interpretation of available research needs to be nuanced. Effective technology that is recommended by physicians, occupational therapists, special education teachers, etc. should be accessible for all who would benefit. A modest use of technology in schools and for schoolwork, increasing as children grow older, is not likely to be harmful. However, there is much we do not know about the effects of personalized learning and technology on learning and health. Some examples from science:
- When you take away technology from middle schoolers for a week, their ability to read social and emotional cues improves.3
- When you give young children electronic toys that make noise and flash lights, the grown-ups in the room actually talk to the children less than when the children are given less “engaging” toys. Talking with real people less often harms verbal and social development.4
- Video games, even educational ones, change the way our brains develop and work.5
- This generation is not better at multitasking than their parents were – in fact, science shows that everyone must really focus on just one thing in order to do it well and learn complex concepts. Undergraduates, for example, learn complex concepts better if they take notes by hand with pen and paper than if they type them on a tablet.6
- “The results also show no appreciable improvements in student achievement in reading, mathematics or science in the countries that had invested heavily in ICT (information and communication technology) for education. And perhaps the most disappointing finding of the report is that technology is of little help in bridging the skills divide between advantaged and disadvantaged students. Put simply, ensuring that every child attains a baseline level of proficiency in reading and mathematics seems to do more to create equal opportunities in a digital world than can be achieved by expanding or subsidising access to high‑tech devices and services.”
- “Class size is an important determinant of student outcomes, and one that can be directly determined by policy. All else being equal, increasing class sizes will harm student outcomes.
The evidence suggests that increasing class size will harm not only children’s test scores in the short run, but also their long-run human capital formation. Money saved today by increasing class sizes will result in more substantial social and educational costs in the future.
The payoff from class-size reduction is greater for low-income and minority children, while any increases in class size will likely be most harmful to these populations.
Policymakers should carefully weigh the efficacy of class-size policy against other potential uses of funds. While lower class size has a demonstrable cost, it may prove the more cost-effective policy overall.”
- “UCLA scientists found that sixth-graders who went five days without even glancing at a smartphone, television or other digital screen did substantially better at reading human emotions than sixth-graders from the same school who continued to spend hours each day looking at their electronic devices.
‘Many people are looking at the benefits of digital media in education, and not many are looking at the costs,’ said Patricia Greenfield, a distinguished professor of psychology in the UCLA College and senior author of the study. ‘Decreased sensitivity to emotional cues — losing the ability to understand the emotions of other people — is one of the costs. The displacement of in-person social interaction by screen interaction seems to be reducing social skills.’”
Uhls et al., Computers in Human Behavior, Volume 39, October 2014, pp. 387–392; available at http://www.sciencedirect.com/science/article/pii/S0747563214003227
- “There’s simply no evidence that a young child can learn language directly from a toy. It isn’t responsive enough. It isn’t social.” http://www.npr.org/sections/ed/2016/01/11/462264537/the-trouble-with-talking-toys
Sosa A., JAMA Pediatrics, published online December 23, 2015; available at http://archpedi.jamanetwork.com/article.aspx?articleid=2478386
- “The group examined the functional magnetic resonance imaging (fMRI) brain scans of 154 14-year-old boys and girls. When they compared the brains of frequent gamers (defined as those who played video games more than 9 hours per week) to moderate gamers, they discovered that the first group showed larger volume in the left striatum, a brain area involved in risk and reward processing… ‘This could explain a potential mechanism that makes people play more,’ says Kuhn. ‘Even when facing losses, the reward center of the brain is activated – suggesting a potential mechanism for non-substance addictions.’”
Kuhn et al., Translational Psychiatry (2011) 1, e53; available at http://www.nature.com/tp/journal/v1/n11/full/tp201153a.html
- “As technology allows people to do more tasks at the same time, the myth that we can multitask has never been stronger. But researchers say it’s still a myth — and they have the data to prove it.”
“As tested on a group of undergrads, the research proved that laptop users type almost everything they hear without processing the meaning or devoting much thought to what it is they’re taking notes on. Basically, when you type, all you’re doing is mindlessly transcribing, and that does not require much cognitive activity. When you take notes by hand, however, you obviously can’t write down every single word your professor utters. So you listen, summarize, and list only the key points. Your brain is more engaged in the process of comprehension and so the information processed this way is remembered better.”
“Taking notes on laptops rather than in longhand is increasingly common. Many researchers have suggested that laptop note taking is less effective than longhand note taking for learning. Prior studies have primarily focused on students’ capacity for multitasking and distraction when using laptops. The present research suggests that even when laptops are used solely to take notes, they may still be impairing learning because their use results in shallower processing. In three studies, we found that students who took notes on laptops performed worse on conceptual questions than students who took notes longhand. We show that whereas taking more notes can be beneficial, laptop note takers’ tendency to transcribe lectures verbatim rather than processing information and reframing it in their own words is detrimental to learning.”
Mueller and Oppenheimer, Psychological Science, June 2014, vol. 25, no. 6, pp. 1159–1168; available at http://pss.sagepub.com/content/25/6/1159