Parents, Be Wary of “Digital Learning”

May 7, 2018

The psychological experiment that publisher Pearson conducted on unsuspecting college students last year has stirred up a controversy to which parents should pay attention. As we wrote, Pearson embedded “growth mindset” exhortations in software for computer science students to spur them on to better performance. Some students’ software included these messages, some didn’t, but no students were informed or given the chance to opt out of the experiment. The goal was to see whether the psychological manipulation affected students’ performance.

Education Week writer Benjamin Herold posed several questions about the nature of this experiment and the failure to obtain consent from the students. Education researcher Justin Reich responded with a defense of Pearson’s conduct, claiming that the experiment was perfectly legitimate and that, in fact, we need more of this type of research.

Asked if the experiment constituted clinical research or merely product improvement, Reich argues that the embedded “growth mindset” messages are indistinguishable from what teachers, publishers, and instructional designers do routinely – “they introduce this variation, examine whether the variation leads to better outcomes, and make adjustments accordingly.” So, supposedly, embedding motivational messages in digital content is no different from a teacher’s tweak to a lesson plan.

Similarly, Reich resists labeling the Pearson experiment as “psychological” research, because “every change in a learning environment is a psychology experiment.” He continues: “Every learning experience makes social, emotional, and psychological changes in students – those dimensions are inseparable, and that’s what learning is.”

This “nothing to see here” mindset obscures the central problem of Pearson’s experiment: a motivational message delivered through individualized technology – with potentially brain-altering capability, and with psycho-social data collected on the message’s effect – is fundamentally different from encouragement that emanates from a teacher or a textbook.

The U.S. Department of Education (USED) has long touted the “exciting” potential of probing human beings through technology. A USED report notes the mountains of data that students generate when they interact with digital technology (such as the Pearson software) – data that can be used for predictive analytics, forecasting how this student will perform or behave in the future. The report also recommends sharing that data with various other parties – even other government agencies – to help government understand how the student’s mind works.

The Pearson software presumably recorded data about whether and how each student responded to the motivational messages. That information could be used to improve the software, but maybe also to evaluate the student’s mindsets – for assessment and inclusion in sharable algorithms to predict his future behavior.

There’s nothing to stop this from happening. (Indeed, data-sharing for “predictive testing” has long been allowed by FERPA). This type of individualized data, with all its troubling possible uses, cannot be generated by a student’s interaction with teachers or textbooks.

The USED report and another draft report issued at about the same time identified an even more unsettling possibility for educational technology: its potential for changing human mindsets, attitudes, and behavior. (As we mentioned in our previous essay, the Organisation for Economic Co-operation and Development is pushing the same thing with respect to global education.) Both USED reports praise the behavior-modification potential of video games and of digital programs such as ClassDojo, which teachers can use much like a cattle prod to keep children in line.

Gaming in the classroom is a particularly disturbing example of how technology can change individuals by changing their brains. “Futurist” researchers such as Dr. Jane McGonigal and Dr. James Gee want to use gaming to push students to mimic, in real life, the virtual behavior they exhibit in the games. Parents might resist the idea that corporations (such as Pearson) or the government should establish the paradigms of desired behavior and mindsets and then use technology to surreptitiously push children in that direction.

Which raises the other major issue with the Pearson experiment – that it was conducted without informing the students or obtaining their consent for participation. Again, Reich dismisses the concern by arguing that the manipulative software messages were no different from “every new teacher, new course, new textbook revision, new software update. Students are constantly compelled to be subjected to intentional variation in instruction.”

For this reason, Reich argues, Pearson-like experiments fall under the regulation that allows nonconsensual research “on the effectiveness of or the comparison among instructional techniques, curricula, or classroom management methods.” He doesn’t explain how embedded digital messages fit into any of these three categories. Indeed, Reich signals the weakness of his argument when he says that “at many universities” he would be able to “convince the [Institutional Review Board] that the research appropriately fell under Exemption #1 and I would not be obligated to get informed consent.” Not exactly a proclamation of certainty in his position.

Reich also acknowledges that these rules about human-subject research should be updated for the digital world – but he strongly advocates allowing experimentation with as few restrictions as possible. He appears deeply invested in the value of compelling children to participate in research for the supposed good of society.

Oddly, Reich uses NAEP testing as an example of “compelling . . . students to participate in research with the potential to benefit all students.” Putting aside the question of whether NAEP benefits either the students who take it or students in general, the key point is that students may decline to participate in NAEP testing (which they should definitely do, since NAEP itself is now profiling mindsets and attitudes, i.e., personality testing, seemingly in violation of the federal law creating it).

Reich’s attitude is all too common among education researchers – that their vision of the good of the collective justifies dragooning students into experiments to which they or their parents might well object.

Why this aversion to informing students and obtaining consent? Certainly it’s easier not to bother with consent procedures, and the research can be broader-based if the subjects aren’t allowed to say no (history suggests how the denial of such rights to human research subjects can turn out).

But there may be something else. Gaming guru Gee emphasizes the importance of accomplishing digital mind manipulation on the sly: “We cannot change our society in one fell swoop. Sneak in, move quietly, attack unseen, put away the suit – be a snake.”

Parents, don’t let this happen to your children. Demand answers about what “digital learning” is really doing in the classroom, refuse your child’s participation in NAEP if invited, and make it clear that your child is not a guinea pig.

Source: Parents, Be Wary of “Digital Learning” | Truth in American Education
