Pearson Tested 'Social-Psychological' Messages in Learning Software, With Mixed Results. Benjamin Herold on April 17, 2018, blogs.edweek.org/edweek/DigitalEducation/2018/04/pearson_growth_mindset_software.html
The idea of inserting "social-psychological interventions" into learning software is gaining steam, raising both hopes and fears about the ways the ed-tech industry might seek to capitalize on recent research into the impact of students' mindsets on their learning.
One big new example, presented here today as part of the annual conference of the American Educational Research Association (AERA):
Publishing giant Pearson recently conducted an experiment involving more than 9,000 unwitting students at 165 different U.S. colleges and universities. Without seeking prior consent from participating institutions or individuals, the company embedded "growth-mindset" and other psychological messaging into some versions of one of its commercial learning software programs. The company then randomly assigned different colleges to use different versions of that software, tracking whether students who received the messages attempted and completed more problems than their counterparts at other institutions.
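To make that design concrete, here is a minimal sketch, in Python, of how institution-level (cluster) random assignment of this kind is typically implemented. The condition names, function names, and placeholder college identifiers are illustrative assumptions, not Pearson's actual code.

```python
import random

# Hypothetical sketch of institution-level (cluster) random assignment.
# Condition names and identifiers are illustrative, not Pearson's code.
CONDITIONS = ["control", "growth_mindset", "anchoring_of_effort"]

def assign_institutions(institution_ids, seed=42):
    """Shuffle institutions, then deal them round-robin into conditions,
    so every student at a given college sees the same software variant."""
    rng = random.Random(seed)
    shuffled = list(institution_ids)
    rng.shuffle(shuffled)
    return {inst: CONDITIONS[i % len(CONDITIONS)] for i, inst in enumerate(shuffled)}

if __name__ == "__main__":
    colleges = [f"college_{i:03d}" for i in range(1, 166)]  # 165 placeholder IDs
    arms = assign_institutions(colleges)
    print(sum(1 for arm in arms.values() if arm == "growth_mindset"))  # 55 of 165
```

Because whole colleges, rather than individual students, are the unit of assignment, outcomes are compared across institutions, which is one reason classroom-level differences in how instructors use the software can muddy the results.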
The results included modest signs that such messaging can increase students' persistence when they start a problem and then run into difficulty. That's likely to bolster growth-mindset proponents, who say it's important to encourage students to view intelligence as something that can change with practice and hard work.
But the bigger takeaway, according to Pearson's AERA paper, is the possibility of leveraging commercial educational software for new research into the emerging science around students' attitudes, beliefs, and ways of thinking about themselves.
"Randomized control trials like this, at scale and embedded into widely used commercial products, are a valuable approach for improving learner outcomes in a rigorous and iterative way, while also contributing to the burgeoning literature on social-psychological interventions," the paper contends.
Concerns Over 'Low-Level Psychological Experimentation'
Outside experts consulted by Education Week offered skeptical reactions to the new Pearson study.
"It does not surprise me at all that corporations are attempting to monetize a promising way of thinking about a hairy problem," said Phi Delta Kappan CEO Joshua Starr, who was a major proponent of social-emotional learning during his time as superintendent of the Montgomery County, Md. school district (and who currently serves on the Aspen Institute's National Commission on Social, Emotional, and Academic Development.)
"There is some value" to Pearson's approach, Starr said, but "social-emotional learning is best promoted through strong communities and relationships."
And Ben Williamson, a lecturer at the University of Stirling in the United Kingdom who studies big data in education, raised other concerns.
There's little evidence that focusing on growth mindset in the classroom will significantly benefit students, Williamson argued, citing recent analyses finding limited effects of mindset-based interventions.
In addition, Williamson maintained, companies such as Pearson would be wise to pay close attention to the growing public anxiety over how companies collect people's sensitive information and use it for psychological profiling and targeting. It's especially troubling, he said, that the company did not seek informed consent from the young people who became subjects in its study.
"It's concerning that forms of low-level psychological experimentation to trigger certain behaviors appears to be happening in the ed-tech sector, and students might not know those experiments are taking place," Williamson said.
[...]
Using commercial software allowed Pearson to see how the changes played out for real students in actual classrooms, DiCerbo said, generating more useful information than if the experiment had taken place in a lab.
And while the company is considering similar experiments involving other commercial software products used in higher education, she said, Pearson is preparing to sell off its K-12 business, meaning there are likely no short-term implications for those clients.
"We think these motivational aspects are really important for students' learning outcomes," DiCerbo said. "But the only way we're going to know for sure is to do the research."
Mixed Results
The paper presented by Pearson at AERA was titled "Embedding Research-Inspired Innovations in EdTech: An RCT of Social-Psychological Interventions, at Scale."
[...]
[The product] is typically used for introductory computer-science courses [...].
DiCerbo said that made sense as the first content area to test social-psychological messaging, because many students have a propensity to attribute failure in programming to a personal shortcoming, rather than seeing it as a challenge and opportunity to learn.
The idea was to see if students' motivation and achievement would be improved in either of two ways:
. Inserting "growth-mindset" messages (stressing the importance of effort and building skills over time) into the software's instructions and into the feedback it offered to students who provided wrong answers. An example: "No one is born a great programmer. Success takes hours and hours of practice."
. Using "anchoring of effort" messages (seeking to leverage a common cognitive bias in which people tend to rely on the first piece of information they learn, even if it's irrelevant to the problem they're trying to solve.) Pearson's theory here was that students might not have any sense of how much effort is often required to solve computer-programming problems, so providing them with a high-end estimate based on analysis of previous users' experience could ground them in the expectation that multiple attempts would be necessary. An example: "Some students tried this question 26 times! Don't worry if it takes you a few tries to get it right."
The researchers were surprised to find that students who didn't receive any special messaging from the software attempted to solve significantly more problems (212) than those who received growth-mindset messages (174 problems) or anchoring messages (156 problems).
That finding suggested that the social-psychological interventions they were testing backfired, although DiCerbo said other factors—especially differences in how various instructors use the software in their classes—may have also played a role.
But the Pearson team also found that students who received the growth-mindset messages successfully completed more of the problems they started than their counterparts. These students were also significantly more likely to eventually solve problems they initially got wrong, supporting the idea that encouraging a growth mindset can help students when they run into difficulty.
[...]
"Successfully applying theories like growth mindset is likely to require more precise targeting of specific learners and at specific moments in order to be effective," according to the company's study presented at AERA.
And DiCerbo said efforts to change students' mindsets through learning software are still in their earliest stages.
"It's still an open question as to whether technology is even capably of providing this type of feedback," she said.