Algorithm appreciation: People prefer algorithmic to human judgment. Jennifer M. Logg, Julia A. Minson, Don A. Moore. Organizational Behavior and Human Decision Processes, Volume 151, March 2019, Pages 90-103, https://doi.org/10.1016/j.obhdp.2018.12.005
Highlights
• We challenge the prevailing idea that people prefer human to algorithmic judgment.
• In head-to-head comparisons, people use algorithmic advice more than human advice.
• We compare use of advice with the continuous weight of advice (WOA) measure (see the sketch after this list).
• People appreciate algorithmic advice despite blindness to the algorithm’s process.
• Algorithm appreciation holds even as people underweight advice more generally.
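For readers unfamiliar with the WOA measure, here is a minimal sketch of the standard weight-of-advice computation. The function name, the handling of the undefined case, and the truncation to [0, 1] are illustrative conventions assumed here, not necessarily the paper's exact coding rules.

```python
def weight_of_advice(initial, advice, final):
    """Weight of advice (WOA): how far a judge moves from their initial
    estimate toward the advisor's estimate.
    0 = advice ignored, 1 = advice fully adopted.
    """
    if advice == initial:
        return None  # WOA is undefined when the advice equals the initial estimate
    woa = (final - initial) / (advice - initial)
    return max(0.0, min(1.0, woa))  # truncation to [0, 1] is a common convention

# Example: initial estimate 100, advice 140, final estimate 130
# WOA = (130 - 100) / (140 - 100) = 0.75, i.e., 75% weight on the advice
print(weight_of_advice(100, 140, 130))  # 0.75
```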
Abstract: Even though computational algorithms often outperform human judgment, received wisdom suggests that people may be skeptical of relying on them (Dawes, 1979). Counter to this notion, results from six experiments show that lay people adhere more to advice when they think it comes from an algorithm than from a person. People showed this effect, which we call algorithm appreciation, when making numeric estimates about a visual stimulus (Experiment 1A) and forecasts about the popularity of songs and romantic attraction (Experiments 1B and 1C). Yet researchers predicted the opposite result (Experiment 1D). Algorithm appreciation persisted when advice appeared jointly or separately (Experiment 2). However, algorithm appreciation waned when people chose between an algorithm’s estimate and their own (versus an external advisor’s; Experiment 3) and when they had expertise in forecasting (Experiment 4). Paradoxically, experienced professionals, who make forecasts on a regular basis, relied less on algorithmic advice than lay people did, which hurt their accuracy. These results shed light on the important question of when people rely on algorithmic advice over advice from people, and they have implications for the use of “big data” and the algorithmic advice it generates.