The Democracy of Dating: How Political Affiliations Shape Relationship Formation. Matthew J. Easton and John B. Holbein. Journal of Experimental Political Science, Jul 29 2020. https://doi.org/10.1017/XPS.2020.21
Abstract: How much does politics affect relationship building? Previous experimental studies have come to vastly different conclusions – ranging from null to truly transformative effects. To explore these differences, this study replicates and extends previous research by conducting five survey experiments meant to expand our understanding of how politics does/does not shape the formation of romantic relationships. We find that people, indeed, are influenced by the politics of prospective partners; respondents evaluate those in the political out-group as being less attractive, less dateable, and less worthy of matchmaking efforts. However, these effects are modest in size – falling almost exactly in between previous study estimates. Our results shine light on a literature that has, up until this point, produced a chasm in study results – a vital task given concerns over growing levels of partisan animus in the USA and the rapidly expanding body of research on affective polarization.
Wednesday, July 29, 2020
Dementia Incidence Among US Adults Born 1893-1949: Incidence is lower for those born after the mid-1920s, & this lower incidence is not associated with early-life environment as measured in this study
Association of Demographic and Early-Life Socioeconomic Factors by Birth Cohort With Dementia Incidence Among US Adults Born Between 1893 and 1949. Sarah E. Tom et al. JAMA Netw Open. 2020;3(7):e2011094, July 27 2020, doi:10.1001/jamanetworkopen.2020.11094
Key Points
Question Are dementia incidence trends by birth cohort associated with early-life environment?
Findings In this cohort study of 4277 participants in the Adult Changes in Thought study who were born between 1893 and 1949 and were followed up for up to 20 years (1994-2015), the age- and sex-adjusted dementia incidence was lower among those born during the Great Depression (1929-1939) and the period during World War II and postwar (1940-1949) compared with those born in the period before the Great Depression (1921-1928). The association between birth cohort and dementia incidence remained when accounting for early-life socioeconomic environment, educational level, and late-life vascular risk factors.
Meaning The study’s findings indicate that dementia incidence is lower for individuals born after the mid-1920s compared with those born earlier, and this lower incidence is not associated with early-life environment as measured in this study.
Abstract
Importance Early-life factors may be important for later dementia risk. The association between a more advantaged early-life environment, as reflected through an individual’s height and socioeconomic status indicators, and decreases in dementia incidence by birth cohort is unknown.
Objectives To examine the association of birth cohort and early-life environment with dementia incidence among participants in the Adult Changes in Thought study from 1994 to 2015.
Design, Setting, and Participants This prospective cohort study included 4277 participants from the Adult Changes in Thought study, an ongoing longitudinal population-based study of incident dementia in a random sample of adults 65 years and older who were born between 1893 and 1949 and are members of Kaiser Permanente Washington in the Seattle region. Participants in the present analysis were followed up from 1994 to 2015. At enrollment, all participants were dementia-free and completed a baseline evaluation. Subsequent study visits were held every 2 years until a diagnosis of dementia, death, or withdrawal from the study. Participants were categorized by birth period (defined by historically meaningful events) into 5 cohorts: pre–World War I (1893-1913), World War I and Spanish influenza (1914-1920), pre–Great Depression (1921-1928), Great Depression (1929-1939), and World War II and postwar (1940-1949). Participants’ height, educational level, childhood financial stability, and childhood household density were examined as indicators of early-life environment, and later-life vascular risk factors for dementia were assessed. Cox proportional hazards regression models, adjusted for competing survival risk, were used to analyze data. Data were analyzed from June 1, 2018, to April 29, 2020.
Main Outcomes and Measures Participants completed the Cognitive Abilities Screening Instrument every 2 years to assess global cognition. Those with scores indicative of cognitive impairment completed an evaluation for dementia, with dementia diagnoses determined during consensus conferences using criteria from the Diagnostic and Statistical Manual of Mental Disorders, 4th edition.
Results Among 4277 participants, the mean (SD) age was 74.5 (6.4) years, and 2519 participants (58.9%) were women. The median follow-up was 8 years (interquartile range, 4-12 years), with 730 participants developing dementia over 24 378 person-years. The age-specific dementia incidence was lower for those born in 1929 and later compared with those born earlier. Compared with participants born in the pre–Great Depression years (1921-1928), the age- and sex-adjusted hazard ratio was 0.67 (95% CI, 0.53-0.85) for those born in the Great Depression period (1929-1939) and 0.62 (95% CI, 0.29-1.31) for those born in the World War II and postwar period (1940-1949). Although indicators of a more advantaged early-life environment and higher educational level (college or higher) were associated with a lower incidence of dementia, these variables did not explain the association between birth cohort and dementia incidence, which remained when vascular risk factors were included and were similar by sex.
Conclusions and Relevance Age-specific dementia incidence was lower in participants born after the mid-1920s compared with those born earlier. In this population, the decrease in dementia incidence may reflect societal-level changes or individual differences over the life course rather than early-life environment, as reflected through recalled childhood socioeconomic status and measured height, educational level, and later-life vascular risk.
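A quick way to read the confidence intervals quoted in the results above: a hazard ratio whose 95% CI includes 1 is not statistically distinguishable from no difference. A minimal sketch using only the numbers from the abstract (the helper function is ours, not the paper's):

```python
# Reader's check on the hazard ratios quoted above: a 95% CI that
# includes 1 means the cohort difference is not statistically
# significant at the conventional level. Numbers are from the abstract.

def ci_excludes_one(lo, hi):
    """True if the 95% confidence interval (lo, hi) excludes an HR of 1."""
    return not (lo <= 1.0 <= hi)

cohorts = {
    "Great Depression (1929-1939)": (0.67, 0.53, 0.85),
    "World War II and postwar (1940-1949)": (0.62, 0.29, 1.31),
}

for name, (hr, lo, hi) in cohorts.items():
    verdict = "significant" if ci_excludes_one(lo, hi) else "not significant"
    print(f"{name}: HR {hr} (95% CI {lo}-{hi}) -> {verdict}")
```

This makes visible why the abstract hedges on the 1940-1949 cohort: its point estimate (0.62) is similar to the Great Depression cohort's, but its wider interval crosses 1, consistent with the shorter follow-up noted in the limitations.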
Discussion
Among those born at the turn of the 20th century through the mid-20th century who participated in the ACT study, the age-specific dementia incidence was lower for participants born in 1929 and later compared with those born earlier. This trend was not explained by recalled childhood socioeconomic status and measured height, which reflect early-life environment, nor was it explained by educational level and vascular risk as an older adult. The literature on secular dementia trends reports a decrease in dementia incidence starting in the 1990s.1-5 This timing is consistent with participants in the 1929 to 1939 birth cohorts who are entering the eighth decade of life, when dementia risk increases.2,4,31 Political and economic changes during the first half of the 20th century may have had different implications for dementia risk based on the participant’s age during those experiences.32 Analysis by birth cohort captures this intersection of age and calendar time. Our results suggest that societal-level changes in the first half of the 20th century that were not captured by the individual early-life measures or the educational levels used in this study may have been associated with decreases in dementia incidence.
The 40% decrease in the US mortality rate from 1900 to 1940 was likely owing to the decrease in infectious diseases,33 which disproportionately occur in the young. The decrease in dementia incidence observed in the ACT study began with birth cohorts who were born in the middle of this period. These early-life health gains may be factors in the decreased dementia incidence. Although we accounted for family-level socioeconomic status variables and height, these variables may not have captured all changes, such as economic innovation13 and nutritional improvement,12 that may have been associated with decreases in mortality. In addition, variables included in this study may not have captured public health improvements during this period.33 It is possible that unmeasured differences were more important for assessing dementia risk by birth cohort than the socioeconomic factors we measured.
Across birth cohorts, participants with lower financial status and greater household density in childhood had a lower risk of developing dementia, which is inconsistent with our hypothesis and the results of previous studies.34,35 While the Great Depression was a time of financial hardship, those in the pre–Great Depression and the World War I and Spanish influenza birth cohorts were the least likely to report the ability to afford both basic needs and small luxuries, and they had the smallest proportion of participants reporting the most stable childhood financial quartile. This pattern may reflect problems with measurement or sample selection. Participant responses may reflect experiences in later childhood and early adolescence, as recall of early-life experiences may be difficult. In contrast, parental educational levels, which were constant throughout childhood and adolescence for most of the birth cohorts, were higher for the World War I and Spanish influenza cohort and the pre–Great Depression cohort compared with cohorts born earlier. This pattern suggests a higher early-life standard of living in the more recent birth cohorts. Another possibility is that because these 2 birth cohorts were the oldest, those who survived to participate in the study were able to compensate for adverse early-life environments or had less accurate recall than younger participants.
Our study considered death as a competing risk, while previous case-control studies did not.34,35 Most ACT participants were members of Kaiser Permanente Washington (formerly Group Health) when they were younger than 65 years, during which time they primarily received health insurance through large employers. It is likely that those with lower financial status and higher household density during childhood survived adverse experiences to be able to participate in the sample.
Together with height, an individual’s parental educational level, childhood financial stability, and childhood household density are likely to reflect their early-life environment. These variables did not explain the decrease in dementia incidence among the more recent birth cohorts. In a minimally adjusted model, the decrease in dementia incidence began with the Great Depression birth cohort, suggesting that societal-level experiences during later childhood to adolescence may have been more important than those during the in-utero through early childhood phase. If this earliest stage of life were important for dementia incidence, we would expect those born in the Great Depression cohort to have the greatest dementia risk. The largest difference in college completion was found between the pre–Great Depression and Great Depression birth cohorts. This disruption to economic opportunity for those born in the pre–Great Depression years may have had implications for dementia risk. The inclusion of late-life vascular risk factors did not appreciably alter the association between a more recent birth cohort and a lower incidence of dementia, which is consistent with analyses of the Einstein Aging Cohort10 and the Framingham Heart Study, which considered the cohort of study entry.1
Our finding of an association between birth cohort and decreased dementia incidence is consistent with 2 previous studies. An analysis of the English Longitudinal Study of Aging examined 2 birth cohorts split at the median birth year (1902-1925 and 1926-1943),9 and an analysis of the Einstein Aging Study used a data-driven approach to detect a change point in continuous birth years.10 Our birth cohort categories were based on historically meaningful events. Because the ACT study is larger than the Einstein Aging Study, we were able to separate participants born after 1928 into 2 groups. In the ACT study, the most recent birth cohort (1940-1949) had higher educational levels and childhood financial stability compared with cohorts born earlier. Such categorization also allowed for the separation of worldwide economic disruption from family-level financial stability.
Our analysis may not have captured differences in adult social experiences. Educational level is associated with subsequent occupation and employment patterns. However, birth cohort may reflect experience of events during the 20th century that had broad implications, regardless of educational level. For example, men born in the first 2 decades of the 20th century are likely to have served in the armed forces during World War II and to have benefitted from the GI bill. Men and women from those birth cohorts would also have benefitted from the postwar economic expansion. Our analysis did not capture such adult experiences.
Limitations
Our study has several limitations. Participants in older cohorts necessarily had to survive longer to be included in the study. Because the greatest risk factor for dementia is age, the survival required of the pre–World War I and the World War I and Spanish influenza birth cohorts to enter the ACT study may create differences in dementia risk that are difficult to detect in these groups. Our results suggest that the most recent birth cohorts may continue to experience lower age-specific dementia incidence. However, the follow-up period is shorter in these birth cohorts. The ACT study participants are from 1 health system in the Pacific Northwest, and their educational level is high. The cohort is a random sample of age-eligible members of Kaiser Permanente Washington; results therefore reflect this specific population but may not be generalizable to the US population. Our results are consistent with a sample from the Bronx, New York,10 and a nationally representative sample from the United Kingdom,9 suggesting that the decrease in dementia incidence by birth cohort may be a widespread phenomenon. Because ACT study participants may be socioeconomically advantaged, the measures of early-life environment included in this study may not be sensitive enough to detect meaningful differences that have implications for dementia incidence trends by birth cohort.
The study did not include key health variables from later in the life course that are associated with dementia risk, notably midlife hypertension, hearing loss, late-life depression, diabetes, physical inactivity, and social isolation.6 As the ACT study is currently collecting data on most of these variables, future studies will be able to more fully capture life-course dementia risk factors. Because the ACT study is long-standing, its follow-up included substantial age overlap across multiple birth cohorts, the absence of which had been a limitation in previous studies.9 Dementia diagnosis procedures have been consistent throughout the study. The large size of the ACT study and the theoretical basis of the cohort groups allowed for the inclusion of 2 cohort groups born after 1928 that aligned with historically meaningful events, whereas previous studies have considered only 1 group born after the mid-1920s.9,10
Dementia incidence has decreased in more recent birth cohorts. Our measures of early-life socioeconomic status and educational level do not account for these differences in this study population. Birth cohort may reflect other historical and social changes that occurred during childhood or adulthood.
Self-control is associated with numerous positive outcomes, such as well-being; we argue that hedonic goal pursuit is equally important, & conflicting long-term goals can undermine it in the form of intrusive thoughts
Beyond Self-Control: Mechanisms of Hedonic Goal Pursuit and Its Relevance for Well-Being. Katharina Bernecker, Daniela Becker. Personality and Social Psychology Bulletin, July 26, 2020. https://doi.org/10.1177/0146167220941998
Abstract: Self-control helps to align behavior with long-term goals (e.g., exercising to stay fit) and shield it from conflicting hedonic goals (e.g., relaxing). Decades of research have shown that self-control is associated with numerous positive outcomes, such as well-being. In the present article, we argue that hedonic goal pursuit is equally important for well-being, and that conflicting long-term goals can undermine it in the form of intrusive thoughts. In Study 1, we developed a measure of trait hedonic capacity, which captures people’s success in hedonic goal pursuit and the occurrence of intrusive thoughts. In Studies 2A and 2B, people’s trait hedonic capacity relates positively to well-being. Study 3 confirms intrusive thoughts as major impeding mechanism of hedonic success. Studies 4 and 5 demonstrate that trait hedonic capacity predicts successful hedonic goal pursuit in everyday life. We conclude that hedonic goal pursuit represents a largely neglected but adaptive aspect of self-regulation.
Keywords: hedonic goals, self-control, self-regulation, well-being
Popular version: Hedonism Leads to Happiness. Zurich Univ. Press Release, Jul 27 2020. https://www.media.uzh.ch/en/Press-Releases/2020/Hedonism.html
The inherent difficulty in accurately appreciating the engaging aspect of thinking activity could explain why people prefer keeping themselves busy, rather than taking a moment for reflection & imagination
Hatano, Aya, Cansu Ogulmus, Hiroaki Shigemasu, and Kou Murayama. 2020. “Thinking About Thinking: People Underestimate Intrinsically Motivating Experiences of Waiting.” PsyArXiv. July 29. doi:10.31234/osf.io/n2ctk
Abstract: The ability to engage in internal thoughts without external stimulation is one of the hallmark capacities of humans. The current research tested the hypothesis that people metacognitively underestimate their capability to positively engage in just thinking. Participants were asked to sit and wait in a quiet room without doing anything for a certain amount of time (e.g., 20 min). Before the waiting task, they predicted how intrinsically motivating the task would be at the end of the task; they also rated their experienced intrinsic motivation after the task. Across six experiments we consistently found that participants’ predicted intrinsic motivation for the waiting task was significantly lower than their experienced intrinsic motivation. This underestimation effect was robustly observed regardless of the independence of the predictive rating, the amount of sensory input, the duration of the waiting task, the timing of assessment, and the cultural contexts of participants. This underappreciation of just thinking also led participants to proactively avoid the waiting task when there was an alternative task (i.e., checking internet news), even though their experienced intrinsic motivation was not statistically different between the tasks. These results suggest an inherent difficulty in accurately appreciating how engaging just thinking can be, which could explain why people prefer keeping themselves busy rather than taking a moment for reflection and imagination in daily life.
Gender differences in the trade-off between objective equality and efficiency: The results show that females prefer objective equality over efficiency to a greater extent than males do
Gender differences in the trade-off between objective equality and efficiency. Valerio Capraro. Judgment and Decision Making, Vol. 15, No. 4, July 2020, pp. 534–544. http://journal.sjdm.org/19/190510/jdm190510.pdf
Abstract: Generations of social scientists have explored whether males and females act differently in domains involving competition, risk taking, cooperation, altruism, honesty, as well as many others. Yet, little is known about gender differences in the trade-off between objective equality (i.e., equality of outcomes) and efficiency. It has been suggested that females are more egalitarian than males, but the empirical evidence is relatively weak. This gap is particularly important, because people with the power to redistribute resources often face a conflict between equality and efficiency. The recently introduced Trade-Off Game (TOG) – in which a decision-maker has to unilaterally choose between being equal or being efficient – offers a unique opportunity to fill this gap. To this end, I analyse gender differences on a large dataset including N=6,955 TOG decisions. The results show that females prefer objective equality over efficiency to a greater extent than males do. The effect turns out to be particularly strong when the TOG options are “morally” framed in such a way as to suggest that choosing the equal option is the right thing to do.
Keywords: trade-off game, gender, equality, efficiency
Some charities are much more cost-effective than others, which means that they can do more with the same amount of money; yet most donations do not go to the most effective charities. Why is that?
Donors vastly underestimate differences in charities’ effectiveness. Lucius Caviola et al. Judgment and Decision Making, Vol. 15, No. 4, July 2020, pp. 509–516. http://journal.sjdm.org/20/200504/jdm200504.pdf
Abstract: Some charities are much more cost-effective than other charities, which means that they can save many more lives with the same amount of money. Yet most donations do not go to the most effective charities. Why is that? We hypothesized that part of the reason is that people underestimate how much more effective the most effective charities are compared with the average charity. Thus, they do not know how much more good they could do if they donated to the most effective charities. We studied this hypothesis using samples of the general population, students, experts, and effective altruists in six studies. We found that lay people estimated that among charities helping the global poor, the most effective charities are 1.5 times more effective than the average charity (Studies 1 and 2). Effective altruists, in contrast, estimated the difference to be a factor of 30 (Study 3), and experts estimated the factor to be 100 (Study 4). We found that participants donated more to the most effective charity, and less to an average charity, when informed about the large difference in cost-effectiveness (Study 5). In conclusion, misconceptions about the difference in effectiveness between charities are likely one reason, among many, why people donate ineffectively.
Keywords: cost-effectiveness, charitable giving, effective altruism, prosocial behavior, helping
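The quoted multipliers (1.5x lay estimate, 30x effective-altruist estimate, 100x expert estimate) translate directly into how much good the same donation does. A back-of-the-envelope sketch; the donation size and baseline impact-per-dollar are hypothetical, and only the multipliers come from the abstract:

```python
# Illustrative arithmetic for the effectiveness gap described above.
# The baseline figures are hypothetical; only the multipliers
# (1.5x lay, 30x effective altruist, 100x expert) are from the abstract.

DONATION = 1000.0               # dollars given (hypothetical)
AVG_IMPACT_PER_DOLLAR = 0.001   # "units of good" per dollar at an average charity (hypothetical)

def impact(donation, multiplier):
    """Impact if the chosen charity is `multiplier` times as effective as average."""
    return donation * AVG_IMPACT_PER_DOLLAR * multiplier

for label, factor in [("lay estimate", 1.5),
                      ("effective altruists", 30),
                      ("experts", 100)]:
    print(f"{label:>20} ({factor}x): {impact(DONATION, factor):g} units of good")
```

Under the expert estimate, the same donation does roughly 67 times as much good as under the lay estimate (100/1.5), which is the gap Study 5 closes by informing donors.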
Action and inaction are perceived and evaluated differently; these asymmetries have been shown to have real impact on choice behavior in both personal & interpersonal contexts
Omission and commission in judgment and decision making: Understanding and linking action‐inaction effects using the concept of normality. Gilad Feldman Lucas Kutscher Tijen Yay. Social and Personality Psychology Compass, July 27 2020. https://doi.org/10.1111/spc3.12557
Abstract: Research on action and inaction in judgment and decision making now spans over 35 years, with ever‐growing interest. Accumulating evidence suggests that action and inaction are perceived and evaluated differently, affecting a wide array of psychological factors from emotions to morality. These asymmetries have been shown to have real impact on choice behavior in both personal and interpersonal contexts, with implications for individuals and society. We review impactful action‐inaction related phenomena, with a summary and comparison of key findings and insights, reinterpreting these effects and mapping links between effects using norm theory's (Kahneman & Miller, 1986) concept of normality. Together, these aim to contribute towards an integrated understanding of the human psyche regarding action and inaction.