From 2013... Effect of mating activity and dominance rank on male masturbation among free-ranging male rhesus macaques. Constance Dubuc, Sean P. Coyne, and Dario Maestripieri
Ethology. 2013 Nov 1; 119(11). doi: 10.1111/eth.12146.
Abstract: The adaptive function of male masturbation is still poorly understood, despite its high prevalence in humans and other animals. In non-human primates, male masturbation is most frequent among anthropoid monkeys and apes living in multimale-multifemale groups with a promiscuous mating system. In these species, male masturbation may be a non-functional by-product of high sexual arousal or be adaptive by providing advantages in terms of sperm competition or by decreasing the risk of sexually transmitted infections. We investigated the possible functional significance of male masturbation using behavioral data collected on 21 free-ranging male rhesus macaques (Macaca mulatta) at the peak of the mating season. We found some evidence that masturbation is linked to low mating opportunities: regardless of rank, males were most likely to be observed masturbating on days in which they were not observed mating, and lower-ranking males mated less and tended to masturbate more frequently than higher-ranking males. These results echo the findings obtained for two other species of macaques, but contrast with those obtained in red colobus monkeys (Procolobus badius) and Cape ground squirrels (Xerus inauris). Interestingly, however, male masturbation events ended with ejaculation in only 15% of the observed masturbation time, suggesting that new hypotheses are needed to explain masturbation in this species. More studies are needed to establish whether male masturbation is adaptive and whether it serves similar or different functions in different sexually promiscuous species.
Keywords: Masturbation, auto-erotism, male-male competition, sexually-transmitted disease, sexual arousal, mating success, dominance rank, rhesus macaques
INTRODUCTION
Masturbation, or self-manipulation of the genitalia, is part of the natural behavioral repertoire of many animal species (reviewed in Bagemihl 1999; Thomsen et al. 2003; Dixson 2012), including humans (Laqueur 2003; Dixson 2012), but whether this behavior has an adaptive function is still poorly understood. Although comparative behavioral data on masturbation could help us understand the adaptive function and evolution of this behavior, very few data are available to date.
Different hypotheses have been proposed to explain the function of male masturbation (reviewed by Waterman 2010; see also Dixson 2012). First, the “sexual-outlet” hypothesis proposes that masturbation is a non-adaptive by-product of sexual arousal and serves as an alternative outlet to copulation (Kinsey et al. 1948; Dixson & Anderson 2004; Dixson 2012). This by-product hypothesis implies that males do not gain any fitness benefits, in terms of their health or survival, or increased mating or reproductive success, from masturbation. Second, the “ejaculate-quality-improvement” hypothesis posits that masturbation is an adaptive behavior that serves to eliminate degraded gametes or avoid polyzoospermy in order to increase overall ejaculate quality, thus increasing the probability of impregnation when males copulate with a fertile female (Zimmerman et al. 1965; Baker & Bellis 1993, 1995; Thomsen et al. 2003; Thomsen & Soltis 2004). While suggesting very different functions of masturbation, these two hypotheses both predict that masturbation should be more frequent among males that have little or no opportunity to mate, and/or occur in periods of infrequent mating (Thomsen et al. 2003; Thomsen & Soltis 2004; Dixson & Anderson 2004; Waterman 2010; Dixson 2012). In addition, the ejaculate-quality-improvement hypothesis predicts that males who have infrequent access to females but masturbate frequently should have higher sperm quality and a higher probability of impregnation than males who masturbate less frequently, other things being equal. Finally, according to the “STI-reduction” hypothesis, masturbation serves to cleanse the male reproductive tract and thus decrease the risk of contracting sexually transmitted infections (STIs) (Waterman 2010). This hypothesis was developed more recently to explain the behavioral pattern observed in the highly promiscuous Cape ground squirrel (Xerus inauris; Waterman 2010). Contrary to the other two hypotheses, the STI-reduction hypothesis predicts that male masturbation should be more prevalent in periods of high sexual activity, be performed by males who mate successfully, and occur shortly after copulation (Waterman 2010). Moreover, it predicts that males who masturbate frequently in these circumstances should be less likely to contract sexually transmitted diseases than males who do not masturbate or do so less frequently, and should thus be in overall better health. It should be noted that all three hypotheses assume that male masturbation typically leads to ejaculation.
Among nonhuman primates, the occurrence of male masturbation has been documented at the qualitative level for 30 species of Old World monkeys and apes, whereas it is rare or even absent in New World monkeys and prosimians (reviewed in Thomsen et al. 2003; Dixson 2012). Male masturbation is most frequent in anthropoid primates that live in multimale-multifemale groups (Thomsen et al. 2003) and have large testis volume relative to their body size (Dixson & Anderson 2004). While this observation has been interpreted as suggesting that male masturbation is functionally linked to sperm competition (ejaculate-quality-improvement hypothesis; Thomsen et al. 2003), it is also consistent with the sexual-outlet hypothesis because in sexually promiscuous species “males possess neuroendocrine specializations for greater sexual arousal and performance” (Dixson & Anderson 2004, p. 366; see also Dixson 2012, p. 192). Such a pattern could also be explained by the STI-reduction hypothesis because sexually transmitted infections are more likely to spread in species with promiscuous mating systems (Waterman 2010).
While no primate studies to date have directly investigated the potential fitness benefits of male masturbation, a handful of studies have investigated the functional hypotheses indirectly, by testing their predictions concerning the frequency of masturbation and its potential association with rank and mating activity within species. In free-ranging red colobus monkeys (Procolobus badius), masturbation was performed very rarely (5 instances in 8,950 h of observation collected over 5 years), mainly by alpha males and specifically during intergroup encounters (i.e. when rivals are present) taking place when some females were sexually active, with no copulations reported for either resident or extra-group males (Starin 2004). Male masturbation was much more frequent in two macaque species, in which hundreds of instances were observed over less than 1,000 hours of observation collected over 1–2 years (Nieuwenhuijsen et al. 1987; Thomsen & Soltis 2004). In free-ranging Japanese macaques (Macaca fuscata), masturbation is more frequent in males of lower mating success and lower dominance rank (Thomsen & Soltis 2004; see also Inoue 2012). No such relation between dominance rank and masturbation frequency was found in captive group-living stump-tailed macaques (M. arctoides; Nieuwenhuijsen et al. 1987). A closer look at the latter study’s data, however, reveals an opposite distribution of mating and masturbation rates between the alpha male (409 copulations vs. 30 masturbation bouts) and the beta male (30 copulations vs. 543 masturbation bouts), which suggests a relation between rank, mating, and masturbation similar to that reported in Japanese macaques (cf. Table 3 in Nieuwenhuijsen et al. 1987). Overall, the results obtained for macaques seem more consistent with the ejaculate-quality-improvement and sexual-outlet hypotheses than with the STI-reduction hypothesis.
In the present study, we examined whether and how access to fertile females influences masturbation rate in free-ranging male rhesus macaques on Cayo Santiago, Puerto Rico. Rhesus macaques are seasonal breeders, and on Cayo Santiago they live in unusually large troops (50–300 individuals). In this rhesus population, male mating and reproductive success are linked to dominance rank, although not strongly (e.g. Berard et al. 1994; Dubuc et al. 2011). High-ranking males form extended consortships with estrous females characterized by frequent copulations and ejaculations, while lower-ranking males mate less frequently and mainly through sneak copulations and short-term associations (e.g. Carpenter 1942; Altmann 1962; Chapais 1983; Berard et al. 1994; Higham et al. 2011). However, middle- and low-ranking males can still enjoy relatively high reproductive success (e.g. Berard et al. 1994; Dubuc et al. 2011) because high-ranking males are generally unsuccessful at mate-guarding females over the entire course of their fertile phase (Dubuc et al. 2012), thus making it possible for other males to fertilize females through sneak copulations and sperm competition (see also Bercovitch 1992). While male masturbation has long been documented in this species (e.g. Carpenter 1942; Phoenix & Jenson 1973), little is known about the relationship between masturbation and mating activity. Work on captive rhesus macaques has shown that male masturbation takes place even without any sensory contact with females (e.g. Phoenix & Jenson 1973) and is eliminated by castration (Phoenix & Jenson 1973; Slimp et al. 1978; Loy et al. 1984), but not by brain lesions that eliminate sexual interactions with females (Slimp et al. 1978).
Here, we explored the possible functional significance of male masturbation by examining the correlation between masturbation frequency and male dominance rank, and by investigating how mating activity influences masturbation behavior in two ways: we tested (i) whether there was a correlation between masturbation rate and overall mating frequency, and (ii) whether males were more likely to masturbate on days in which they were seen mating. Based on previous findings in macaques, we predicted that the pattern of male masturbation would be more consistent with the sexual-outlet and ejaculate-quality-improvement hypotheses than with the STI-reduction hypothesis. Specifically, we predicted that (1) low-ranking males and/or the least successful males of a social group should be more likely to be observed masturbating, and (2) males should be more likely to masturbate on days in which they do not mate. In addition, we expected (3) masturbation to lead to ejaculation.
DISCUSSION
[...]
In our study, masturbation that did not lead to ejaculation took place in two main contexts (25% of all masturbation bouts each): (1) males manipulated their penis only once, during a period in which they emitted a large number of self-directed behaviors; and (2) males stopped masturbating and started interacting with females in a sexual context, a third of which led to mating (Fig. 3). In the remaining cases, the male changed activity or simply stopped with no obvious change of activity. Based on these observations, we propose two hypotheses to explain male masturbation in rhesus macaques. First, masturbation may be a form of self-directed behavior emitted in contexts of intense anxiety (Maestripieri et al. 1992), which may or may not be created by the sexual context itself (‘masturbation-as-SDB’ hypothesis). Masturbation could be more frequent among low-ranking males if their position creates more emotional stress. Alternatively, male masturbation could be aimed at maintaining a high level of sexual arousal in order to decrease the length of the next mount series and increase the probability of ejaculating through mating (‘sexual-arousal’ hypothesis). In rhesus macaques, mount series can last from 1 to 56 minutes, and long series are more likely to be interrupted by higher-ranking males (Manson 1996). Such a strategy would be more common among non-dominant males, which have low access to females and mate mainly through short-term associations and sneak copulations.
An unequivocal rejection of the null hypothesis that male masturbation is a non-functional by-product of frustrated sexual arousal would require evidence that inter-individual variation in masturbation behavior is associated with variation in male health, emotional stress, ejaculate quality, and/or fertilization success. Inoue (2012) found no correlation between masturbation rate and reproductive success, but because mating rate and dominance rank were not taken into account, this finding provides little insight into whether males that masturbate produce more offspring than predicted by their mating rate. Some insights into the function of masturbation could also be provided by comparing closely-related primate species that live in multi-male multi-female groups but differ in the extent to which the alpha male effectively monopolizes access to fertile females (and, in turn, in the intensity of sperm competition) or in their mating pattern (i.e. multiple-mounters vs. single-mounters). Comparing the prevalence of male masturbation, the frequency at which it leads to ejaculation, and the contexts in which it takes place across these species could shed some light on whether maintaining a steady supply of high-quality sperm through frequent masturbation is needed to take full advantage of rare opportunities for copulation that become available to individuals who are otherwise consistently prevented from copulating.
Thursday, October 17, 2019
In the nations with the most abject poverty, we observed substantial (~30–40%) genetic influence on cognitive abilities; shared environmental influences were similar to those found in adolescents growing up in affluent countries
Genetic and Environmental Influences on Cognitive Abilities in Extreme Poverty. Yoon-Mi Hur and Timothy Bates. Twin Research and Human Genetics, October 17 2019. https://doi.org/10.1017/thg.2019.92
Abstract: To improve global human capital, an understanding of the interplay of endowment across the full range of socioeconomic status (SES) is needed. Relevant data, however, are absent in the nations with the most abject poverty (Tucker-Drob & Bates, 2016), where the lowest heritability and strong effects of SES are predicted. Here we report the first study of biopsychosocial gene–environment interaction in extreme poverty. In a sub-Saharan sample of early teenage twins (N = 3192), we observed substantial (~30–40%) genetic influence on cognitive abilities. Surprisingly, shared environmental influences were similar to those found in adolescents growing up in Western affluent countries (25–28%). G × SES moderation was estimated at a′ = .06 (p = .355). Family chaos did not moderate genetic effects but did moderate shared environment influence. Heritability of cognitive abilities in extreme poverty appears comparable to Western data. Reduced family chaos may be a modifiable factor promoting cognitive development.
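For readers unfamiliar with twin designs, the sketch below shows the classical ACE variance decomposition and the Falconer approximations from which rough heritability and shared-environment estimates can be read off MZ and DZ twin correlations. This is illustrative background only, under standard twin-model assumptions; the authors' actual analysis is a G × SES moderation model, not these simple formulas.

```latex
% Classical twin (ACE) decomposition -- illustrative background, not the paper's model
V_P = a^2 + c^2 + e^2, \qquad
r_{MZ} \approx a^2 + c^2, \qquad
r_{DZ} \approx \tfrac{1}{2}a^2 + c^2
% Falconer approximations for heritability, shared and nonshared environment
h^2 \approx 2\,(r_{MZ} - r_{DZ}), \qquad
c^2 \approx 2\,r_{DZ} - r_{MZ}, \qquad
e^2 \approx 1 - r_{MZ}
```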
From 2016... Participants were reminded of death (vs. control) and evaluated new, 20‐, or 100‐year‐old objects; death reminders resulted in greater valuation of older objects
From 2016... When existence is not futile: The influence of mortality salience on the longer‐is‐better effect. Simon McCabe, Melissa R. Spina, Jamie Arndt. British Journal of Social Psychology, April 4 2016. https://doi.org/10.1111/bjso.12143
Abstract: This research examines how death reminders impact the valuation of objects of various ages. Building from the existence bias, the longer‐is‐better effect posits that which exists is good and that which has existed for longer is better. Integrating terror management theory, it was reasoned that mortality reminders fostering a motivation to at least symbolically transcend death would lead participants to evaluate older objects more positively as they signal robustness of existence. Participants were reminded of death (vs. control) and evaluated new, 20‐, or 100‐year‐old objects. Results indicated that death reminders resulted in greater valuation of older objects. Findings are discussed with implications for terror management theory, the longer‐is‐better effect, ageism, materialism, and consumer behaviour.
People’s tendency to deem bearers of bad news as unlikeable stems in part from their desire to make sense of chance processes; dislike is mitigated when messenger’s motives are benevolent
John, L. K., Blunden, H., & Liu, H. (2019). Shooting the messenger. Journal of Experimental Psychology: General, 148(4), 644-666. Oct 2019. http://dx.doi.org/10.1037/xge0000586
Abstract: Eleven experiments provide evidence that people have a tendency to “shoot the messenger,” deeming innocent bearers of bad news unlikeable. In a preregistered lab experiment, participants rated messengers who delivered bad news from a random drawing as relatively unlikeable (Study 1). A second set of studies points to the specificity of the effect: Study 2A shows that it is unique to the (innocent) messenger, and not mere bystanders. Study 2B shows that it is distinct from merely receiving information with which one disagrees. We suggest that people’s tendency to deem bearers of bad news as unlikeable stems in part from their desire to make sense of chance processes. Consistent with this account, receiving bad news activates the desire to sense-make (Study 3A), and in turn, activating this desire enhances the tendency to dislike bearers of bad news (Study 3B). Next, stemming from the idea that unexpected outcomes heighten the desire to sense-make, Study 4 shows that when bad news is unexpected, messenger dislike is pronounced. Finally, consistent with the notion that people fulfill the desire to sense-make by attributing agency to entities adjacent to chance events, messenger dislike is correlated with the erroneous belief that the messenger had malevolent motives (Studies 5A, 5B, and 5C). Studies 6A and 6B go further, manipulating messenger motives independently from news valence to suggest their causal role in our process account: the tendency to dislike bearers of bad news is mitigated when recipients are made aware of the benevolence of the messenger’s motives.
There is no indication for potential devastating effects of social media on school achievement; social media use and school grades are unrelated for adolescents
Are Social Media Ruining Our Lives? A Review of Meta-Analytic Evidence. Markus Appel, Caroline Marker, Timo Gnambs. Review of General Psychology, October 16, 2019. https://doi.org/10.1177/1089268019880891
Abstract: A growing number of studies have examined the psychological corollaries of using social networking sites (SNSs) such as Facebook, Instagram, or Twitter (often called social media). The interdisciplinary research area and conflicting evidence from primary studies complicate the assessment of current scholarly knowledge in this field of high public attention. We review meta-analytic evidence on three hotly debated topics regarding the effects of SNSs: well-being, academic achievement, and narcissism. Meta-analyses from different laboratories draw a rather equivocal picture. They show small associations in the r = .10 range between the intensity of SNS use and loneliness, self-esteem, life satisfaction, or self-reported depression, and somewhat stronger links to a thin body ideal and higher social capital. There is no indication for potential devastating effects of social media on school achievement; social media use and school grades are unrelated for adolescents. The meta-analyses revealed small to moderate associations between narcissism and SNS use. In sum, meta-analytic evidence is not in support of dramatic claims relating social media use to mischief.
Keywords: social media, meta-analysis, narcissism, achievement, well-being
Enriched Environment Exposure Accelerates Rodent Driving Skills
Enriched Environment Exposure Accelerates Rodent Driving Skills. L. E. Crawford et al. Behavioural Brain Research, October 16 2019, 112309. https://doi.org/10.1016/j.bbr.2019.112309
Highlights
• Rats can learn the complex task of navigating a car to a desired goal area.
• Enriched environments enhance competency in a rodent driving task.
• Driving rats maintained an interest in the car through extinction.
• Tasks incorporating complex skill mastery are important for translational research.
ABSTRACT: Although rarely used, long-term behavioral training protocols provide opportunities to shape complex skills in rodent laboratory investigations that incorporate cognitive, motor, visuospatial and temporal functions to achieve desired goals. In the current study, following preliminary research establishing that rats could be taught to drive a rodent operated vehicle (ROV) in a forward direction, as well as steer in more complex navigational patterns, male rats housed in an enriched environment were exposed to the rodent driving regime. Compared to standard-housed rats, enriched-housed rats demonstrated more robust learning in driving performance and their interest in the ROV persisted through extinction trials. Dehydroepiandrosterone/corticosterone (DHEA/CORT) metabolite ratios in fecal samples increased in accordance with training in all animals, suggesting that driving training, regardless of housing group, enhanced markers of emotional resilience. These results confirm the importance of enriched environments in preparing animals to engage in complex behavioral tasks. Further, behavioral models that include trained motor skills enable researchers to assess subtle alterations in motivation and behavioral response patterns that are relevant for translational research related to neurodegenerative disease and psychiatric illness.
Public discourse is often caustic and conflict-filled; the present work sought to examine a potentially novel explanatory mechanism defined in philosophical literature: Moral Grandstanding
Grubbs JB, Warmke B, Tosi J, James AS, Campbell WK (2019) Moral grandstanding in public discourse: Status-seeking motives as a potential explanatory mechanism in predicting conflict. PLoS ONE 14(10): e0223749, October 16, 2019. https://doi.org/10.1371/journal.pone.0223749
Abstract: Public discourse is often caustic and conflict-filled. This trend seems to be particularly evident when the content of such discourse is around moral issues (broadly defined) and when the discourse occurs on social media. Several explanatory mechanisms for such conflict have been explored in recent psychological and social-science literatures. The present work sought to examine a potentially novel explanatory mechanism defined in philosophical literature: Moral Grandstanding. According to philosophical accounts, Moral Grandstanding is the use of moral talk to seek social status. For the present work, we conducted six studies, using two undergraduate samples (Study 1, N = 361; Study 2, N = 356); a sample matched to U.S. norms for age, gender, race, income, and Census region (Study 3, N = 1,063); a YouGov sample matched to U.S. demographic norms (Study 4, N = 2,000); a brief, one-month longitudinal study of Mechanical Turk workers in the U.S. (Study 5, Baseline N = 499, follow-up n = 296); and a large, one-week YouGov sample matched to U.S. demographic norms (Baseline N = 2,519, follow-up n = 1,776). Across studies, we found initial support for the validity of Moral Grandstanding as a construct. Specifically, moral grandstanding motivation was associated with status-seeking personality traits, as well as greater political and moral conflict in daily life.
Amateurs pay trainers $1,000 to $10,000 to enter marathons with them; ‘like a little concierge service’
Want to Win That Race? Hire a Coach to Stay Alongside You and Carry Your Phone. Hilary Potkewitz. The Wall Street Journal, October 17, 2019. https://www.wsj.com/articles/these-coaches-do-everything-but-carry-clients-over-the-finish-line-11571148264
Amateurs pay trainers to enter marathons with them; ‘like a little concierge service’
Diane Reynolds had been racing for a few months when she won her first amateur cycling event, the Farm to Fork Fondo near upstate New York’s Finger Lakes in August. She left more than 500 riders in the dust, including all the men.
A little help
The win earned the 49-year-old novice a jersey decorated with polka-dot chickens, but it didn’t come cheap: She paid about $1,000 for former pro cyclist Hunter Allen to ride all 84 miles with her as a private coach.
Mr. Allen, 50, gave her real-time pointers on pacing, technical skills and race strategy. He also ran interference for her. “Early on, there were about 10 guys riding hard taking turns up front—I was one of them—and I knew we were going to break away from the peloton,” or main group of riders, he says. “I made sure Diane stayed with us, sheltered in the middle and conserved her energy as we widened the gap.”
Dr. Reynolds, an anesthesiologist in Knoxville, Tenn., ended up setting a record as the only female overall winner in 27 Farm to Fork Fondo events, according to organizers. “It wasn’t like I qualified for the Olympics, but I hit my personal goal and that was a great feeling,” she says.
While coaches have long attended amateur races to support their clients from the sidelines, more are joining them on the starting line as pacers-for-hire. Instead of competing outright, they agree to race alongside their client as an ally. That means registering, putting on a bib and finishing with an official time—often torpedoing their own race to help a client achieve a goal.
“I haven’t run a marathon for myself since 2010,” says New York-based running coach John Honerkamp, who is training for November’s New York City Marathon.
This will be his ninth year of shepherding celebrity clients through the finish line. Not that he’s complaining: In 2017, he helped supermodel Karlie Kloss cross the tape in 4 hours, 41 minutes and 49 seconds. In 2014 he paced tennis pro Caroline Wozniacki to a sterling time of 3:26:33. The last time he marathoned alone he finished in 2:44:22.
When he’s working a race, he’ll carry his client’s energy gels and cellphone, zip ahead to grab water or Gatorade, and block the wind to let them run in his draft. “I’m like a little concierge service,” Mr. Honerkamp says.
His fee starts at $5,000 and increases on a sliding scale based on time and effort involved. For a sub-three-hour finish, he charges about $10,000.
“I have to put a lot of work in to break three hours. I can’t just wing it,” the 44-year-old says. As a rule of thumb, he says he needs to train at a pace about 20 minutes faster than his client’s target. “The key is to be as relaxed as possible during the race so the person I’m pacing is comforted,” he says. “If I’m gasping for air, I can’t do that as well.”
Not all endurance sports take the same view of this bespoke training. Some see it as an unfair advantage to a few highly competitive amateurs who can afford it. Others are more blunt. “It’s cheating,” says Melissa Mantak, a triathlon coach in Denver. She was at a recent Ironman event in Boulder, Colo., coaching one of her athletes—from the sidelines, she says, where a coach belongs—when a nearby runner started to flag. The young woman’s pacer kept her in the race. “I could see him physically pushing her forward,” Ms. Mantak says. “I wish I’d taken a picture.”
The young woman finished second in her age group, nabbing a coveted entry to the Ironman World Championships in Kona, Hawaii. Ms. Mantak’s client finished third and missed the cut. “It was very upsetting,” Ms. Mantak says.
Ironman and internationally sanctioned triathlons forbid racing as a team, among other rules. “The race is meant to be as much of an individual effort as possible,” says Jimmy Riccitello, Ironman head referee.
The rules don’t address coach-client racing specifically, he says, and violations are hard to prove. If a racer competing for herself sees a client struggling midrace and gives encouragement or helps with a flat bike tire, that’s probably OK, he says. But if someone’s sole purpose for racing is to help a client qualify for Kona? “That goes against the spirit of triathlon,” Mr. Riccitello says.
Cycling is often a team sport, and a coach-client pairing is just another team, says Farm to Fork Fondo director Tyler Wren. He says the rides often draw cycling clubs and people with coaches training for other events.
This year’s Finger Lakes course had several flat sections ideal for race training, says Mr. Allen, the coach. When crosswinds picked up, he organized their group of 10 riders into a diagonal line called an echelon, a strategy to conserve energy more common in elite cycling.
“Hunter was doing his coaching thing, telling everyone how to ride in the wind and showing how an echelon worked,” Dr. Reynolds says. “They were all pretty psyched.”
Most marathons, including New York, have official pacers who run designated finish times for racers wondering about their own pace. Hiring a private pacer is just an iteration of that, says Rich Harshbarger, CEO of Running USA.
Mr. Honerkamp is training harder than usual for this year’s New York Marathon because he’s pacing chef Daniel Humm, a longtime client on a mission to break three hours. The pair teamed up for the 2018 race, running most of the way with retired competitive marathon star and mutual friend Meb Keflezighi and finishing with a time of about 3:10. That disappointed Mr. Humm.
“I think we could have broken three hours last year, but we were having fun with Meb and joking around a bit,” says Mr. Humm, 43. “I told John, ‘That won’t happen this year.’ ”
Long-married couples recall their wedding day: the influence of collaboration and gender on autobiographical memory recall
Long-married couples recall their wedding day: the influence of collaboration and gender on autobiographical memory recall. Azriel Grysman et al. Memory, Oct 15 2019. https://doi.org/10.1080/09658211.2019.1673428
ABSTRACT: The current study examined the influence of collaboration, expertise, and communication on autobiographical memory, by considering gender differences in recall and how they may influence the products and processes of remembering when male-female couples recall events together. Thirty-nine long-married, male-female couples recalled their memories of their wedding day. In Session 1, they recalled it individually for an experimenter. One week later, in Session 2, they recalled the same event jointly as a collaborative pair. Women reported more details, especially episodic details, than men across both sessions. Notably, collaborative recall included many new details that neither spouse had recalled individually. Exploratory analyses suggest that women were less influenced by collaboration than were men: women’s communication behaviours influenced men’s recall, but the reverse was not found for men’s communication. Additionally, when couples’ individual recall was more similar in content, men were more likely to decrease their contribution to the collaborative session. We consider these findings in light of transactive memory theory, in which joint meta-memory and the distribution of expertise influence the processes and products of recall in the interdependent system of a couple who extensively share their autobiographical memories.
KEYWORDS: Autobiographical memory, memory collaboration, gender, transactive memory
Wednesday, October 16, 2019
In a cohort study of 82 232 participants, personality traits in adolescence—a time when dementia pathology is unlikely to be present—were a factor associated with incident dementia almost 5 decades later
Association Between High School Personality Phenotype and Dementia 54 Years Later in Results From a National US Sample. Benjamin P. Chapman et al. JAMA Psychiatry, October 16, 2019. doi:10.1001/jamapsychiatry.2019.3120.
Key Points
Question Are maladaptive personality traits true risk factors for dementia or merely early expressions of underlying neuropathologic changes?
Findings In a cohort study of 82 232 participants, personality traits in adolescence—a time when dementia pathology is unlikely to be present—were a factor associated with incident dementia almost 5 decades later in a national US cohort. Calm and mature adolescents were less likely to develop dementia, and this risk reduction was significantly more pronounced with higher socioeconomic status.
Meaning This study’s findings suggest that maladaptive personality traits decades earlier may be independent risk factors for dementia by age 70 years.
Abstract
Importance Personality phenotype has been associated with subsequent dementia in studies of older adults. However, neuropathologic changes often precede cognitive symptoms by many years and may affect personality itself. Therefore, it is unclear whether supposed dementia-prone personality profiles (high neuroticism and low conscientiousness) are true risk factors or merely reflections of preexisting disease.
Objectives To examine whether personality during adolescence—a time when preclinical dementia pathology is unlikely to be present—confers risk of dementia in later life and to test whether associations could be accounted for by health factors in adolescence or differed across socioeconomic status (SES).
Design, Setting, and Participants Cohort study in the United States. Participants were members of Project Talent, a national sample of high school students in 1960. Individuals were identified who received a dementia-associated International Classification of Diseases, Ninth Revision (ICD-9) diagnosis code during any year between 2011 and 2013. The dates of our analysis were March 2018 to May 2019.
Exposures Ten personality traits were measured by the 150-item Project Talent Personality Inventory. Socioeconomic status was measured by a composite based on parental educational level, income, occupation, and property ownership. Participants were also surveyed on demographic factors and height and weight.
Main Outcomes and Measures Medicare records were collected, with dementia diagnoses in the period of 2011 to 2013 classified according to the US Centers for Medicare & Medicaid Services ICD-9–based algorithm. Cox proportional hazards regression models estimated the relative risk of dementia based on the 10 personality traits, testing interactions with SES and adjusting for demographic confounders.
Results The sample of 82 232 participants was 50.1% female, with a mean (SD) age of 15.8 (1.7) years at baseline and 69.5 (1.2) years at follow-up. Lower risk of dementia was associated with higher levels of vigor (hazard ratio for 1 SD, 0.93; 95% CI, 0.90-0.97; P < .001). Calm and maturity showed protective associations with later dementia that increased with SES. At 1 SD of SES, calm showed a hazard ratio of 0.89 (95% CI, 0.84-0.95; P < .001 for the interaction) and maturity showed a hazard ratio of 0.90 (95% CI, 0.85-0.96; P = .001 for the interaction).
Conclusions and Relevance This study’s findings suggest that the adolescent personality traits associated with later-life dementia are similar to those observed in studies of older persons. Moreover, the reduction in dementia risk associated with a calm and mature adolescent phenotype may be greater at higher levels of SES. Personality phenotype may be a true independent risk factor for dementia by age 70 years, preceding it by almost 5 decades and interacting with adolescent socioeconomic conditions.
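As a rough illustration of the kind of analysis described under Main Outcomes and Measures, the sketch below fits a Cox proportional hazards model with a trait × SES interaction using the lifelines library. All file and column names are hypothetical, and the authors' actual variables, covariates, and software almost certainly differ; this is a sketch of the general technique, not their pipeline.

```python
# Hedged sketch: Cox proportional hazards with a trait x SES interaction.
# File and column names are hypothetical illustrations only.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("cohort_example.csv")        # hypothetical input file
df["calm_x_ses"] = df["calm"] * df["ses"]     # interaction term: does SES moderate the trait effect?

cph = CoxPHFitter()
cph.fit(
    df[["follow_up_years", "dementia", "calm", "ses", "calm_x_ses", "age", "female"]],
    duration_col="follow_up_years",           # time to dementia diagnosis or censoring
    event_col="dementia",                     # 1 = dementia code observed, 0 = censored
)
cph.print_summary()                           # exp(coef) column gives hazard ratios
```

With standardized predictors, the exponentiated coefficients correspond to hazard ratios per 1 SD, analogous to the values reported in the Results section above.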
Wild bonobos consume meat at higher rates than previously thought; the individual controlling the carcass frequently resisted sharing, and aggressive attempts to take the carcass were observed
New Observations of Meat Eating and Sharing in Wild Bonobos (Pan paniscus) at Iyema, Lomako Forest Reserve, Democratic Republic of the Congo. Wakefield M.L., Hickmott A.J., Brand C.M., Takaoka I.Y., Meador L.M., Waller M.T., White F. Folia Primatol 2019;90:179–189. https://doi.org/10.1159/000496026
Abstract: Bonobos (Pan paniscus) consume a variety of vertebrates, although direct observations remain relatively rare compared to chimpanzees (Pan troglodytes). We report the first direct observations of meat eating and sharing among bonobos at Iyema, Lomako Forest, Democratic Republic of Congo. We collected meat consumption data ad libitum from June to November 2017 over 176.5 observation hours and conducted monthly censuses to measure the abundance of potential prey species. We observed 3 occasions of duiker consumption and found indirect evidence of meat consumption twice (n = 5). We identified the prey species as Weyn’s duiker (Cephalophus weynsi) in all 4 cases that we saw the carcass. This species was the most abundant duiker species at Iyema, but other potential prey species were also available. Meat sharing was observed or inferred during all 3 observations. However, the individual controlling the carcass frequently resisted sharing, and aggressive attempts to take the carcass were observed. This report contributes to a growing body of data suggesting that wild bonobos consume meat at higher rates than previously thought, female control of carcasses is frequent but not exclusive, and meat sharing in bonobos is primarily passive but not without aggression.
Keywords: Behavioral diversity, Faunivory, Food sharing, Prey preference, Female control
Check also Faux-nobo: “Naked Bonobo” demolishes myth of sexy, egalitarian bonobos. Edward Clint. Oct 9, 2017. https://www.bipartisanalliance.com/2017/10/naked-bonobo-demolishes-myth-of-sexy.html
And Bonobos Prefer Individuals that Hinder Others over Those that Help. Christopher Krupenye, Brian Hare. Current Biology, https://www.bipartisanalliance.com/2018/01/whereas-humans-already-prefer-helpers.html
Studies in Repetitive Transcranial Magnetic Stimulation: The somewhat high frequency of “positive” results seems spurious and may reflect bias; caution is warranted in accepting rTMS as an established treatment
Amad A, Jardri R, Rousseau C, Larochelle Y, Ioannidis JPA, Naudet F: Excess Significance Bias in Repetitive Transcranial Magnetic Stimulation Literature for Neuropsychiatric Disorders. Psychother Psychosom 2019. https://doi.org/10.1159/000502805
Abstract
Introduction: Repetitive transcranial magnetic stimulation (rTMS) has been widely tested and promoted for use in multiple neuropsychiatric conditions, but as for many other medical devices, some gaps may exist in the literature and the evidence base for the clinical efficacy of rTMS remains under debate.
Objective: We aimed to test for an excess number of statistically significant results in the literature on the therapeutic efficacy of rTMS across a wide range of meta-analyses and to characterize the power of studies included in these meta-analyses.
Methods: Based on power calculations, we computed the expected number of “positive” datasets for a medium effect size (standardized mean difference, SMD = 0.30) and compared it with the number of observed “positive” datasets. Sensitivity analyses considered small (SMD = 0.20), modest (SMD = 0.50), and large (SMD = 0.80) effect sizes.
Results: A total of 14 meta-analyses with 228 datasets (110 for neurological disorders and 118 for psychiatric disorders) were assessed. For SMD = 0.3, the number of observed “positive” studies (n = 94) was larger than expected (n = 35). We found evidence for an excess of significant findings overall (p < 0.0001) and in 8/14 meta-analyses. Evidence for an excess of significant findings was also observed for SMD = 0.5 for neurological disorders. Of the 228 datasets, 0 (0%), 0 (0%), 3 (1%), and 53 (23%) had a power >0.80, respectively, for SMDs of 0.30, 0.20, 0.50, and 0.80.
Conclusion: Most studies in the rTMS literature are underpowered. This results in fragmentation and waste of research efforts. The somewhat high frequency of “positive” results seems spurious and may reflect bias. Caution is warranted in accepting rTMS as an established treatment for neuropsychiatric conditions.
Keywords: Repetitive transcranial magnetic stimulation, Randomized controlled trial, Meta-analysis, Excess significance
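For readers who want to see how the Methods translate into a calculation, here is a minimal Python sketch of the excess-significance logic, using invented sample sizes and an invented observed count rather than the paper's 228 datasets: per-study power at SMD = 0.30 is summed to give the expected number of "positive" datasets, and the observed count is compared with that expectation through a simple binomial test (a simplification of the formal Ioannidis & Trikalinos procedure).

```python
# Sketch of an excess-significance check (hypothetical sample sizes, not the paper's data).
# Expected "positives" = sum of per-study power at an assumed effect size (SMD = 0.30);
# the observed count is then compared with that expectation via a binomial test.
from statsmodels.stats.power import TTestIndPower
from scipy.stats import binomtest

datasets = [(20, 20), (15, 18), (30, 28), (12, 12), (25, 24)]  # (n_active, n_sham) per study
observed_positives = 3          # how many of these studies reported p < 0.05
assumed_smd = 0.30              # medium effect size used in the main analysis

power_calc = TTestIndPower()
powers = [power_calc.power(effect_size=assumed_smd, nobs1=n1, ratio=n2 / n1, alpha=0.05)
          for n1, n2 in datasets]
expected_positives = sum(powers)

# Binomial test: is the observed number of significant results larger than expected
# given the mean power of the studies?
mean_power = expected_positives / len(datasets)
test = binomtest(observed_positives, n=len(datasets), p=mean_power, alternative="greater")

print(f"Mean power at SMD={assumed_smd}: {mean_power:.2f}")
print(f"Expected positives: {expected_positives:.1f}, observed: {observed_positives}")
print(f"Excess-significance p-value: {test.pvalue:.3f}")
```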
The intensity of polarization on Twitter varies greatly from one country to another
Context matters: political polarization on Twitter from a comparative perspective. Aleksandra Urman. Media, Culture & Society, October 15, 2019. https://doi.org/10.1177/0163443719876541
Abstract: This article explores the issue of political polarization on social media. It shows that the intensity of polarization on Twitter varies greatly from one country to another. The analysis is performed using a network-analytic audience duplication approach and is based on the data about the followers of the political parties’ Twitter accounts in 16 democratic countries. Based on the topology of the audience duplication graphs, the political Twitterspheres of the countries are classified as perfectly integrated, integrated, mixed, polarized and perfectly polarized. Explorative analysis shows that polarization is the highest in two-party systems with plurality electoral rules and the lowest in multi-party systems with proportional voting. The findings help explain the discrepancies in the results of previous studies into polarization on social media. The results of the study indicate that extrapolation of the findings from single-case studies on the topic is impossible in most cases, suggesting that more comparative studies on the matter are necessary to better understand the subject and get generalizable results.
Keywords audience duplication, comparative analysis, network analysis, political polarization, social media, Twitter
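As an illustration of the audience-duplication idea (not the paper's data or code), the sketch below builds a weighted graph whose nodes are hypothetical party accounts and whose edges carry the overlap between their follower sets; a fragmented graph would correspond to the polarized end of the classification, a densely connected one to the integrated end.

```python
# Minimal sketch of an audience-duplication graph (party names and follower IDs are invented).
# Nodes are party accounts; edge weights are the overlap between their follower sets.
import itertools
import networkx as nx

followers = {
    "party_A": {1, 2, 3, 4, 5, 6},
    "party_B": {4, 5, 6, 7, 8},
    "party_C": {9, 10, 11},
}

G = nx.Graph()
G.add_nodes_from(followers)
for a, b in itertools.combinations(followers, 2):
    shared = followers[a] & followers[b]
    if shared:
        # Jaccard overlap as a simple duplication measure
        weight = len(shared) / len(followers[a] | followers[b])
        G.add_edge(a, b, weight=weight)

# A fragmented graph (few or no cross-camp edges) would suggest a polarized Twittersphere;
# a densely connected one would suggest an integrated audience.
print(nx.number_connected_components(G), "component(s)")
print(sorted(G.edges(data="weight")))
```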
Beliefs About Human Intelligence in a Sample of Teachers and Non-teachers: There are conflicts between currently accepted intelligence theory and the subjects' beliefs
Warne, Russell T., and Jared Z. Burton. 2019. “Beliefs About Human Intelligence in a Sample of Teachers and Non-teachers.” PsyArXiv. July 17. doi:10.31234/osf.io/uctxp
Abstract: Research in educational psychology consistently finds a relationship between intelligence and academic performance. However, in recent decades, educational fields, including gifted education, have resisted intelligence research, and there are some experts who argue that intelligence tests should not be used in identifying giftedness. Hoping to better understand this resistance to intelligence research, we created a survey of beliefs about intelligence and administered it online to a sample of the general public and a sample of teachers. We found that there are conflicts between currently accepted intelligence theory and beliefs from the American public and teachers, which has important consequences for gifted education, educational policy, and the effectiveness of interventions.
Empathic contagious pain and consolation in laboratory rodents: Rodents also have the ability to feel, recognize, understand and share the other’s distressing states, and allolick injuries and allogroom bodies
Empathic contagious pain and consolation in laboratory rodents: species and sex comparisons. Rui Du et al. bioRxiv, October 15, 2019. https://doi.org/10.1101/745299
Abstract: Laboratory rodents are gregarious in nature and show empathy when witnessing a familiar conspecific in pain. The rodent observers express two levels of empathic responses: observational contagious pain (OCP) and consolation. Here we examined sex and species differences in OCP and consolation in male and female mice and rats. We observed no species difference in either OCP or consolation, but significant species differences in general social (allo-mouth and/or allo-tail sniffing) and non-social (self-grooming) behaviors. As for sex differences, male mouse observers showed more allolicking and allogrooming behaviors toward a familiar conspecific in pain during the PDSI, and a longer-lasting increase in pain sensitivity afterward, than female mouse observers. However, no sex difference was observed in rats. Our results highlight an evolutionary view of empathy in which social animals, including rodents, have the ability to feel, recognize, understand and share another's distressing states.
Online Vigilance and Affective Well-being in Everyday Life: Thinking about smartphone-mediated social interactions (i.e., the salience dimension of online vigilance) was negatively related to affective well-being
Johannes, Niklas, Adrian Meier, Leonard Reinecke, Saara Ehlert, Dinda N. Setiawan, Nicole Walasek, Tobias Dienlin, et al. 2019. “The Relationship Between Online Vigilance and Affective Well-being in Everyday Life: Combining Smartphone Logging with Experience Sampling.” PsyArXiv. October 15. doi:10.31234/osf.io/t3wc2
Abstract: Through communication technology, users find themselves constantly connected to others to such an extent that they routinely develop a mindset of connectedness. This mindset has been defined as online vigilance. Although there is a large body of research on media use and well-being, the question of how online vigilance impacts well-being remains unanswered. In this preregistered study, we combine experience sampling and smartphone logging to address the relation of online vigilance and affective well-being in everyday life. Seventy-five Android users answered eight daily surveys over five days (N = 1615) whilst having their smartphone use logged. Thinking about smartphone-mediated social interactions (i.e., the salience dimension of online vigilance) was negatively related to affective well-being. However, it was far more important whether those thoughts were positive or negative. No other dimension of online vigilance was robustly related to affective well-being. Taken together, our results suggest that online vigilance does not pose a serious threat to affective well-being in everyday life.
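A hypothetical sketch of how momentary data of this kind are typically analyzed (the variable names and data below are simulated, not the study's): experience-sampling observations nested within participants, with affective well-being regressed on the salience and valence of thoughts about smartphone-mediated interactions, using a random intercept per participant.

```python
# Hypothetical sketch: experience-sampling observations nested within participants,
# with momentary affective well-being regressed on the salience and valence of
# thoughts about smartphone-mediated interactions (all data are simulated).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_participants, n_prompts = 75, 8 * 5   # roughly the study's design: 8 prompts/day, 5 days
pid = np.repeat(np.arange(n_participants), n_prompts)
salience = rng.normal(3, 1, pid.size)   # how much mediated interactions are on one's mind
valence = rng.normal(3, 1, pid.size)    # how positive those thoughts are
person_effect = rng.normal(0, 0.5, n_participants)[pid]
affect = 4 - 0.1 * salience + 0.3 * valence + person_effect + rng.normal(0, 0.5, pid.size)

df = pd.DataFrame({"pid": pid, "affect": affect, "salience": salience, "valence": valence})

# Random intercept per participant accounts for the repeated measures.
model = smf.mixedlm("affect ~ salience + valence", data=df, groups=df["pid"])
print(model.fit().summary())
```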
The pay premium for high‐potential women: They don't report higher levels of pay satisfaction, suggesting that high‐potential women did not perceive their pay premium to be an inequitable advantage
The pay premium for high‐potential women: A constructive replication and refinement. George F. Dreher, Nancy M. Carter, Terry Dworkin. Personnel Psychology, September 10 2019. https://doi.org/10.1111/peps.12357
Abstract: In this constructive replication, we revisit a provocative study by Leslie, Manchester, and Dahm (2017). They found that gender and being designated a high‐potential employee interacted in accounting for pay and that this resulted in a reversal in the commonly observed gender pay gap favoring men. Our primary aim was to examine important boundary conditions associated with their work by (a) conducting a study using a sample that would better generalize across industries and to individuals who aspire to reach senior management, (b) adding critical control variables to the statistical models used in the pay equation, and (c) introducing a different conceptualization of the high‐potential construct. Also, to better understand the consequences of their study, we considered an additional dependent variable that addressed pay satisfaction. Even after making these model additions, the gender by high‐potential interaction term was significant—ruling out four plausible third‐variable explanations for the Leslie et al. finding. Moreover, these confirming results were observed using a sample that represented individuals employed in a wide range of industries, who had the educational backgrounds, career histories, and motivational states typically required of candidates competing for senior executive roles. Furthermore, high‐potential women did not report higher levels of pay satisfaction, suggesting that high‐potential women did not perceive their pay premium to be an inequitable advantage and that there may be limited positive return associated with using a pay premium to retain high‐potential talent.
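To make the headline analysis concrete, here is a hypothetical sketch of the core regression (the variables, control, and data are simulated, not the study's): pay regressed on gender, high-potential designation, and their interaction, where a positive interaction coefficient would correspond to the pay premium for high-potential women.

```python
# Hypothetical sketch of the core model: pay regressed on gender, high-potential status,
# and their interaction, plus a control variable (all data are simulated, not the study's).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 500
female = rng.integers(0, 2, n)
high_potential = rng.integers(0, 2, n)
tenure_years = rng.integers(1, 20, n)
# Simulated pay gap favoring men that reverses for high-potential women
log_pay = (11.0 - 0.05 * female + 0.10 * high_potential
           + 0.08 * female * high_potential + 0.01 * tenure_years
           + rng.normal(0, 0.1, n))

df = pd.DataFrame({"log_pay": log_pay, "female": female,
                   "high_potential": high_potential, "tenure_years": tenure_years})

# The female:high_potential coefficient is the term of interest.
model = smf.ols("log_pay ~ female * high_potential + tenure_years", data=df).fit()
print(model.summary())
```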
Tuesday, October 15, 2019
Guppies: Brain size affects responsiveness in mating behavior to variation in predation pressure and sex‐ratio
Brain size affects responsiveness in mating behavior to variation in predation pressure and sex‐ratio. Alberto Corral‐López, Maksym Romensky, Alexander Kotrschal, Severine D. Buechel, Niclas Kolm. Journal of Evolutionary Biology, October 14 2019. https://doi.org/10.1111/jeb.13556
Abstract: Despite ongoing advances in sexual selection theory, the evolution of mating decisions remains enigmatic. Cognitive processes often require simultaneous processing of multiple sources of information from environmental and social cues. However, little experimental data exist on how cognitive ability affects such fitness‐associated aspects of behavior. Using advanced tracking techniques, we studied mating behaviors of guppies artificially selected for divergence in relative brain size, with known differences in cognitive ability, when predation threat and sex‐ratio were varied. In females, we found a general increase in copulation behavior when the sex‐ratio was female‐biased, but only large‐brained females responded with greater willingness to copulate under a low predation threat. In males, we found that small‐brained individuals courted more intensively and displayed more aggressive behaviors than large‐brained individuals. However, there were no differences in female response to males with different brain size. These results provide further evidence of a role for female brain size in optimal decision‐making in a mating context. In addition, our results indicate that brain size may affect mating display skill in male guppies. We suggest that it is important to consider the association between brain size, cognitive ability and sexual behavior when studying how morphological and behavioral traits evolve in wild populations.
7 types of sugaring: Sugar prostitution, compensated dating, compensated companionship, sugar dating, sugar friendships, sugar friendships with benefits, & pragmatic love
“It’s Its Own Thing”: A Typology of Interpersonal Sugar Relationship Scripts. Maren T. Scull. Sociological Perspectives, September 16, 2019. https://doi.org/10.1177/0731121419875115
Abstract: Although academics have focused on sugaring in various parts of the globe, sugar relationships in the United States have largely been ignored. The few studies that address these arrangements in the United States often frame them as a form of prostitution. Drawing from 48 in-depth interviews with women in the United States who have been in sugar relationships, I adopt a connected lives approach to explore the structure of these arrangements and to assess the extent to which they are a form of prostitution. Overall, I found that, although there is a dominant, subcultural relationship script that serves as a blueprint for sugar arrangements, they comprise their own unique relational package and take a variety of forms when enacted on an interpersonal level. Specifically, I identified seven types of sugar relationships, only one of which can be considered prostitution. These included sugar prostitution, compensated dating, compensated companionship, sugar dating, sugar friendships, sugar friendships with benefits, and pragmatic love.
Keywords script theory, prostitution, transactional sex, qualitative methods
Popular version: The 7 types of sugar daddy relationships. Sarah Erickson. Univ of Colorado at Denver, Oct 15 2019. https://www.eurekalert.org/pub_releases/2019-10/uocd-t7t101219.php
Red-colored male sticklebacks carry more oxidative DNA damage in muscle, testis & sperm in the peak breeding season, but the females find them more attractive
Attractive male sticklebacks carry more oxidative DNA damage in the soma and germline. Sin‐Yeon Kim, Alberto Velando. Journal of Evolutionary Biology, October 14 2019. https://doi.org/10.1111/jeb.13552
Abstract: Trade‐offs between the expression of sexual signals and the maintenance of somatic and germline tissues are expected when these depend upon the same resources. Despite the importance of sperm DNA integrity, its trade‐off with sexual signalling has rarely been explored. We experimentally tested the trade‐off between carotenoid‐based sexual colouration and oxidative DNA damage in skeletal muscle, testis and sperm by manipulating reproductive schedule (early vs. late onset of breeding) in male three‐spined sticklebacks. Oxidative DNA damage was measured as the amount of 8‐hydroxy‐2‐deoxy‐Guanosine in genomic DNA. Irrespective of the experimentally manipulated reproductive schedule, individuals investing more in red colouration showed higher levels of oxidative DNA damage in muscle, testis and sperm during the peak breeding season. Our results show that the expression of red colouration traded off against the level of oxidative DNA damage possibly due to the competing functions of carotenoids as colorants and antioxidants. Thus, female sticklebacks may risk fertility and viability of offspring by choosing redder, more deteriorated partners with decreased sperm DNA integrity. The evolution of sexual signal may be constrained by oxidative DNA damage in the soma and germline.
The Negative Intelligence–Religiosity Relation: New and Confirming Evidence
The Negative Intelligence–Religiosity Relation: New and Confirming Evidence. Miron Zuckerman et al. Personality and Social Psychology Bulletin, October 15, 2019. https://doi.org/10.1177/0146167219879122
Abstract: Zuckerman et al. (2013) conducted a meta-analysis of 63 studies that showed a negative intelligence–religiosity relation (IRR). As more studies have become available and because some of Zuckerman et al.’s (2013) conclusions have been challenged, we conducted a new meta-analysis with an updated data set of 83 studies. Confirming previous conclusions, the new analysis showed that the correlation between intelligence and religious beliefs in college and noncollege samples ranged from −.20 to −.23. There was no support for mediation of the IRR by education but there was support for partial mediation by analytic cognitive style. Thus, one possible interpretation for the IRR is that intelligent people are more likely to use analytic style (i.e., approach problems more rationally). An alternative (and less interesting) reason for the mediation is that tests of both intelligence and analytic style assess cognitive ability. Additional empirical and theoretical work is needed to resolve this issue.
Keywords intelligence, religiosity, meta-analysis, analytic thinking
Check also The Myth of the Stupid Believer: The Negative Religiousness–IQ Nexus is Not on General Intelligence (g) and is Likely a Product of the Relations Between IQ and Autism Spectrum. Edward Dutton et al. Journal of Religion and Health, Oct 5 2019. https://www.bipartisanalliance.com/2019/10/the-myth-of-stupid-believer-negative.html
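As a worked illustration of how study-level correlations such as the reported −.20 to −.23 are pooled (the values below are invented, not the meta-analytic data), here is a random-effects sketch using Fisher's z with a DerSimonian–Laird estimate of between-study variance.

```python
# Illustrative random-effects pooling of correlations via Fisher's z
# (study correlations and sample sizes are invented, not the meta-analysis data).
import numpy as np

r = np.array([-0.18, -0.25, -0.21, -0.15, -0.30])   # per-study intelligence-religiosity correlations
n = np.array([200, 150, 500, 120, 300])              # per-study sample sizes

z = np.arctanh(r)                 # Fisher z-transform
v = 1.0 / (n - 3)                 # sampling variance of z
w = 1.0 / v                       # fixed-effect (inverse-variance) weights

# DerSimonian-Laird estimate of between-study variance tau^2
z_fixed = np.sum(w * z) / np.sum(w)
Q = np.sum(w * (z - z_fixed) ** 2)
c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (Q - (len(r) - 1)) / c)

w_re = 1.0 / (v + tau2)           # random-effects weights
z_pooled = np.sum(w_re * z) / np.sum(w_re)
print("Pooled correlation:", np.tanh(z_pooled))
```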
The size of the stereotype threat effect that can be experienced on tests of cognitive ability in operational (real-world) scenarios such as college admissions tests and employment testing may range from negligible to small
Shewach, O. R., Sackett, P. R., & Quint, S. (2019). Stereotype threat effects in settings with features likely versus unlikely in operational test settings: A meta-analysis. Journal of Applied Psychology, Oct 2019. http://dx.doi.org/10.1037/apl0000420
Abstract: The stereotype threat literature primarily comprises lab studies, many of which involve features that would not be present in high-stakes testing settings. We meta-analyze the effect of stereotype threat on cognitive ability tests, focusing on both laboratory and operational studies with features likely to be present in high stakes settings. First, we examine the features of cognitive ability test metric, stereotype threat cue activation strength, and type of nonthreat control group, and conduct a focal analysis removing conditions that would not be present in high stakes settings. We also take into account a previously unrecognized methodological error in how data are analyzed in studies that control for scores on a prior cognitive ability test, which resulted in a biased estimate of stereotype threat. The focal sample, restricting the database to samples utilizing operational testing-relevant conditions, displayed a threat effect of d = −.14 (k = 45, N = 3,532, SDδ = .31). Second, we present a comprehensive meta-analysis of stereotype threat. Third, we examine a small subset of studies in operational test settings and studies utilizing motivational incentives, which yielded d-values ranging from .00 to −.14. Fourth, the meta-analytic database is subjected to tests of publication bias, finding nontrivial evidence for publication bias. Overall, results indicate that the size of the stereotype threat effect that can be experienced on tests of cognitive ability in operational scenarios such as college admissions tests and employment testing may range from negligible to small.
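For orientation, a small sketch of the standardized-mean-difference machinery behind estimates such as d = −.14 (all study values below are invented): compute Cohen's d and its sampling variance per threat-versus-control comparison, then pool by inverse-variance weighting.

```python
# Sketch: computing a standardized mean difference (Cohen's d) and its sampling variance
# for a threat vs. control comparison, then an inverse-variance pooled estimate.
# All numbers are invented for illustration.
import numpy as np

# (mean_threat, mean_control, pooled_sd, n_threat, n_control) per hypothetical study
studies = [
    (98.0, 100.0, 15.0, 60, 60),
    (99.5, 100.5, 14.0, 45, 50),
    (97.0, 100.0, 16.0, 30, 30),
]

ds, variances = [], []
for m1, m2, sd, n1, n2 in studies:
    d = (m1 - m2) / sd
    # large-sample variance of d (Hedges & Olkin)
    var_d = (n1 + n2) / (n1 * n2) + d ** 2 / (2 * (n1 + n2))
    ds.append(d)
    variances.append(var_d)

w = 1.0 / np.array(variances)
d_pooled = np.sum(w * np.array(ds)) / np.sum(w)
se_pooled = np.sqrt(1.0 / np.sum(w))
print(f"Pooled d = {d_pooled:.2f} (SE = {se_pooled:.2f})")
```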
Lifespan Cognitive Reserve—A Secret to Coping With Neurodegenerative Pathology
Lifespan Cognitive Reserve—A Secret to Coping With Neurodegenerative Pathology. Sylvia Villeneuve. JAMA Neurol. 2019;76(10):1145-1146. October 2019, doi:10.1001/jamaneurol.2019.2899
Given the limited success of therapeutic interventions for Alzheimer disease, there is increased interest in understanding whether modifiable factors can help cope with or postpone the appearance of brain pathology. It is estimated that about 35% of Alzheimer risk is modifiable.1,2 Epidemiologic studies have shown that lifetime exposures to higher education, higher occupational attainment, and cognitively stimulating activities are associated with reduced risk of Alzheimer dementia.3 Autopsy studies have shown interindividual differences in the amount of brain pathology people can tolerate before manifesting cognitive impairments, and autopsied brains of about one-third of individuals who are cognitively normal meet neuropathological criteria for Alzheimer disease.4 About a decade ago, the concept of cognitive reserve was proposed to account for the discrepancy between brain pathology and cognitive status.5 The broad hypothesis was that individuals with enriched lifelong exposures would be able to better tolerate brain pathology in late life. Many studies have investigated how one can cope with brain damage using proxies of neurodegeneration or synaptic integrity. However, the gold standard for testing the reserve hypothesis is the direct measurement of brain pathology at autopsy.
In this issue of JAMA Neurology, Xu and colleagues6 used the resources of the Rush Memory and Aging Project for an empirical test of the reserve hypothesis. They first assessed whether individuals with lifelong protective factors have a reduced risk of dementia. More importantly, they examined the influence of Alzheimer pathology on this association, thereby providing a direct test of the reserve hypothesis. The Rush Memory and Aging Project is among the largest longitudinal studies that include both comprehensive in vivo and autopsy data. The study had 1602 participants without dementia, of whom 611 had died during the study follow-up period and had autopsy data available. Of interest, the authors6 derived the reserve score by combining weighted measures of education, lifelong cognitive activity, and late-life social activity. In theory, the use of a composite score as a proxy for reserve instead of a single factor score (eg, education) should be more accurate if reserve is built on diverse lifelong experiences, as is commonly thought. Over a mean 6-year study follow-up (range, 1-20 years), one-quarter of these participants developed dementia. As expected, a higher reserve composite was associated with reduced incidence of dementia. Thus, participants in the highest tertile of reserve had about a 40% reduction in occurrence of dementia compared with those in the lowest tertile. A dose-effect association was also evident, in that individuals in the middle tertile showed about a 20% reduction in risk of dementia. Stated another way, individuals with high reserve scores experienced a delay in dementia onset of more than 7 years. Targeting lifestyle factors that enhance reserve could therefore reduce the incidence of new cases, while providing more than half a decade of cognitive health to individuals who would ultimately develop dementia.
The contribution of individual factors to dementia risk was assessed in a secondary set of analyses. Here it was interesting to note that lifestyle behaviors in midlife and late life were the main protective factors of dementia, suggesting that preventive interventions should probably be started in midlife but may also be successful if started in late life. In 2015 the results of Finnish Geriatric Intervention Study to Prevent Cognitive Impairment and Disability (FINGER),7 which involved physical exercise, nutrition, cognitive stimulation, and self-monitoring of heart health, suggested the possibility of preventing cognitive decline using a multidomain intervention among older individuals who were at risk. The FINGER model is now being adapted and implemented in North America (the Alzheimer's Association US Study to Protect Brain Health Through Lifestyle Intervention to Reduce Risk [US POINTER]), Asia (Singapore Intervention Study to Prevent Cognitive Impairment and Disability [SINGER] and the Multimodal Interventions to Delay Dementia and Disability in Rural China [MIND-CHINA]), Australia (the Maintain Your Brain [MYB] trial), and Europe (Multimodal Preventive Trial for Alzheimer’s Disease [MIND-AD]). It is estimated that a multidomain lifestyle intervention that achieves a 25% reduction in Alzheimer risk could prevent more than 3 million cases worldwide.1
Focusing on the autopsy findings, the positive association of reserve with reducing the risk of dementia onset was present even in individuals with high degrees of brain pathology. This was true for Alzheimer disease pathology, but also for gross infarcts and microscopic infarcts. No sex difference was reported for this association. A recent in vivo positron emission tomography (PET) study found that women exhibit higher tau burden than men in the preclinical phase of the disease,8 which could suggest that women can tolerate more pathology before developing cognitive impairments. Since women are at increased risk of developing Alzheimer dementia, particularly in late life, the role of sex differences in lifestyle-protective factors and associated resilience to the pathology will need further investigation.
The protective effect of lifespan cognitive reserve on the incidence of dementia in individuals with high pathology was also found when removing individuals with baseline mild cognitive impairments from the analysis. This last point is important, because individuals who are cognitively normal are the targets of preventive trials and therefore increasing effort is devoted toward a better understanding of the factors that could protect this group from developing cognitive impairment. Additionally, it is possible that some reserve-associated phenomena are only quantifiable at a certain stage of the disease, stressing the need to explore reserve-associated phenomena at different disease stages. For instance, a new study9 has shown that, while high reserve was associated with a slower cognitive decline in individuals without dementia, it was associated with a faster decline in individuals with dementia. One interpretation for this unexpected finding is that when individuals with high reserve can no longer cope with the pathology, their clinical decline is faster than that of individuals with lower reserve because the disease stage is more advanced.5
Evidence from PET research further suggests that some lifestyle factors that can enhance cognitive reserve might also postpone manifestation of brain pathology. This new concept has been called resistance.10 Thus, while the term reserve refers to the ability to cope with AD pathology, resistance refers to the ability to avoid the pathology in the first place. For instance, enriched cognitive and physical activities have been associated with a reduced amyloid burden in individuals who are cognitively normal.11-13 It has even been proposed that protective lifestyle factors could partially offset the negative influence of the APOE ε4 allele on the risk of developing Alzheimer pathology.14,15 In their study, Xu and colleagues6 found no evidence supporting the resistance hypothesis, with individuals with high and low reserve scores exhibiting similar amounts of pathology. The results were similar when removing individuals with baseline mild cognitive impairment from the analysis or using cognitive status as an interactive term. The association between protective lifestyle factors and brain pathology, if it exists, might not be linear but rather closer to a U-shaped distribution and therefore extremely difficult to detect in cross-sectional studies. Plausibly, as soon as individuals with high reserve develop Alzheimer pathology, they are no longer resisting the pathology but are limited to coping with it.
This last point may serve as a reminder that cognitive reserve is a dynamic and therefore complex concept for study. Amyloid and tau PET imaging, even if they only give estimates of pathology, should be extremely powerful tools to quantify reserve-associated processes in real time, since they allow tracking of evolution of pathology in vivo. They can also contribute to simultaneous measurement of protective lifestyle factors, Alzheimer pathology, and cognitive performance. Even more interestingly, PET imaging can permit concurrent quantitation of the effects of preventive lifestyle interventions on Alzheimer pathology and associated cognitive decline.
The article by Dr Xu and colleagues6 provides strong evidence that individuals with higher reserve can tolerate more brain pathology and therefore experience a reduced risk of dementia compared with others. While autopsy studies will probably remain the gold standard for testing the reserve hypothesis, their limitations in testing the dynamics of this phenomenon are obvious. New fluid, PET, and magnetic resonance imaging techniques for tracking Alzheimer and vascular pathology in vivo will certainly add to comprehension of reserve phenomena in the years to come.
Check also Association of Lifespan Cognitive Reserve Indicator With Dementia Risk in the Presence of Brain Pathologies. Hui Xu et al. JAMA Neurol. 2019;76(10):1184-1191, July 14, 2019, doi:10.1001/jamaneurol.2019.2455
Key Points
Question Is a high lifespan cognitive reserve (CR) indicator associated with a reduction in dementia risk, and how strong is this association in the presence of high brain pathologies?
Findings In this cohort study including 1602 dementia-free older adults, high lifespan CR was associated with a decreased risk of dementia. This association was present in people with high Alzheimer disease and vascular pathologies.
Meaning Accumulative educational and mentally stimulating activities enhancing CR throughout life might be a feasible strategy to prevent dementia, even for people with high brain pathologies.
Abstract
Importance Evidence on the association of lifespan cognitive reserve (CR) with dementia is limited, and the strength of this association in the presence of brain pathologies is unknown.
Objective To examine the association of lifespan CR with dementia risk, taking brain pathologies into account.
Design, Setting, and Participants This study used data from 2022 participants in the Rush Memory and Aging Project, an ongoing community-based cohort study with annual follow-up from 1997 to 2018 (mean follow-up, 6 years; maximum follow-up, 20 years). After excluding 420 individuals who had prevalent dementia, missing data on CR, or dropped out, 1602 dementia-free adults were identified at baseline and evaluated to detect incident dementia. During follow-up, 611 died and underwent autopsies. Data were analyzed from May to September 2018.
Exposures Information on CR factors (education; early-life, midlife, and late-life cognitive activities; and social activities in late life) was obtained at baseline. Based on these factors, lifespan CR scores were captured using a latent variable from a structural equation model and divided into tertiles (lowest, middle, and highest).
Main Outcomes and Measures Dementia was diagnosed following international criteria. Neuropathologic evaluations for Alzheimer disease and other brain pathologies were performed in autopsied participants. The association of lifespan CR with dementia or brain pathologies was estimated using Cox regression models or logistic regression.
Results Of the 1602 included participants, 1216 (75.9%) were women, and the mean (SD) age was 79.6 (7.5) years. During follow-up, 386 participants developed dementia (24.1%), including 357 participants with Alzheimer disease–related dementia (22.3%). The multiadjusted hazards ratios (HRs) of dementia were 0.77 (95% CI, 0.59-0.99) for participants in the middle CR score tertile and 0.61 (95% CI, 0.47-0.81) for those in the highest CR score tertile compared with those in the lowest CR score tertile. In autopsied participants, CR was not associated with most brain pathologies, and the association of CR with dementia remained significant after additional adjustment for brain pathologies (HR, 0.60; 95% CI, 0.42-0.86). The highest CR score tertile was associated with a reduction in dementia risk, even among participants with high Alzheimer disease pathology (HR, 0.57; 95% CI, 0.37-0.87) and any gross infarcts (HR, 0.34; 95% CI, 0.18-0.62).
Conclusions and Relevance High lifespan CR is associated with a reduction in dementia risk, even in the presence of high brain pathologies. Our findings highlight the importance of lifespan CR accumulation in dementia prevention.
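A hypothetical sketch of the kind of model reported in the Results (the data, column names, and effects are simulated, and this is not the authors' code): time to dementia regressed on cognitive-reserve tertile dummies plus a pathology measure in a Cox proportional hazards model; the exponentiated coefficients correspond to adjusted hazard ratios.

```python
# Hypothetical sketch (simulated data, not the study's): time to dementia regressed on
# cognitive-reserve tertile dummies plus a pathology measure with a Cox model.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 300
cr_tertile = rng.integers(0, 3, n)              # 0 = lowest, 1 = middle, 2 = highest reserve
ad_pathology = rng.normal(1.0, 0.4, n)          # global AD pathology burden (arbitrary units)
# Assumed for the simulation: higher reserve lowers the hazard, higher pathology raises it
hazard = 0.08 * np.exp(-0.4 * cr_tertile + 0.8 * (ad_pathology - 1.0))
time_to_event = rng.exponential(1.0 / hazard)
followup = np.minimum(time_to_event, 20.0)      # administrative censoring at 20 years
dementia = (time_to_event <= 20.0).astype(int)

df = pd.DataFrame({
    "followup_years": followup,
    "dementia": dementia,
    "cr_middle": (cr_tertile == 1).astype(int),
    "cr_high": (cr_tertile == 2).astype(int),
    "ad_pathology": ad_pathology,
})

cph = CoxPHFitter()
cph.fit(df, duration_col="followup_years", event_col="dementia")
cph.print_summary()   # exp(coef) for cr_middle / cr_high are the adjusted hazard ratios
```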
The dolichofacial individual was perceived by security agents as the most prone to commit crimes and was seen as less trustworthy
Does the facial pattern give individuals a profile of a crime suspect? Nathalia de Lima SANTOS et al. Biosci. J., Uberlândia, v. 35, n. 5, p. 1614-1621, Sep./Oct. 2019. http://dx.doi.org/10.14393/BJ-v35n5a2019-46326
ABSTRACT: To evaluate the influence of mesofacial, brachyfacial and dolichofacial facial patterns on giving an individual the profile of a crime suspect in the eyes of public security agents. This cross-sectional study was conducted with public security agents of both sexes (n=100), using images of facial composites (police sketches) of individuals with different facial patterns (mesofacial, brachyfacial and dolichofacial). With these images, a questionnaire was created and divided into three parts: the first, in which all the images were presented together, allowing comparison among them; the second, in which each image was evaluated separately, followed by questions; and the third, consisting of a visual analog scale presenting a bar with marks from 0 to 100, where 0 represented an untrustworthy individual, 50 an individual who could be trusted, and 100 a very trustworthy individual. Statistical analyses were performed using the Chi-square and Friedman tests, with the level of significance set at 5% (α=0.05). The dolichofacial individual was perceived by security agents as the most prone to commit crimes and as less trustworthy than the mesofacial and brachyfacial individuals (p < 0.001). The dolichofacial profile had a negative influence on the judgment of security agents, who attributed to it the profile of a crime suspect and a low level of trustworthiness.
KEYWORDS: Face. Social perception. Judgment. Crime.
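To illustrate the repeated-measures comparison described above, here is a minimal sketch of a Friedman test on invented 0-100 trustworthiness ratings, with each rater scoring the same three facial patterns.

```python
# Sketch of the repeated-measures comparison described above: each rater scores the same
# three facial patterns on a 0-100 trustworthiness scale (ratings below are invented).
from scipy.stats import friedmanchisquare

# One list per facial pattern, aligned by rater
mesofacial    = [70, 65, 80, 75, 60, 72, 68, 77]
brachyfacial  = [66, 60, 74, 70, 58, 69, 64, 73]
dolichofacial = [45, 50, 55, 48, 42, 51, 47, 53]

stat, p = friedmanchisquare(mesofacial, brachyfacial, dolichofacial)
print(f"Friedman chi-square = {stat:.2f}, p = {p:.4f}")
```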
Monday, October 14, 2019
The Rare Sides of Twin Research: Representation of Self-Image; Twins With Kleine–Levin Syndrome; Heteropaternal Lemur Twins; Risk of Dental Caries/In the Media
The Rare Sides of Twin Research: Important to Remember/Twin Research Reviews: Representation of Self-Image; Twins With Kleine–Levin Syndrome; Heteropaternal Lemur Twins; Risk of Dental Caries/In the Media: High-Society Models; ‘Winkelevii’ Super Bowl Twins; Multiple Birth × Three; Twin Sister Surrogate; A Presidential Twin? Nancy L. Segal. Twin Research and Human Genetics, October 14 2019. https://doi.org/10.1017/thg.2019.84
Abstract: This article explores some rare sides of twin research. The focus of this article is the sad plight of the Dionne quintuplets, born in Canada in 1934. However, several other studies belong in this category, such as Dr Josef Mengele’s horrifying twin research conducted at the Auschwitz concentration camp, Dr John Money’s misguided attempt to turn an accidentally castrated male twin into a female, Russian scientists’ cruel medical study of conjoined female twins and Dr Peter Neubauer’s secret project that tracked the development of separated twins. Reviews of current twin research span twins’ representation of self-image, twins with Kleine–Levin Syndrome, heteropaternal twinning in lemurs and factors affecting risk of dental caries. Media coverage includes a pair of high-society models, a book about the ‘Winkelevii’ twins, Super Bowl twin teammates, a family with three sets of fraternal twins, a twin sister surrogate and a near presidential twin.
Male Qualities and Likelihood of Orgasm in Women: Consistent findings regarding males who elicit orgasm more frequently pertain to their sexual behavior (being attentive, patient, & receptive to instruction) rather than their traits
Male Qualities and Likelihood of Orgasm. James M. Sherlock and Morgan J. Sidari. In T.K. Shackelford, V.A. Weekes-Shackelford (eds.), Springer Encyclopedia of Evolutionary Psychological Science, 2020, https://doi.org/10.1007/978-3-319-16999-6_278-1
Definition
Male traits and sexual behaviors that predict the likelihood of female orgasm.
Introduction
In contrast to male orgasm, the female orgasm is enormously variable in frequency during penetrative sex. While men almost always ejaculate during penile-vaginal intercourse, women orgasm far less frequently from penetrative sex alone and experience significant variation in orgasm frequency with different partners (Lloyd 2005). While variation in women's orgasm frequency is poorly understood, evolutionary psychology tends to view variation in behavior as adaptive and responsive to environmental conditions. Current evolutionary theories regarding the female orgasm can be broadly divided into two positions: those that focus on selection on male sexual function and those that focus on selection on female sexual function. The by-product hypothesis concerns the former and posits that the capacity of women to experience orgasm is a consequence of strong selection pressure on males' capacity to reach orgasm. This position is based on similarities between male and female orgasm as well as the observation that the male glans penis and female clitoris arise from homologous tissue during development. In contrast, the mate choice hypothesis argues that variation in female orgasm frequency during sexual intercourse is reflective of varying quality of their male partners. [...]
Mate-Choice Hypotheses of Female Orgasm
Given the high gestational cost of rearing human offspring, women should be driven to select mates of high quality. Under the sire-choice hypothesis, quality tends to be defined as the ability to pass favorable genetic traits onto offspring (i.e., heritable traits) that will contribute to offspring fitness. In contrast, the pair-bonding hypothesis postulates that male traits that are likely to benefit the woman through increased care and investment in offspring ought to promote orgasm. Traits that have been identified as theoretically important to each theory can be seen in Table 1 below.
Male Qualities and Likelihood of Orgasm, Table 1
Partner traits distinguishing mate-choice hypotheses (For review see Sherlock et al. 2016)
> Sire choice
Physical attractiveness - Height - Athleticism - Muscularity - Voice depth - Physical fitness - Humor - Creativity - Dominance - Body odor pleasantness
> Pair-bonding
Faithfulness - Warmth - Earning potential - Kindness
Mate-Choice Traits that Predict Orgasm Likelihood
Sire-Choice
The sire-choice hypothesis has been more extensively tested than the pair-bonding hypothesis and some evidence does suggest that women may be more likely to orgasm when their partner possesses traits putatively associated with genetic quality. [...]
Pair-Bonding
Few studies to date have thoroughly investigated pair-bonding traits in the context of female orgasm frequency. However, Costa and Brody (2007) have observed that greater orgasm frequency is associated with overall relationship quality. [...]
Alternative Considerations
While some male traits have been reported to covary with orgasm frequency, this may not represent a causal relationship. Firstly, it is possible that women who orgasm more frequently may be more likely to select particular types of partners than women who orgasm infrequently. For example, women who report a higher orgasm frequency may choose to have sex with more physically attractive men. These men are likely to have more experience in short-term sexual relationships and therefore may be more effective at eliciting orgasm in their partners. [...]
Only one study to date has investigated how women's orgasm frequencies have varied with different sexual partners. Sherlock et al. (2016) had single women with more than two male sexual partners report on a range of characteristics of the man with whom orgasm was the easiest and the man with whom orgasm was the most difficult (or did not occur at all). By comparing high- and low-orgasm men, several traits emerged as important in predicting ease of orgasm. High-orgasm men tended to be higher in humor, attractiveness, creativity, emotional warmth, faithfulness, and body odor pleasantness, consistent with both sire-choice and pair-bonding hypotheses.
Male Sexual Behavior
Importantly, Sherlock et al. (2016) also observed that a number of sexual behaviors differed between high- and low-orgasm males. Specifically, high-orgasm males were more likely to be focused on their partner's pleasure, engage in oral sex, use sex toys, spend more time on foreplay, and stimulate their partner's clitoris during sex. Women were also more likely to communicate their sexual position preferences and stimulate their clitoris when having sex with high-orgasm partners. [...] Across all three studies, orgasm was more likely to occur with manual stimulation of the clitoris during intercourse (Frederick et al. 2017; Richters et al. 2006; Sherlock et al. 2016). Consequently, any contribution of male traits (e.g., attractiveness) to female orgasm needs to be considered in the context of variation in male sexual behavior. Further complicating these results is the likely association between male traits and sexual behaviors. [...]
Conclusion
Despite some consistency in male traits associated with orgasm across several studies, there is reason to be cautious interpreting these results without first accounting for male sexual behavior. The most prominent trait associated with increases in the likelihood of female orgasm is attractiveness (Andersson 1994; Gallup et al. 2014; Grammer et al. 2003; Shackelford et al. 2000; Sherlock et al. 2016), yet the causal pathway between attractiveness and female orgasm could be inverse to the current theorizing. That is, women could come to view their partners as more attractive if they are more frequent benefactors of orgasms. In sum, the most consistent findings regarding males who elicit orgasm more frequently pertain to their sexual behavior rather than their traits. Women are more likely to achieve orgasm with a partner who is attentive, patient, and receptive to instruction.
Cross-references: Female Mate Choice; Mate selection; Orgasm; Pair-bonding; The Evolution of Genitalia
The over-the-counter pain reliever acetaminophen can alter consumers’ emotional experiences and their economic behavior well beyond soothing their aches and pains, and also has memory effects
Drug influences on consumer judgments: emerging insights and research opportunities from the intersection of pharmacology and psychology. Geoffrey R. O. Durso, Kelly L. Haws, Baldwin M. Way. Marketing Letters, October 10 2019. https://link.springer.com/article/10.1007/s11002-019-09500-z
Abstract: Recent evidence at the intersection of pharmacology and psychology suggests that pharmaceutical products and other drugs can exert previously unrecognized effects on consumers’ judgments, emotions, and behavior. We highlight the importance of a wider perspective for marketing science by proposing novel questions about how drugs might influence consumers. As a model for this framework, we review recently discovered effects of the over-the-counter pain reliever acetaminophen, which can alter consumers’ emotional experiences and their economic behavior well beyond soothing their aches and pains, and also present novel data on its memory effects. Observing effects of putatively benign over-the-counter medicines that extend beyond their originally approved usages suggests that many other drugs are also likely to influence processes relevant for consumers. The ubiquity of drug consumption—medical or recreational, legal or otherwise—underscores the importance of considering several novel research directions for understanding pharmacological-psychological interactions on consumer judgments, emotions, and behaviors.
Keywords: Decision making; Emotion; Memory; Pharmaceuticals; Substances; Acetaminophen
Check also Conference Talk—SMPC 2019: Effect of Acetaminophen on Emotional Sounds. Lindsay A. Warrenburg. 2019. https://osf.io/79f4d/
Description: The capacity of listeners to perceive or experience emotions in response to music, speech, and natural sounds depends on many factors including dispositional traits, empathy, and enculturation. Emotional responses are also known to be mediated by pharmacological factors, including both legal and illegal drugs. Existing research has established that acetaminophen, a common over-the-counter pain medication, blunts emotional responses to visual stimuli (e.g., Durso, Luttrell, & Way, 2015). The current study extends this research by examining possible effects of acetaminophen on both perceived and felt responses to emotionally-charged sound stimuli. Additionally, it tests whether the effects of acetaminophen are specific for particular emotions (e.g., sadness, fear) or whether acetaminophen blunts emotional responses in general. Finally, the study tests whether acetaminophen has similar or differential effects on three categories of sound: music, speech, and natural sounds. The experiment employs a randomized, double-blind, parallel-group, placebo-controlled design. Participants are randomly assigned to ingest acetaminophen or a placebo. Then, the listeners are asked to complete two experimental blocks regarding musical and non-musical sounds. The first block asks participants to judge the extent to which a sound conveys a certain affect (on a Likert scale). The second block aims to examine a listener’s emotional responses to sound stimuli (also on a Likert scale). In light of the fact that some 50 million Americans take acetaminophen each week, this study suggests that future studies in music and emotion might consider controlling for the pharmacological state of participants.
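As a rough illustration of the analysis that such a randomized, double-blind, parallel-group design implies, the sketch below randomly assigns participants to acetaminophen or placebo and compares hypothetical Likert ratings between groups with a Mann-Whitney test. All numbers, variable names, and the choice of test are assumptions made for illustration, not details taken from the talk.

    import random
    from scipy.stats import mannwhitneyu

    random.seed(1)
    conditions = ["acetaminophen"] * 20 + ["placebo"] * 20
    random.shuffle(conditions)  # sketch of random assignment to condition

    # Hypothetical felt-sadness ratings (1-7 Likert) for one sad excerpt;
    # the drug group is simulated as rating slightly lower (a blunted response).
    ratings = [random.randint(2, 6) if c == "acetaminophen" else random.randint(3, 7)
               for c in conditions]

    drug    = [r for r, c in zip(ratings, conditions) if c == "acetaminophen"]
    placebo = [r for r, c in zip(ratings, conditions) if c == "placebo"]
    u, p = mannwhitneyu(drug, placebo, alternative="less")  # blunting => lower ratings under drug
    print(f"Mann-Whitney U = {u:.1f}, one-sided p = {p:.3f}")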
Sunday, October 13, 2019
Some people share knowledge online, often without tangible compensation; they are motivated to signal general intelligence g; observers infer g from contributions' quality
The quality of online knowledge sharing signals general intelligence. Christian N.Yoder, Scott A.Reid. Personality and Individual Differences, Volume 148, 1 October 2019, Pages 90-94. https://doi.org/10.1016/j.paid.2019.05.013
Abstract: Some people share knowledge online, often without tangible compensation. Who does this, when, and why? According to costly signaling theory people use behavioral displays to provide observers with useful information about traits or states in exchange for fitness benefits. We tested whether individuals higher in general intelligence, g, provided better quality contributions to an information pool under high than low identifiability, and whether observers could infer signaler g from contribution quality. Using a putative online wiki (N = 98) we found that as individuals' scores on Ravens Progressive Matrices (RPM) increased, participants were judged to have written better quality articles, but only when identifiable and not when anonymous. Further, the effect of RPM scores on inferred intelligence was mediated by article quality, but only when signalers were identifiable. Consistent with costly signaling theory, signalers are extrinsically motivated and observers act as “naive psychometricians.” We discuss the implications for understanding online information pools and altruism.
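To make the mediation claim concrete ("the effect of RPM scores on inferred intelligence was mediated by article quality"), here is a minimal, regression-based sketch in Python. The simulated data, variable names, and the Baron-and-Kenny-style decomposition are illustrative assumptions; the authors' actual procedure may have differed.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 98                                            # sample size taken from the abstract
    rpm = rng.normal(size=n)                          # Raven's scores (standardized), simulated
    quality = 0.5 * rpm + rng.normal(size=n)          # article quality, partly driven by RPM
    inferred_g = 0.6 * quality + rng.normal(size=n)   # observers' inferred intelligence

    total  = sm.OLS(inferred_g, sm.add_constant(rpm)).fit()                              # path c
    direct = sm.OLS(inferred_g, sm.add_constant(np.column_stack([rpm, quality]))).fit()  # c' and b
    a_path = sm.OLS(quality, sm.add_constant(rpm)).fit()                                 # path a

    print("total effect c  :", round(total.params[1], 3))
    print("direct effect c':", round(direct.params[1], 3))
    print("indirect a*b    :", round(a_path.params[1] * direct.params[2], 3))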
Brain Syndrome Can Make Owners Think Pets Are Impostors
Herzog, Harold, "Brain Syndrome Can Make Owners Think Pets Are Impostors" (2018). 'Animals and Us' Blog Posts. 103, Apr 25, 2018. https://animalstudiesrepository.org/aniubpos/103
Mary was 40 years old when she became convinced that Sarah, her 9 year old daughter, was an impostor. The real Sarah, she told relatives, had been taken away and placed into a foster home. She claimed social workers had replaced her actual child with an identical-looking impostor. Mary was so convinced of this substitution that she would sometimes refuse to pick her daughter up at school. Mary would scream to the teachers, “Give me my real daughter back, I know what you have done!”
To no avail, her family and health care providers tried to convince Mary that no substitution had occurred, that Sarah was indeed her real daughter. But even after Mary was treated with risperidone, a powerful anti-psychotic drug, she held on to the delusion. The local Department of Social Services became concerned about her ability to raise a child. And when it became apparent that Mary could no longer provide care for the daughter who she believed was an impostor, they successfully sought legal guardianship for Sarah. At one point during the hearing, Sarah told the court, “I love my mother, except when she doesn’t believe I’m me.”
As described in an article by Drs. Jeremy Matuszak and Matthew Parra in the journal Psychiatric Times, Mary was suffering from Capgras Syndrome. This is a rare variant of a group of neuropsychiatric conditions called delusional misidentification disorders. First identified in 1923 by the French psychiatrists Joseph Capgras and Reboul-Lachaux, individuals with the Capgras delusion come to believe that a person they know has been replaced by an identical-looking impostor. Usually the target of the delusion is a family member or a loved one. In Mary’s case, it was her young daughter.
[...]
What Causes Animal Capgras Delusions?
I admit that nine cases is a small sample, but some interesting patterns did emerge among this group of patients. For example, twice as many women as men thought they were living with impostor animals. And, as a group, the patients tended to be on the old side. Six of eight individuals were over 50, and half were in their late 60’s or older. While only two of the patients had suffered identifiable brain damage, nearly all the patients had been diagnosed with a functional psychosis, usually a form of schizophrenia. Finally, all seven of the patients for whom there was information on treatment were given anti-psychotic drugs. In nearly all of these cases, their pet impostor delusions diminished, and in several cases, they seemed to have disappeared.
Speculations about the causes of Capgras syndrome abound. Some researchers argue that impostor delusions are a way of subconsciously dealing with love-hate conflicts. V. S. Ramachandran believes that impostor delusions result from disconnections between emotional and face recognition centers in the brain. Others argue it is usually the result of degenerative diseases such as Alzheimer’s and Lewy body disease. Indeed, impostor delusions have been associated with a wide array of conditions including psychiatric disorders, strokes, tumors, epilepsy, and even vitamin deficiencies and drug use.
Targeted Memory Reactivation During Sleep Improves Next-Day Problem Solving
Targeted Memory Reactivation During Sleep Improves Next-Day Problem Solving. Kristin E. G. Sanders et al. Psychological Science, October 11, 2019. https://doi.org/10.1177/0956797619873344
Abstract: Many people have claimed that sleep has helped them solve a difficult problem, but empirical support for this assertion remains tentative. The current experiment tested whether manipulating information processing during sleep impacts problem incubation and solving. In memory studies, delivering learning-associated sound cues during sleep can reactivate memories. We therefore predicted that reactivating previously unsolved problems could help people solve them. In the evening, we presented 57 participants with puzzles, each arbitrarily associated with a different sound. While participants slept overnight, half of the sounds associated with the puzzles they had not solved were surreptitiously presented. The next morning, participants solved 31.7% of cued puzzles, compared with 20.5% of uncued puzzles (a 55% improvement). Moreover, cued-puzzle solving correlated with cued-puzzle memory. Overall, these results demonstrate that cuing puzzle information during sleep can facilitate solving, thus supporting sleep’s role in problem incubation and establishing a new technique to advance understanding of problem solving and sleep cognition.
Keywords: problem solving, incubation, sleep, targeted memory reactivation, restructuring, creative cognition
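A quick arithmetic check of the "55% improvement" figure: it is the relative increase of the cued solving rate over the uncued rate, e.g.:

    cued, uncued = 0.317, 0.205
    print(f"{(cued - uncued) / uncued:.1%}")  # ~54.6%, reported as a 55% improvement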
When recipients open a gift from a friend, they like it less when the giver has wrapped it neatly as opposed to sloppily and we draw on expectation disconfirmation theory to explain the effect
Presentation Matters: The Effect of Wrapping Neatness on Gift Attitudes. Jessica M. Rixom, Erick M. Mas, Brett A. Rixom. Journal of Consumer Psychology, October 11 2019. https://doi.org/10.1002/jcpy.1140
Abstract: While gift‐givers typically wrap gifts prior to presenting them, little is known about the effect of how the gift is wrapped on recipients’ expectations and attitudes toward the gift inside. We propose that when recipients open a gift from a friend, they like it less when the giver has wrapped it neatly as opposed to sloppily and we draw on expectation disconfirmation theory to explain the effect. Specifically, recipients set higher (lower) expectations for neatly (sloppily)‐wrapped gifts, making it harder (easier) for the gifts to meet these expectations, resulting in contrast effects that lead to less (more) positive attitudes toward the gifts once unwrapped. However, when the gift‐giver is an acquaintance, there is ambiguity in the relationship status and wrapping neatness serves as a cue about the relationship rather than the gift itself. This leads to assimilation effects where the recipient likes the gift more when neatly wrapped. We assess these effects across three studies and find that they hold for desirable, neutral, and undesirable gifts, as well as with both hypothetical and real gifts.
Saturday, October 12, 2019
Random effects meta-analyses showed that social media use was significantly & positively related to affective empathy; effects were generally small in size and do not establish causality
Social Media Use and Empathy: A Mini Meta-Analysis. Shu-Sha Angie Guan, Sophia Hain, Jennifer Cabrera, Andrea Rodarte. Computer Science & Communications, Vol.8 No.4, October 2019, pp. 147-157. DOI: 10.4236/sn.2019.84010
ABSTRACT: Concerns about the effects of social media or social networking site (SNS) use on prosocial development are increasing. The aim of the current study is to meta-analytically summarize the research to date (k = 5) about the relationship between general SNS use and two components of empathy (i.e., empathic concern and perspective-taking). Random effects meta-analyses showed that SNS use was significantly and positively related to affective empathy though only marginally related to cognitive empathy. These effects were generally small in size and do not establish causality. Future research should explore how specific behaviors are related to different forms of empathy.
KEYWORDS: Social Media, Empathic Concern, Perspective-Taking
4. Discussion
Despite the decreases in empathy coupled with increases in media use at the societal level [13] , individual social media use in terms of frequency or time spent per day appears to be related to higher levels of empathy, particularly affective empathy. Even though the associations were small, they trended positive. However, there may be some online behaviors that cultivate empathy (e.g., sharing emotions, expressing support [21] ) more than others (e.g., updating profile photos [20] ). In combination with emerging longitudinal evidence that social media use at one time point is predictive of higher levels of cognitive and affective empathy one year later among adolescents [42] and experimental work that shows that interdependent Facebook use can promote relational orientation [37] , this study contributes to the growing literature on how social media can facilitate positive psychosocial development.
Although promising, there are limitations of the current meta-analysis to consider. This study aimed to look only at global measures of social media use in everyday life and, because of this inclusion parameter, includes a small sample of studies and effect sizes. This likely limits the generalizability of the results and our ability to detect differences by moderators (gender, age). Also, the results are correlational and do not establish causality. Previous research suggests that individuals who are prosocial offline are often prosocial online [29]. Despite our attempts to narrow the scope, there remained variability in the measures of media use and study parameters as indicated by the heterogeneity index. Given the wide range of online activities, future studies should explore how specific behaviors are related to different forms of empathy (e.g., helping strangers vs. family or friends [25] ). Additionally, the social media landscape is constantly evolving and this study captures media use as assessed by recent studies in one moment in time. Cultural psychologists suggest that changes in technology use, as part of larger shifting sociodemographic and ecological changes, can shape cultural values and learning environments in ways that directly affect human development across time [43].
It is also important to note that all of the studies included, and much of media research in general, have been conducted in industrialized, individualistic countries like the United States. This limited our ability to detect cultural differences. On the one hand, the most popular SNSs are often developed in Western cultures and can reflect the highly individualistic values of their developers and users [37] [44]. On the other hand, the Internet is a “global village” of individuals from various nationalities and cultural backgrounds with nearly 60% of the online population residing outside of the U.S. [44]. These diverse offline cultural values can be reflected in the online [45] - [52]. Additionally, there may be values and goals specific to the SNS context outside of the values that users bring with them [53]. Previous meta-analyses suggest that the effects of media use may be stronger in non-Western countries [26]. Future research should explore how cultural values in the online and offline interact in shaping development.
Although limited, this meta-analysis provides useful insights into the media-empathy paradox [13]. Additionally, it may be informative in better understanding growing generations of adolescents and young adults who have become the first generations to have grown up fully immersed in digital media (i.e., “digital natives”) having been born around or after the 1990s when the Internet was first commercially launched. This may mean that psychosocial development for these “digital natives” differs from prior generations of “digital immigrants” [9]. For example, greater face-to-face communication with family members, close friends, and acquaintances was associated with higher levels of psychological well-being (e.g., life meaning, relationship quality) for older adults age 35 - 54 but not for young adults age 18 - 34 [54]. As technology transforms society, social relationships, and media landscapes, it will become ever important to track how these changes affect individuals and their development.
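For readers unfamiliar with the pooling step, the sketch below shows the mechanics of a DerSimonian-Laird random-effects meta-analysis of k = 5 correlations (Fisher-z transform, between-study variance tau^2, pooled effect, and I^2 heterogeneity). The five correlations and sample sizes are invented for illustration and are not the effect sizes from this meta-analysis.

    import numpy as np

    r = np.array([0.10, 0.15, 0.05, 0.20, 0.12])   # hypothetical SNS-use/empathy correlations
    n = np.array([150, 200, 120, 300, 180])        # hypothetical sample sizes

    z = np.arctanh(r)          # Fisher z transform
    v = 1.0 / (n - 3)          # sampling variance of z
    w = 1.0 / v                # fixed-effect weights

    z_fixed = np.sum(w * z) / np.sum(w)
    Q  = np.sum(w * (z - z_fixed) ** 2)            # heterogeneity statistic
    df = len(r) - 1
    C  = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (Q - df) / C)                  # DerSimonian-Laird between-study variance

    w_re = 1.0 / (v + tau2)                        # random-effects weights
    z_re = np.sum(w_re * z) / np.sum(w_re)
    I2 = max(0.0, (Q - df) / Q) * 100 if Q > 0 else 0.0

    print(f"pooled r = {np.tanh(z_re):.3f}, tau^2 = {tau2:.4f}, I^2 = {I2:.1f}%")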
Is intense pleasure necessary for intense beauty? If so, the inability to experience pleasure (anhedonia) should prevent the experience of intense beauty
Intense beauty requires intense pleasure. Aenne A. Brielmann and Denis Pelli. Front. Psychol., Oct 11 2019 (provisionally accepted, no full text available). doi: 10.3389/fpsyg.2019.02420
Abstract: At the beginning of psychology, Fechner (1876) claimed that beauty is immediate pleasure, and that an object’s pleasure determines its value. In our earlier work, we found that intense pleasure always results in intense beauty. Here, we focus on the complementary question: Is intense pleasure necessary for intense beauty? If so, the inability to experience pleasure (anhedonia) should prevent the experience of intense beauty. We asked 757 participants to rate how intensely they felt beauty from each image. We used 900 OASIS images along with their available valence (pleasure vs. displeasure) and arousal ratings. We then obtained self-reports of anhedonia (TEPS), mood, and depression (PHQ-9). Across images, beauty ratings were closely related to pleasure ratings (r = 0.75), yet unrelated to arousal ratings. Only images with an average pleasure rating above 4 (of a possible 7) often achieved (> 10%) beauty averages exceeding the overall median beauty. For normally beautiful images (average rating > 4.5), the beauty ratings were correlated with anhedonia (r ~ -0.3) and mood (r ~ 0.3), yet unrelated to depression. Comparing each participant’s average beauty rating to the overall median, none of the most anhedonic participants exceeded the median, whereas 50% of the remaining participants did. Thus, both general and anhedonic results support the claim that intense beauty requires intense pleasure. In addition, follow-up repeated measures showed that shared taste contributed 19% to beauty-rating variance, only one third as much as personal taste (58%). Addressing age-old questions, these results indicate that beauty is a kind of pleasure, and that beauty is more personal than universal, i.e., 1.7 times more correlated with individual than with shared taste.
Keywords: beauty, aesthetics, Pleasure, Anhedonia, Depression
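One way to reconcile the abstract's variance shares with its "1.7 times more correlated" claim (our reading, not spelled out in the abstract): if personal taste explains 58% of beauty-rating variance and shared taste 19%, the implied correlations are the square roots of those shares, and their ratio is about 1.7, while the variance ratio is about 3 ("one third as much"). For example:

    personal_share, shared_share = 0.58, 0.19
    print(round(personal_share / shared_share, 2))           # ~3.05 -> "one third as much"
    print(round((personal_share / shared_share) ** 0.5, 2))  # ~1.75 -> "1.7 times more correlated"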
Effects of heroin on rat prosocial behavior: They stopped freeing the trapped cagemate, continued to self-administer the drug
From 2018... Effects of heroin on rat prosocial behavior. Seven E. Tomek, Gabriela M. Stegmann, M. Foster Olive. Addiction Biology, May 4 2018. https://doi.org/10.1111/adb.12633
Abstract: Opioid use disorders are characterized in part by impairments in social functioning. Previous research indicates that laboratory rats, which are frequently used as animal models of addiction‐related behaviors, are capable of prosocial behavior. For example, under normal conditions, when a ‘free’ rat is placed in the vicinity of a rat trapped in a plastic restrainer, the rat will release or ‘rescue’ the other rat from confinement. The present study was conducted to determine the effects of heroin on prosocial behavior in rats. For 2 weeks, rats were given the opportunity to rescue their cagemate from confinement, and the occurrence of and latency to free the confined rat was recorded. After baseline rescuing behavior was established, rats were randomly selected to self‐administer heroin (0.06 mg/kg/infusion i.v.) or sucrose pellets (orally) for 14 days. Next, rats were retested for rescuing behavior once daily for 3 days, during which they were provided with a choice between freeing the trapped cagemate and continuing to self‐administer their respective reinforcer. Our results indicate that rats self‐administering sucrose continued to rescue their cagemate, whereas heroin rats chose to self‐administer heroin and not rescue their cagemate. These findings suggest that rats with a history of heroin self‐administration show deficits in prosocial behavior, consistent with specific diagnostic criteria for opioid use disorder. Behavioral paradigms providing a choice between engaging in prosocial behavior and continuing drug use may be useful in modeling and investigating the neural basis of social functioning deficits in opioid addiction.
From 2001... Need to hire a private certified arborist at a cost of $500-$2,000 to take pictures, prepare a report & perhaps to recommend protective pruning or other measures before a tree can be cut
From 2001... The (Almost) Untouchables of California. Todd S. Purdum. The New York Times, Aug. 29, 2001, Section A, Page 1. https://www.nytimes.com/2001/08/29/us/the-almost-untouchables-of-california.html
'Anything that's going to happen under this tree has to be addressed,' said Mr. Sartain, a third-generation arborist, surveying the tree's 90-foot canopy with the cheerful, clinical detachment of your favorite pediatrician. 'There's a lot of issues.'
Indeed, Mr. Sartain's visit is only the first step in a process that will require the homeowner, who asked not to be named, to hire a private certified arborist at a cost of $500 to $2,000 to take pictures, prepare a report and perhaps to recommend protective pruning or other measures before a permit is issued and construction can proceed. Penalties for removing a tree like this, worth perhaps $100,000 under city guidelines because of its size and age, could force an offender to plant trees worth an equivalent amount.
Santa Clarita is not alone.
In the past 30 years, as development pressures increased, scores of California cities and counties from Thousand Oaks in the south to Santa Rosa in the north have passed ordinances protecting not only various species and sizes of oaks, but also sycamores, walnuts, eucalyptus and other trees with a zeal that might make the poet Joyce Kilmer blush.
The specifics vary widely, but the ordinances have one goal in common: protecting trees that are almost as storied in California as its redwoods and that have long been threatened by ranching, wine-making, suburban sprawl and, more recently, mysterious diseases.
Children from Namibia and Germany: Being observed increases overimitation in three diverse cultures
Stengelin, R., Hepach, R., & Haun, D. B. M. (2019). Being observed increases overimitation in three diverse cultures. Developmental Psychology, Oct 2019. http://dx.doi.org/10.1037/dev0000832
Abstract: From a young age, children in Western, industrialized societies overimitate others’ actions. However, the underlying motivation and cultural specificity of this behavior have remained unclear. Here, 3- to 8-year-old children (N = 125) from two rural Namibian populations (Hai||om and Ovambo) and one urban German population were tested in two versions of an overimitation paradigm. Across cultures, children selectively imitated more actions when the adult model was present compared to being absent, denoting a social motivation underlying overimitation. At the same time, children’s imitation was not linked to their tendency to reengage the adult in a second, independent measure of social motivation. These results suggest that, across diverse cultures, children’s imitative behavior is actuated by the attentive state of the model.