Humans Trust Central Vision More Than Peripheral Vision Even in the Dark. Alejandro H. Gloriani, Alexander C. Schütz. Current Biology, March 21, 2019. https://doi.org/10.1016/j.cub.2019.02.023
Highlights
• Veridical information from the fovea is preferred under photopic viewing
• Information missing in the scotopic foveal scotoma is filled in from the surround
• Inferred information from the fovea is preferred under scotopic viewing
• Content and properties of the foveal scotopic scotoma are hidden from awareness
Summary: Two types of photoreceptors in the human retina support vision across a wide range of luminances: cones are active under bright daylight illumination (photopic viewing) and rods under dim illumination at night (scotopic viewing). These photoreceptors are distributed inhomogeneously across the retina [1]: cone-receptor density peaks at the center of the visual field (i.e., the fovea) and declines toward the periphery, allowing for high-acuity vision at the fovea in daylight. Rod receptors are absent from the fovea, leading to a functional foveal scotoma in night vision. In order to make optimal perceptual decisions, the visual system requires knowledge about its own properties and the relative reliability of signals arriving from different parts of the visual field [2]. Since cone and rod signals converge on the same pathways [3], and their cortical processing is similar except for the foveal scotoma [4], it is unclear if humans can take into account the differences between scotopic and photopic vision when making perceptual decisions. Here, we show that the scotopic foveal scotoma is filled in with information from the immediate surround and that humans trust this inferred information more than veridical information from the periphery of the visual field. We observed a similar preference under daylight illumination, indicating that humans have a default preference for information from the fovea even if this information is not veridical, like in night vision. This suggests that filling-in precedes the estimation of confidence, thereby shielding awareness from the foveal scotoma with respect to its contents and its properties.
---
Discussion
We investigated perceptual decision making under scotopic and photopic viewing. In two experiments, we found that a stimulus with a discontinuity in the scotopic foveal scotoma appeared as continuous, providing evidence for perceptual filling-in of the scotoma. We also found that observers preferred information from central vision, even when it was not veridical under scotopic viewing. This general preference for central vision indicates that humans are not aware of their scotopic foveal scotoma and that it is not taken into account for perceptual decision making.
Under daylight illumination, basic perceptual measures, such as acuity [13] or contrast sensitivity [14], peak at the fovea and decline in the periphery. In addition, the periphery is more vulnerable to crowding—i.e., spatial interference between neighboring elements [15]. Preferring information from central vision might be, therefore, a sensible strategy for decision making under ambiguity in photopic vision. This interpretation is supported by other foveal biases in photopic vision: stimuli with temporal and spatial uncertainty tend to be mislocalized toward the fovea [16], foveal brightness is extrapolated into the periphery [17], peripheral appearance is influenced by predicted foveal appearance [18, 19], and transsaccadic feature integration shows some overweighting of foveal information [20, 21]. However, the observed perceptual bias is not a useful strategy for scotopic vision, where the fovea does not contribute veridical information. Nevertheless, our finding is consistent with other perceptual phenomena where vision in the light and the dark is not calibrated well: perceived speed is underestimated in the dark [22], and the perception of white seems to require signals from cones [23]. Our results are at odds with a recent comparison of photopic and scotopic visual search [24], where eye movement statistics are affected by lighting condition in a qualitatively similar way as an ideal searcher [25], which has knowledge about the scotopic foveal scotoma. These divergent findings could point toward a general dissociation that the scotopic foveal scotoma is taken into account in eye movement control, but not in perceptual decision making. Alternatively, the divergent findings might be caused by different opportunities for learning in the two experimental paradigms. In the visual search task, observers experienced with every eye movement how visual input in the fovea and the periphery relate to each other and therefore had the opportunity to acquire the appropriate weighting of foveal and peripheral information. In the perceptual decision task of the current study, observers never experienced the same stimulus in the fovea and the periphery and therefore could not acquire the appropriate weighting during the experiment.
There are at least two ways in which the perceptual bias could arise in scotopic vision: first, the brain might use a simple heuristic that information from the fovea is more reliable than information from the periphery and apply this heuristic to photopic and scotopic vision alike. However, a simple heuristic is unlikely, because humans can estimate uncertainty based on their actual perceptual performance instead of using simple cues, such as contrast or eccentricity, in photopic vision [26]. Second, confidence might be assessed for each stimulus individually in scotopic vision as well. In this case, our finding that biases in photopic and scotopic vision were similar suggests that confidence is assessed at a level of processing where information about the originating photoreceptor type is lost and perceptual filling-in is completed. Such a dissociation is quite likely, because rod and cone photoreceptors converge on the same pathways at the level of retinal ganglion cells [27, 28] and filling-in is preattentive [29] and takes place in visual cortex [9], whereas confidence seems to be represented only further downstream, in parietal [30] and prefrontal cortex [31] and the striatum [32].
Several basic properties of visual processing, such as pupil size [33] or photoreceptor sensitivity [34], are directly adjusted to the light level during dark adaptation. Our results show that this is not the case for the relative weighting of foveal and peripheral information in perceptual decision making. However, other properties, such as rod-cone interactions [35] or spontaneous cortical activity [36], are controlled by a circadian rhythm rather than by light level. Since our measurements were taken during the day, it is possible that the relative weighting of foveal and peripheral information is also controlled by a circadian rhythm. In this case, the bias for foveal information should be reduced or even reversed at night but possibly in the same way for both scotopic and photopic viewing.
While there are only a few, contradictory studies about filling-in of the scotopic foveal scotoma [6, 7, 10], more is known about filling-in at the blind spot, where photoreceptors are absent because the axons of the ganglion cells exit the eyeball. Here, even complex visual patterns can be filled in from the surround [29], and humans are overconfident about this filled-in information [5]. Filling-in has also been observed for scotomata in the fovea caused by macular disease [37], and these patients need to acquire a new preferred retinal locus for fixation [38]. Our finding of a general preference for foveal information, irrespective of whether it is veridical or not, suggests that preferences in perceptual decision making might not necessarily shift to the preferred retinal locus in those patients, leading to suboptimal perceptual decisions.
Thursday, March 21, 2019
Complex societies precede moralizing gods throughout world history
Complex societies precede moralizing gods throughout world history. Harvey Whitehouse, Pieter François, Patrick E. Savage, Thomas E. Currie, Kevin C. Feeney, Enrico Cioni, Rosalind Purcell, Robert M. Ross, Jennifer Larson, John Baines, Barend ter Haar, Alan Covey & Peter Turchin. Nature, Mar 20 2019. https://doi.org/10.1038/s41586-019-1043-4
Abstract: The origins of religion and of complex societies represent evolutionary puzzles1–8. The ‘moralizing gods’ hypothesis offers a solution to both puzzles by proposing that belief in morally concerned supernatural agents culturally evolved to facilitate cooperation among strangers in large-scale societies9–13. Although previous research has suggested an association between the presence of moralizing gods and social complexity3,6,7,9–18, the relationship between the two is disputed9–13,19–24, and attempts to establish causality have been hampered by limitations in the availability of detailed global longitudinal data. To overcome these limitations, here we systematically coded records from 414 societies that span the past 10,000 years from 30 regions around the world, using 51 measures of social complexity and 4 measures of supernatural enforcement of morality. Our analyses not only confirm the association between moralizing gods and social complexity, but also reveal that moralizing gods follow—rather than precede—large increases in social complexity. Contrary to previous predictions9,12,16,18, powerful moralizing ‘big gods’ and prosocial supernatural punishment tend to appear only after the emergence of ‘megasocieties’ with populations of more than around one million people. Moralizing gods are not a prerequisite for the evolution of social complexity, but they may help to sustain and expand complex multi-ethnic empires after they have become established. By contrast, rituals that facilitate the standardization of religious traditions across large populations25,26 generally precede the appearance of moralizing gods. This suggests that ritual practices were more important than the particular content of religious belief to the initial rise of social complexity.
---
Supernatural agents that punish direct affronts to themselves (for example, failure to perform sacrifices or observe taboos) are commonly represented in global history, but rarely are such deities believed to punish moral violations in interactions between humans2. Recent millennia, however, have seen the rise and spread of several ‘prosocial religions’, which include either powerful ‘moralizing high gods’ (MHG; for example, the Abrahamic God) or more general ‘broad supernatural punishment’ (BSP) of moral transgressions (for example, karma in Buddhism)9,12,16–18. Such moralizing gods may have provided a crucial mechanism for overcoming the classic free-rider problem in large-scale societies11. The association between moralizing gods and complex societies has been supported by two forms of evidence: psychological experiments3,6,27,28 and cross-cultural comparative analyses7,11,14–18,20.
The contributions of theistic beliefs to cooperation, as well as the historical question of whether moralizing gods precede or follow the establishment of large-scale cooperation, have been much debated9,10,12,23,24. Three recent studies that explicitly model temporal causality have come to contrasting conclusions. One study, which applied phylogenetic comparative methods to infer historical changes in Austronesian religions, reported that moralizing gods (BSP but not MHG) preceded the evolution of complex societies16. The same conclusion was reached in an analysis of historical and archaeological data from Viking-age Scandinavia18. By contrast, another study of Eurasian empires has reported that moralizing gods followed—rather than preceded—the rise of complex, affluent societies20. However, all of these studies are restricted in geographical scope and use proxies for social complexity that the authors themselves concede are ‘very crude’20 (for example, the binary classification of societies as either high or low complexity).
To overcome these limitations, we used ‘Seshat: Global History Databank’29, a repository of standardized data on social structure, religion and other domains for hundreds of societies throughout world history. In contrast to other databases that attempt to model history using contemporary ethnographic data, Seshat directly samples over time as well as space. Seshat also includes estimates of expert disagreement and uncertainty, and uses more-detailed variables than many databases.
To test the moralizing gods hypothesis, we coded data on 55 variables from 414 polities (independent political units) that occupied 30 geographical regions from the beginning of the Neolithic period to the beginning of Industrial and/or colonial periods (Fig. 1 and Supplementary Data). We used a recently developed and validated measure of social complexity that condenses 51 social complexity variables (Extended Data Table 5) into a single principal component that captures three quarters of the observed variation, which we call ‘social complexity’8. The remaining four variables were selected to test the MHG and BSP subtypes of the moralizing gods hypothesis. The MHG variable was coded following the MHG variable used as standard in the literature on this topic11,14–17,30, which requires that a high god who created and/or governs the cosmos actively enforces human morality. Because the concept of morality is complex, multidimensional and in some respects culturally relative—and because not all moralizing gods are ‘high gods’—we also coded three different variables related to BSP that are specifically relevant to prosocial cooperation: reciprocity, fairness and in-group loyalty. For analysis, these three variables were combined into a single BSP variable. The Methods, Supplementary Information and http://seshatdatabank.info/methods/codebook provide further methodological details, definitions and justifications, including a discussion of the relationship between MHG, BSP and big gods.
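As a rough illustration of the complexity measure described above, the following Python sketch condenses many numeric social-complexity variables into a single principal component. It is not the authors' code; the file name and column naming convention are hypothetical.

```python
# Illustrative sketch: first principal component of standardized complexity variables.
# Assumes a CSV "seshat_polities.csv" with one row per polity-century and numeric
# complexity columns whose names start with "cc_" (both are assumptions, not Seshat's format).
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

df = pd.read_csv("seshat_polities.csv")
complexity_vars = [c for c in df.columns if c.startswith("cc_")]

X = StandardScaler().fit_transform(df[complexity_vars])  # z-score each variable
pca = PCA(n_components=1)
df["social_complexity"] = pca.fit_transform(X)[:, 0]

# Share of variance captured by the first component (~0.75 for the measure used in the paper)
print(pca.explained_variance_ratio_[0])
```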
Figure 1 and Extended Data Table 1 show the temporal and geographical distribution of the appearance of moralizing gods in our sample. Although societies in all 30 regions possessed beliefs about appeasing supernatural agents through the performance of rituals, in 10 out of the 30 regions, there was no evidence for moralizing gods before their introduction under colonial powers. The remaining 20 regions displayed a diverse range of 15 different systems of belief in moralizing gods: in some, the first evidence of moralizing gods came in the form of MHG and in others it came in the form of BSP (Extended Data Table 1). The first appearance of moralizing gods in our sample was in Egypt, where the concept of supernatural enforcement of Maat (order) is attested by the Second Dynasty, around 2800 BC. This was followed by sporadic appearances in local religions throughout Eurasia (Mesopotamia (around 2200 BC), Anatolia (around 1500 BC) and China (around 1000 BC)) before the wider spread of transnational religions began during the first millennium BC with Zoroastrianism and Buddhism, followed later by Christianity and Islam. Although Christianity and Islam would eventually become the most widespread religions, local forms of moralizing gods were present well before they arrived in most regions (for example, Roman gods were believed to punish oath-breaking from as early as 500 BC, almost a millennium before Christianity was adopted as the official Roman religion). The diverse range of religious systems represented in our global sample makes it possible to draw more general conclusions about religion than have previously been possible.
Although our sampling scheme reduces non-independence, our polities still cannot be considered statistically independent because of the historical relationships among them. We controlled for these using a logistic regression model to account for temporal, geographical and cultural dependencies in the global distribution of moralizing gods (see Methods). This analysis revealed that social complexity was a stronger predictor of moralizing gods than temporal, geographical or linguistic relationships, and remained highly significant even after controlling for these relationships (z = 6.8, degrees of freedom (d.f.) = 800, P < 1 × 10^−11; Extended Data Table 2), conceptually replicating previous studies7,11,14,15.
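To make the dependency-controlled analysis concrete, here is a minimal sketch of a logistic regression of moralizing-gods presence on social complexity plus temporal, spatial and linguistic proximity covariates. It mirrors the kind of model described above rather than the authors' exact specification; the data file and all column names are hypothetical.

```python
# Illustrative sketch: logistic regression predicting presence of moralizing gods
# while controlling for temporal, spatial and linguistic proximity to other cases.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("seshat_polities.csv")  # hypothetical file, one row per polity-century

predictors = df[[
    "social_complexity",   # first principal component of the complexity variables
    "time_proximity",      # closeness in time to other cases with moralizing gods
    "space_proximity",     # geographical closeness to such cases
    "language_proximity",  # linguistic (cultural) closeness to such cases
]]
X = sm.add_constant(predictors)
y = df["moralizing_gods_present"]  # 0/1 outcome

model = sm.Logit(y, X).fit()
print(model.summary())  # per-predictor z-statistics; the paper reports z = 6.8 for complexity
```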
The moralizing gods hypothesis posits a ‘statistical causal relationship’10 in which moralizing gods facilitate the evolution of complex societies9,12,16–18. This indicates that, on average, social complexity should increase more rapidly following the appearance of moralizing gods. To test this prediction, we conducted time-series analyses of the 12 regions for which social complexity data were available both before and after the appearance of moralizing gods (Fig. 2, Extended Data Table 1 and Extended Data Fig. 1). Notably, average rates of increase of social complexity were over five times greater before—not after—the appearance of moralizing gods (paired t-test, t = −6.6, d.f. = 199, P < 1 × 10^−9; Fig. 2). This trend was significant both globally and individually for 10 out of the 12 regional time-series analyses (Extended Data Table 1 and Extended Data Fig. 1). None of these 12 regions displayed a significantly greater rate of increase in social complexity after the appearance of moralizing gods than before. Robustness analyses showed that our primary finding of higher rates of increasing social complexity before the appearance of moralizing gods was present regardless of the type of moralizing gods (MHG or BSP), the choice of variables used to estimate social complexity, uncertainty in the timing of appearance of moralizing gods, or the time windows used to estimate rates of change in social complexity (Extended Data Table 4).
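The before/after comparison could be sketched along the following lines. The data layout and the per-region pairing are simplified assumptions (the paper's paired test uses matched time windows, hence d.f. = 199), so this is illustrative only.

```python
# Illustrative sketch: compare rates of change in social complexity before versus
# after the first appearance of moralizing gods in each region.
import pandas as pd
from scipy.stats import ttest_rel

# Hypothetical layout: columns region, century, social_complexity, mg_year
df = pd.read_csv("seshat_timeseries.csv")

rates_before, rates_after = [], []
for region, g in df.groupby("region"):
    g = g.sort_values("century")
    mg_year = g["mg_year"].iloc[0]          # first attested moralizing gods in this region
    rate = g["social_complexity"].diff()    # change in complexity per century
    rates_before.append(rate[g["century"] < mg_year].mean())
    rates_after.append(rate[g["century"] >= mg_year].mean())

# Paired comparison across regions (the paper reports t = -6.6 on matched windows)
t, p = ttest_rel(rates_after, rates_before)
print(t, p)
```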
In summary, although our analyses are consistent with previous studies that show an association between moralizing gods and complex societies7,11,14–18,30, we find that moralizing gods usually follow—rather than precede—the rise of social complexity. Notably, most societies that exceeded a certain social complexity threshold developed a conception of moralizing gods. Specifically, in 10 out of the 12 regions analysed, the transition to moralizing gods came within 100 years of exceeding a social complexity value of 0.6 (which we call a megasociety, as it corresponds roughly to a population in the order of one million; Extended Data Fig. 1). This megasociety threshold does not seem to correspond to the point at which societies develop writing, which might have suggested that moralizing gods were present earlier but were not preserved archaeologically. Although we cannot rule out this possibility, the fact that written records preceded the development of moralizing gods in 9 out of the 12 regions analysed (by an average period of 400 years; Supplementary Table 2)—combined with the fact that evidence for moralizing gods is lacking in the majority of non-literate societies2—suggests that such beliefs were not widespread before the invention of writing. The few small-scale societies that did display precolonial evidence of moralizing gods came from regions that had previously been used to support the claim that moralizing gods contributed to the rise of social complexity (Austronesia16 and Iceland18), which suggests that such regions are the exception rather than the rule.
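A minimal sketch of the megasociety-threshold check described above, under the same hypothetical data layout: for each region, find when social complexity first exceeds 0.6 and how long afterwards moralizing gods are first attested.

```python
# Illustrative sketch: lag between crossing the 0.6 "megasociety" threshold and the
# first attested moralizing gods, per region. Column names are hypothetical.
import pandas as pd

df = pd.read_csv("seshat_timeseries.csv")  # region, century, social_complexity, mg_year

for region, g in df.groupby("region"):
    g = g.sort_values("century")
    crossed = g.loc[g["social_complexity"] > 0.6, "century"]
    if crossed.empty:
        continue  # region never reaches the megasociety threshold
    lag = g["mg_year"].iloc[0] - crossed.iloc[0]  # years from threshold to moralizing gods
    print(f"{region}: moralizing gods appear {lag:+} years after crossing 0.6")
```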
Conversely, of the societies in the ten regions that did not develop precolonial moralizing gods, only one exceeded the megasociety threshold (the short-lived Inca Empire, social complexity = 0.61). This suggests that, even if moralizing gods do not cause the evolution of complex societies, they may represent a cultural adaptation that is necessary to maintain cooperation in such societies once they have exceeded a certain size, perhaps owing to the need to subject diverse populations in multi-ethnic empires to a common higher-level power9. This may explain why moralizing gods spread when large empires conquer smaller—but still complex—societies (for example, the Spanish conquest of the Incas). In some cases, moralizing doctrines may have helped to stabilize empires, while also limiting further expansion; for example, when emperor Ashoka adopted Buddhism and renounced war following his final conquest of the Kalinga Kingdom, which established the maximum extent of the Mauryan empire.
Although our results do not support the view that moralizing gods were necessary for the rise of complex societies, they also do not support a leading alternative hypothesis that moralizing gods only emerged as a byproduct of a sudden increase in affluence during a first millennium BC ‘Axial Age’19–22. Instead, in three of our regions (Egypt, Mesopotamia and Anatolia), moralizing gods appeared before 1500 BC. We propose that the standardization of beliefs and practices via high-frequency repetition and enforcement by religious authorities enabled the unification of large populations for the first time, establishing common identities across states and empires25,26. Our data show that doctrinal rituals standardized by routinization (that is, those performed weekly or daily) or institutionalized policing (religions with multiple hierarchical levels) significantly predate moralizing gods, by an average of 1,100 years (t = 2.8, d.f. = 11, P = 0.018; Fig. 2a). Doctrinal rituals precede moralizing gods in 9 out of the 12 regions analysed, and even precede written records in 6 of these cases (by as much as 4,000 years in the case of Çatalhöyük in Anatolia; see Supplementary Table 2). Although analyses of rates of change of social complexity before and after the appearance of doctrinal rituals do not offer conclusive support for the hypothesis that doctrinal rituals facilitate increasing social complexity (Extended Data Table 3), these data do at least suggest that doctrinal rituals led to the establishment of large-scale religious identities. In the future, higher-quality and higher-resolution archaeological data may allow for a more nuanced understanding of the timing and possible coevolution of the rise of doctrinal rituals and moralizing gods. Such data appear unlikely to affect our primary claim that complex societies preceded moralizing gods, but this is an empirical question open to future testing.
We demonstrate how quantifying cultural characteristics of past societies can contribute to longstanding debates about the evolution of social complexity. Our results suggest that belief in moralizing gods was not the only or even the main factor that enabled the expansion of human societies, but may have occurred along with other features of ritual practices and religion to facilitate cooperation in increasingly complex social systems. In particular, an increase in ritual frequency and doctrinal control may have facilitated the establishment of large-scale collective identities before the spread of beliefs in moralizing gods. Thus, when it comes to the initial rise of social complexity, how you worship may ultimately have been more important than who you worship.
Why the Days Seem Shorter as We Get Older: The ‘mind time’ is a sequence of images; the rate at which changes in mental images are perceived decreases with age
Why the Days Seem Shorter as We Get Older. Adrian Bejan. European Review, Mar 18 2019. https://doi.org/10.1017/S1062798718000741
Abstract: Why does it feel that the time passes faster as we get older? What is the physical basis for the impression that some days are slower than others? Why do we tend to focus on the unusual (the surprise), not on the ever present? This article unveils the physics basis for these common observations. The reason is that the measurable ‘clock time’ is not the same as the time perceived by the human mind. The ‘mind time’ is a sequence of images, i.e. reflections of nature that are fed by stimuli from sensory organs. The rate at which changes in mental images are perceived decreases with age, because of several physical features that change with age: saccades frequency, body size, pathways degradation, etc. The misalignment between mental-image time and clock time serves to unite the voluminous observations of this phenomenon in the literature with the constructal law of evolution of flow architecture, as physics.
---
Perceptions
Among the most common human perceptions is that time passes faster as an individual becomes older. The days become shorter, and so do the years. We all have stories of this kind, from the long days of childhood and the never-ending class hours in elementary school, to days, months and years that now pass in a blur. The most common sayings convey this impression: Time flies; Where did the time go?; Last year was yesterday; Growing up took forever; A watched pot never boils; etc.
More subtle, and worth questioning, is the impression that some days appear to pass more slowly than others. The ‘slower’ days are full of productivity, events, and memories of what happened. If you did not notice this difference between slow days and fast days, then you should pay attention to it, because in this difference lies the explanation for the lifelong puzzle sketched in the preceding paragraph. The hint is that productive days happen when the body and mind are rested, after periods of regular sleep, when in the morning you look in the mirror and you see a younger you, not a tired you. Athletes learn the hard way the correlation between good rest and the speed of the passing time. Lack of rest makes you miss plays, unable to anticipate, unable to see the ball before it arrives. While sleepwalking, the game is over before you know it.
Young students learn the same physical truth while taking exams during a fixed time interval. The rested mind has more time to go through the problems, to find mistakes, to go back to the beginning, and try again. Lack of sleep, due to cramming the night before the exam, makes the time pass faster during the exam period. Cramming does not pay, but rest does, which is why the good coach rests the team before the big game.
Here is why this is important to you, the reader. Today, many young people experience time distortion because they spend too much time on social media. This has serious consequences, ranging from sleep deprivation to mood changes and mental disorder. This is why an understanding of the physics basis of how humans perceive the passing of time is essential.
Physics
Time represents perceived changes in stimuli (observed facts), such as visual images.1,2 The human mind perceives reality (nature, physics) through images that occur as visual inputs reach the cortex. The mind senses ‘time change’ when the perceived image changes. The time arrow in physics is the goal-oriented sequence of changes in flow configuration, the direction dictated by the constructal law.1–11 The present is different from the past because the mental viewing has changed, not because somebody’s clock rings.
The ‘clock time’ that unites all the live flow systems, animate and inanimate, is measurable. The day–night period lasts 24 hours on all watches, wall clocks and bell towers. Yet, physical time is not mind time. The time that you perceive is not the same as the time perceived by another. Why? Because the young mind receives more images during one day than the same mind in old age. Said another way, if the lifespan is measured in terms of the number of images perceived during life, then the frequency of mental images at a young age is greater than in old age (Figure 1). Here is why this should be:
Figure 1. The misalignment between perceived time and clock time during lifetime.
The sensory inputs that travel into the human body to become mental images – ‘reflections’ of reality in the human mind – are intermittent. They occur at certain time intervals (t1), and must travel the body length scale (L) with a certain speed (V). In the case of vision, t1 is the time interval between successive saccades. The time required by one mental image to travel from a sensory organ to the cortex is of order t2 ~ L/V. During life, the body length scale (L) increases in proportion to the body mass M raised to the power 1/3, and, like all growth phenomena, the body mass increases over time in S-curve fashion,12 monotonically, slow–fast–slow, cf. Figure 2.
Figure 2. All growth phenomena (spreading, collecting) exhibit an S-shaped history curve8: four flow systems where the size of the flow space increases monotonically, slow–fast–slow.
Figure 3. The length of the flow path increases as the body size and complexity increase.9
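In compact notation, the scaling argument developed in the text can be summarized as follows; this is a sketch restating the relations above (t1, t2, L, V, M as defined in the text), not additional results.

```latex
% Scaling sketch of the argument in the text.
% t1: interval between successive mental images (e.g., between saccades)
% t2: travel time of one image from sensory organ to cortex
\begin{align}
  t_2 &\sim \frac{L}{V}, \qquad L \propto M^{1/3} \\
  \tau &= t_1 + t_2 \quad\text{(physical time elapsed per perceived image)} \\
  f &= \frac{1}{\tau} \quad\text{(frequency of mental images)}
\end{align}
% With age, L grows (and the cortical flow paths lengthen) while V decreases,
% so tau increases and f decreases: fewer perceived "frames" per clock hour,
% which is the proposed reason clock time seems to pass faster with age.
```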
The length traveled by inputs from external sensors to the cortex is actually greater than L, and it increases with age. The reason is that the complexity of the flow path needed by one signal to reach one point on the cortex increases as the brain grows and the complexity of the tree-shaped flow paths increases, cf. Figure 3.13 The broad trend then is that L increases with age. At the same time, V decreases because of the ageing (degradation) of the flow paths. The key feature is that the physical time (the combined effect of t1 and t2) required by the occurrence of one mental image increases monotonically during the life of the individual. The frequency of mental images decreases monotonically, and non-uniformly (i.e. not at a constant rate). This trend is illustrated qualitatively in Figure 1. Two summarizing conclusions follow. (i) More of the recorded mental images should be from youth. (ii) The ‘speed’ of the time perceived by the human mind should increase over life. The rate at which the physical time clock ‘ticks’ during one change in the mental image increases with age.
Review
The misalignment of the clock ticks and the changes perceived by the mind (Figure 1) brings together numerous observations and measurements accumulated in the literature, especially in the study of vision and cognition. First, to define the terms, Fischer and Weber explain that during natural viewing conditions a normal adult subject makes 3–5 saccades per second, separated by periods of 200–300 ms during which the eyes do not make large or fast movements.14 These periods are usually called ‘fixations’. If the retinal image, as a whole, is prevented from moving (by successful voluntary attempts not to move the eyes, or by technical means), vision rapidly becomes blurred and the perception of the retinal image fades away completely within 10 seconds. Fischer and Weber14 explain that the highly inhomogeneous structure of the primate retina, with an extremely high density of receptor and ganglion cells in the center, a specialized fovea, and a rapid decline of the cell densities toward the periphery, makes it almost impossible to have a homogeneous and simultaneous percept of the total visual field without somehow moving the fovea to different positions and acquiring and integrating information from these successive ‘looks’. The existence of a fovea requires both eye movements and periods of fixation, that is, the active suppression of saccadic eye movements. Although the reaction times of saccades are relatively stable (200–250 ms),15 infant fixation times are shorter than those in adults. In primates, there is a constant relationship between the duration, peak velocity and amplitude of saccadic eye movement,16–18 known as the ‘main sequence’, in which saccade trajectories have evolved toward optimizing the trade-off between accuracy and duration (speed) of the eye movement.
This is also in accord with the physics basis for the human preference for displays shaped in ‘golden-ratio’ rectangular frames, which is the shape that is scanned the fastest by the two human eyes.2 As a result of an interaction between afferent, central and efferent neural processes we perceive a complete and stable visual field, which can serve as a frame within which we see motion and within which we move ourselves or parts of our body.14
Bahill and Stark showed that fatigue can produce overlapping saccades in which the high-frequency saccadic bursts should show large pauses, glissades in which the high-frequency bursts should be much shorter than appropriate for the size of the intended saccades, and low-velocity, long-duration, non-main-sequence saccades in which the motoneuronal bursts should be of lower frequency and longer duration than normal.19 When the saccadic eye movement system fatigues, saccades become slower, and the neurological control signal stratagem changes. The term fatigue is used in a broad sense, as it was by McFarland: ‘a group of phenomena associated with impairment, or loss, of efficiency and skill’.20
The intuitive view that the world is processed as a seamless stream of ongoing perception has been challenged in the current literature. Herzog et al. discussed experimental evidence supporting the view that perception might be discrete, further supporting evidence for discrete theories.21 Visual information processing is similar to a sample-and-hold mechanism in engineering, as in analog/digital converters. Herzog et al. also noted that the brain functions such that we consciously perceive only the most plausible solution, and not a confusing manifold of possibilities that occur during unconscious processing.21 The unconscious feature integration period is the period of sense-making. The discrete conscious perception is followed by unconscious processing over time. These two modes of absorbing inputs from the surroundings are analogous to all other flows from point (e.g. eye) to volume (e.g. brain).1 Observing fast and then letting it sink in slowly is the same dynamic flow design as the long and fast, and short and slow, that inhabits all nature,1 animate and inanimate. Conscious perception and the unconscious processing that follows are the ‘invasion’ and ‘consolidation’ phases of the universal S-curve phenomenon.12
Cicchini et al. review the classical model of time perception, which considers a single centralized clock that ticks at a constant rate.22 They point out that much experimental evidence seems to cast doubt on this model. The ability to pay ‘attention’ could modulate the tick rate and hence the duration of the events.23 Many studies found that the most surprising stimulus within a train of events is perceived as longer, probably because it engages more transient attention or because the event is less predictable. Bruno et al. show that the apparent duration of moving visual objects is greater at higher than at lower speeds.24
Pöppel et al. argued that cognitive processes cannot be understood without their temporal dynamics; furthermore, certain logistical problems the brain has to deal with require an understanding of temporal processing.25 Eagleman et al. discussed the flash-lag illusion, where a flash and a moving object in the same location appear to be offset.26 They proposed an alternative in which visual awareness is neither predictive nor online but is postdictive, so that the percept attributed to the time of the flash is a function of events that happen in the ~80 ms after the flash.
Interpolation of the past is the only framework that provides a unified explanation for the flash-lag phenomenon. VanRullen and Koch reconciled the unduly abandoned topic of discrete perception with current views and advances in neuroscience.27 Hainline et al. showed that for both infants and adults, linear relationships were found between the peak velocities of fast eye movements and their amplitudes (main sequences).28
With regard to the effect of aging, Sharpe and Zackon investigated horizontal saccades in young, middle-aged and elderly normal subjects.29 Saccades were elicited in response to three target conditions: predictable amplitude, direction and timing; unpredictable amplitudes and directions at regular intervals; and unpredictably timed targets of predictable amplitude and direction. Peak velocities were significantly reduced in the elderly when target amplitude and direction were predictable. Latencies were prolonged in the elderly under all conditions. Saccadic accuracy was significantly decreased in elderly subjects. Support for the thesis that age-related cognitive slowing is global is provided by Myerson et al.30 Although older adults perform worse than younger adults in complex decision-making scenarios, prior experience should be taken into account in aging studies.31
Conclusion
Summing up, we conclude that the perceived misalignment between mental-image time and clock time (Figure 1) is in accord with and unifies the growing number of observations that describe aspects of this phenomenon in the literature. The physics basis is captured by the constructal law of evolution in nature (see Figures 2 and 3).
Acknowledgement
Professor Adrian Bejan’s research was supported by the US National Science Foundation.
Abstract: Why does it feel that the time passes faster as we get older? What is the physical basis for the impression that some days are slower than others? Why do we tend to focus on the unusual (the surprise), not on the ever present? This article unveils the physics basis for these common observations. The reason is that the measurable ‘clock time’ is not the same as the time perceived by the human mind. The ‘mind time’ is a sequence of images, i.e. reflections of nature that are fed by stimuli from sensory organs. The rate at which changes in mental images are perceived decreases with age, because of several physical features that change with age: saccades frequency, body size, pathways degradation, etc. The misalignment between mental-image time and clock time serves to unite the voluminous observations of this phenomenon in the literature with the constructal law of evolution of flow architecture, as physics.
---
Perceptions
Among the most common human perceptions is that time passes faster as an indivi-dual becomes older. The days become shorter, and so do the years. We all have storiesof this kind, from the long days of childhood and the never-ending class hours inelementary school, to days, months and years that now pass in a blur. The mostcommon sayings convey this impression: Timesflies; Where did the time go?; Lastyear was yesterday; Growing up took forever; A watched pot never boils; etc.More subtle, and worth questioning is the impression that some days appear to passmore slowly than others. The‘slower’days are full of productivity, events, and mem-ories of what happened. If you did not notice this difference between slow days and fastdays, then you should pay attention to it, because in this difference lies the explanationfor the lifelong puzzle sketched in the preceding paragraph. The hint is that productivedays happen when the body and mind are rested, after periods of regular sleep, when inthe morning you look in the mirror and you see a younger you, not a tired you.Athletes learn the hard way the correlation between good rest and the speed of thepassing time. Lack of rest makes you miss plays, unable to anticipate, unable to seethe ball before it arrives. While sleep walking, the game is over before you know it.
Young students learn the same physical truth while taking exams during afixedtime interval. The rested mind has more time to go through the problems, tofindmistakes, to go back to the beginning, and try again. Lack of sleep, due to crammingthe night before the exam, makes the time pass faster during the exam period.Cramming does not pay, but rest does, which is why the good coach rests the teambefore the big game.Here is why this is important to you, the reader. Today, many young peopleexperience time distortion because they spend too much time on social media. Thishas serious consequences, ranging from sleep deprivation to mood changes andmental disorder. This is why an understanding of the physics basis of how humansperceive the passing of time is essential.PhysicsTime represents perceived changes in stimuli (observed facts), such as visual images.1,2The human mind perceives reality (nature, physics) through images that occur as visualinputs reach the cortex. The mind senses‘time change’when the perceived image chan-ges. The time arrow in physics is the goal-oriented sequence of changes inflow config-uration, the direction dictated by the constructal law.1-11The present is different from thepast because the mental viewing has changed, not because somebody’s clock rings.The‘clock time’that unites all the liveflow systems, animate and inanimate, ismeasurable. The day–night period lasts 24 hours on allwatches, wall clocks and belltowers. Yet, physical time is not mind time. The time that you perceive is not thesame as the time perceived by another. Why? Because the young mind receives moreimages during one day than the same mind in old age. Said another way, if thelifespan is measured in terms of the number of images perceived during life, then thefrequency of mental images at young age is greater than in old age (Figure 1). Hereis why this should be:
Figure 1.The misalignment between perceived time and clock time during lifetime.
The sensory inputs that travel into the human body to become mental images–‘reflections’of reality in the human mind–are intermittent. They occur at certaintime intervals (t1), and must travel the body length scale (L) with a certain speed (V).In the case of vision,t1is the time interval between successive saccades. The timerequired by one mental image to travel from a sensory organ to the cortex is of ordert2~L/V. During life, the body length scale (L) increases in proportion with the bodymassMraised to the power 1/3, and, like all growth phenomena, the body massincreases over time in S-curve fashion,12monotonically, slow–fast–slow, cf.Figure 2.
Figure 2.All growth phenomena (spreading, collecting) exhibit an S-shaped historycurve8: fourflow systems where the size of theflow space increases monotonically,slow–fast–slow.
Figure 3.The length of theflow path increases as the body size and complexityincrease.9
The length traveled by inputs from external sensors to the cortex is actually greaterthanL, and it increases with age. The reason is that the complexity of theflow pathneeded by one signal to reach one point on the cortex increases as the brain grows andthe complexity of the tree-shapedflow paths increase, cf. Figure 3.13The broad trend then is thatLincreases with age. At the same time,Vdecreasesbecause of the ageing (degradation) of theflow paths. The key feature is that thephysical time (the combined effect oft1andt2) required by the occurrence of one mentalimage increases monotonically during the life of the individual. The frequency ofmental images decreases monotonically, and non-uniformly (i.e. not at constant rate).This trend is illustrated qualitatively in Figure 1. Two summarizing conclusions follow.(i) More of the recorded mental images should be from youth.(ii) The‘speed’of the time perceived by the human mind should increaseover life. The rate at which the physical time clock‘ticks’during onechange in the mental image increases with age.ReviewThe misalignment of the clock ticks and the changes perceived by the mind (Figure 1)brings together numerous observations and measurements accumulated in the lit-erature, especially in the study of vision and cognition.First, to define the terms, Fischer and Weber explain that during natural viewingconditions a normal adult subject makes 3–5 saccades in a second separated by per-iods of 200-300 ms during which the eyes do not make large or fast movements.14These periods are usually called‘fixations’. If the retinal image, as a whole, is pre-vented from moving (by successful voluntary attempts not to move the eyes, or bytechnical means), vision rapidly becomes blurred and the perception of the retinalimage fades away completely within 10 seconds. Fischer and Weber14explain that thehighly inhomogeneous structure of the primate retina, with an extremely high densityof receptor and ganglion cells in the center, a specialized fovea, and a rapid decline ofthe cell densities toward the periphery, makes it almost impossible to have a homogeneous and simultaneous percept of the total visualfield without somehowmoving the fovea to different positions and acquiring and integrating informationfrom these successive‘looks’. The existence of a fovea requires both eye movementsand periods offixation, that is, the active suppression of saccadic eye movements.Although the reaction times of saccades is relatively stable (200–250 ms),15theinfantfixation times are shorter than in adults. In primates, there is a constant rela-tionship between the duration, peak velocity and amplitude of saccadic eye move-ment,16–18known as the‘main sequence’, in which saccade trajectories have evolvedtoward optimizing the trade-off between accuracy and duration (speed) of the eyemovement. 
This is also in accord with the physics basis for the human preference for displays shaped in 'golden-ratio' rectangular frames, which is the shape that is scanned the fastest by the two human eyes [2]. As a result of an interaction between afferent, central and efferent neural processes we perceive a complete and stable visual field, which can serve as a frame within which we see motion and within which we move ourselves or parts of our body [14].

Bahill and Stark showed that fatigue can produce overlapping saccades, in which the high-frequency saccadic bursts should show large pauses; glissades, in which the high-frequency bursts should be much shorter than appropriate for the size of the intended saccades; and low-velocity, long-duration, non-main-sequence saccades, in which the motoneuronal bursts should be of lower frequency and longer duration than normal [19]. When the saccadic eye movement system fatigues, saccades become slower, and the neurological control signal stratagem changes. The term fatigue is used in a broad sense, as it was by McFarland: 'a group of phenomena associated with impairment, or loss, of efficiency and skill' [20].

The intuitive view that the world is processed as a seamless stream of ongoing perception has been challenged in the current literature. Herzog et al. discussed experimental evidence that perception might be discrete, lending further support to discrete theories [21]. Visual information processing is similar to a sample-and-hold mechanism in engineering, as in analog/digital converters. Herzog et al. also noted that the brain functions such that we consciously perceive only the most plausible solution, and not a confusing manifold of possibilities that occur during unconscious processing [21]. The unconscious feature integration period is the period of sense-making. The discrete conscious perception is followed by unconscious processing over time. These two modes of absorbing inputs from the surroundings are analogous to all other flows from point (e.g. eye) to volume (e.g. brain) [1]. Observing fast and then letting it sink in slowly is the same dynamic flow design as the long and fast, and the short and slow, that inhabits all of nature [1], animate and inanimate. Conscious perception and the unconscious processing that follows are the 'invasion' and 'consolidation' phases of the universal S-curve phenomenon [12].

Cicchini et al. review the classical model of time perception, which considers a single centralized clock that ticks at a constant rate [22]. They point out that much experimental evidence seems to cast doubt on this model. The ability to pay 'attention' could modulate the tick rate and hence the duration of the events [23]. Many studies found that the most surprising stimulus within a train of events is perceived as longer, probably because it engages more transient attention or because the event is less predictable. Bruno et al. show that the apparent duration of moving visual objects is greater at higher than at lower speeds [24].

Pöppel et al. argued that cognitive processes cannot be understood without their temporal dynamics; furthermore, certain logistical problems the brain has to deal with require an understanding of temporal processing [25]. Eagleman et al. discussed the flash-lag illusion, where a flash and a moving object in the same location appear to be offset [26]. They proposed an alternative in which visual awareness is neither predictive nor online but is postdictive, so that the percept attributed to the time of the flash is a function of events that happen in the ~80 ms after the flash.
Interpolation of the past is the only framework that provides a unified explanation for the flash-lag phenomenon. VanRullen and Koch reconciled the unduly abandoned topic of discrete perception with current views and advances in neuroscience [27]. Hainline et al. showed that for both infants and adults, linear relationships were found between the peak velocities of fast eye movements and their amplitudes (main sequences) [28].

With regard to the effect of aging, Sharpe and Zackon investigated horizontal saccades in young, middle-aged and elderly normal subjects [29]. Saccades were elicited in response to three target conditions: predictable amplitude, direction and timing; unpredictable amplitudes and directions at regular intervals; and unpredictably timed targets of predictable amplitude and direction. Peak velocities were significantly reduced in the elderly when target amplitude and direction were predictable. Latencies were prolonged in the elderly under all conditions. Saccadic accuracy was significantly decreased in elderly subjects. Support for the thesis that age-related cognitive slowing is global is provided by Myerson et al. [30]. Although older adults perform worse than younger adults in complex decision-making scenarios, prior experience should be taken into account in aging studies [31].
Conclusion
Summing up, we conclude that the perceived misalignment between mental-image time and clock time (Figure 1) is in accord with, and unifies, the growing number of observations that describe aspects of this phenomenon in the literature. The physics basis is captured by the constructal law of evolution in nature (see Figures 2 and 3).
Acknowledgement
Professor Adrian Bejan's research was supported by the US National Science Foundation.
In rats, no sex difference was found in reward-guided associative learning, but females were more sensitive to probabilistic punishment (and less sensitive when punishment could be avoided with certainty)
Sex differences in reward- and punishment-guided actions. Tara G. Chowdhury et al. bioRxiv, Mar 18 2019. https://doi.org/10.1101/581546
ABSTRACT: Differences in the prevalence and presentation of psychiatric illnesses in men and women suggest that neurobiological sex differences confer vulnerability or resilience in these disorders. Rodent behavioral models are critical for understanding the mechanisms of these differences. Reward processing and punishment avoidance are fundamental dimensions of the symptoms of psychiatric disorders. Here we explored sex differences along these dimensions using multiple and distinct behavioral paradigms. We found no sex difference in reward-guided associative learning but a faster punishment-avoidance learning in females. After learning, females were more sensitive than males to probabilistic punishment but less sensitive when punishment could be avoided with certainty. No sex differences were found in reward-guided cognitive flexibility. Thus, sex differences in goal-directed behaviors emerged selectively when there was an aversive context. These differences were critically sensitive to whether the punishment was certain or unpredictable. Our findings with these new paradigms provide conceptual and practical tools for investigating brain mechanisms that account for sex differences in susceptibility to anxiety and impulsivity. They may also provide insight for understanding the evolution of sex-specific optimal behavioral strategies in dynamic environments.
---
INTRODUCTION
Men and women show different rates of diagnosis, symptomology, and treatment responsivity in most brain disorders. For instance, men are more commonly diagnosed with schizophrenia, four times more likely to suffer from attention-deficit/hyperactivity disorder (ADHD) (Rowland, Lesesne, & Abramowitz, 2002), and twice as likely to be currently abusing illicit drugs (Substance Abuse and Mental Health Services Administration, 2013). On the other hand, major depressive disorder (MDD) and anxiety disorders are more common in women (Cyranowski, Frank, Young, & Shear, 2000). These patterns suggest that biological sex differences may underlie the disparities in vulnerability to these illnesses.
Reward-seeking and punishment-avoidance behaviors are fundamental to motivation, and are critical components of the pathophysiology of most psychiatric illnesses. For example, studies have shown that patients with depressive disorders (Pizzagalli et al., 2009), schizophrenia (Juckel, 2016; Kirsch, Ronshausen, Mier, & Gallhofer, 2007; Schlagenhauf et al., 2008), as well as ADHD (Scheres, Milham, Knutson, & Castellanos, 2007; Strohle et al., 2008) have modified brain responses to reward. In particular, major depression is associated with both a hyposensitivity to rewarding stimuli and hypersensitivity to punishment, and associated imaging findings are correlated with increasing frequency of depressive episodes (Kumar et al., 2018).
Importantly, disorders where reward and aversive responding are differently affected also show different incidences and symptom profiles in women versus men. For example, in anxiety disorders there is an increased avoidance of aversive stimuli that is more pronounced in women (Sheynin et al., 2014). Similarly, in MDD, men are more likely to exhibit symptoms of aggression, substance abuse, and risky behavior, while females report higher rates of sleep disturbance, stress, and anhedonia (L. A. Martin, Neighbors, & Griffith, 2013). Sex also appears to influence substance abuse patterns, with women more likely than men to cite stress as the reason for initiating or relapsing drug use (Becker, McClellan, & Reed, 2017). Finally, in ADHD, females diagnosed with the disorder more frequently experience comorbid anxiety and depression. These findings suggest that distinct responses to punishment may explain some of the observed sex differences in both the prevalence and expression of mental disorders.
Animal models are critical for understanding the behavioral neuroscience of sex differences, allowing assessment of motivated behaviors relevant to psychiatric disorders. Animal behavioral models, however, have been primarily developed and characterized with only male subjects. Additionally, these models often assess reward and punishment contingencies in separate tasks and constructs. In naturalistic environments, appetitive and aversive contexts and outcomes are often intertwined, and the ability to accurately assess and respond to these conflicting situations is critical for optimal decision-making. Females have been shown to respond differently than males to punishing stimuli (Denti & Epstein, 1972; Gruene, Flick, Stefano, Shea, & Shansky, 2015; Orsini, Willis, Gilbert, Bizon, & Setlow, 2016; Voulo & Parsons, 2017) as well as stress (Bangasser & Valentino, 2014; McEwen, 2014); however, little is known about how sex differences translate to tasks that involve both reward and punishment in the same behavioral series. Here, we investigate sex differences in multiple rodent behavioral tasks involving motivated behaviors (defined as behaviors where actions are guided by action-outcome contingencies) that integrate reward-seeking and punishment avoidance.
In the first task, the Punishment Risk Task (PRT), a rewarded action was associated with an escalating probability of punishment (Park & Moghaddam, 2017). Thus, the action-reward contingency was certain, but different blocks in the same behavioral series were associated with varying probability of receiving a shock after action execution. This task assesses behavior in response to unpredictability and perception of potential threats, which is relevant to human models of anxiety (Cornwell, Garrido, Overstreet, Pine, & Grillon, 2017). Human sex differences in prevalence and expression of anxiety-related behaviors suggest that males and females may respond differently to anxiety-provoking situations. This task, therefore, allowed us to compare motivated actions in male and female rats within a context of anxiety.
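To make the structure of such a session concrete, here is a minimal Python sketch of a PRT-like trial sequence; the number of blocks, trials per block, and shock probabilities are placeholder values chosen for illustration, not the parameters used by Park & Moghaddam (2017).

```python
import random

def prt_session(n_blocks=3, trials_per_block=10,
                shock_probs=(0.0, 0.25, 0.5), seed=0):
    """Sketch of a Punishment Risk Task-like session.

    The rewarded action always earns the reward (the action-reward
    contingency is certain), but each successive block carries a higher
    probability that the same action is also followed by a foot-shock.
    All parameter values here are illustrative placeholders.
    """
    rng = random.Random(seed)
    trials = []
    for block, p_shock in zip(range(n_blocks), shock_probs):
        for t in range(trials_per_block):
            trials.append({
                "block": block,
                "trial": t,
                "reward": True,                   # reward is certain
                "shock": rng.random() < p_shock,  # punishment is probabilistic
            })
    return trials

if __name__ == "__main__":
    session = prt_session()
    shocks = [sum(tr["shock"] for tr in session if tr["block"] == b)
              for b in range(3)]
    print("shock trials per block:", shocks)
```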
A second novel task, the Approach Avoid Task (AAT), measured actions to seek reward or avoid punishment during the same behavioral session. Rats were given simultaneous access to two distinct actions (lever-press or nose-poke), and two discriminative stimuli (a light or tone cue) signaled the trial type: either approach, in which a specific action was reinforced with a pellet, or avoidance, in which the other action prevented onset of a foot-shock. This task is relevant to symptoms of brain disorders such as anxiety, substance abuse, and MDD, which are associated with differences in processing of rewarding and punishing contexts (Dombrovski, Szanto, Clark, Reynolds, & Siegle, 2013; McCabe, Woffindale, Harmer, & Cowen, 2012). In addition to comparing learning and performance of male and female rats in this task, we conducted two experiments to gauge the impact of traditional models of anxiety in the context of this paradigm. These included measuring (1) the dose-response effect of a pharmacological model of anxiety (the anxiogenic drug FG7142) on performance of the AAT and (2) elevated plus maze (EPM) performance of animals that had undergone training on the AAT compared to naïve animals.
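A comparably minimal sketch of the AAT trial logic described above is given below; the specific cue-to-trial-type and action assignments are hypothetical, chosen only to illustrate how approach and avoidance trials could be scored within one session.

```python
import random

# Hypothetical mappings: in the task, a light or tone cue signals whether the
# trial is "approach" (one action earns a pellet) or "avoid" (the other action
# prevents a foot-shock). Which cue maps to which trial type, and which action
# is correct, are illustrative choices here.
CUE_TO_TRIAL = {"light": "approach", "tone": "avoid"}
CORRECT_ACTION = {"approach": "lever_press", "avoid": "nose_poke"}

def score_trial(cue, action):
    """Return the outcome of one AAT-like trial for a given cue and action."""
    trial_type = CUE_TO_TRIAL[cue]
    correct = action == CORRECT_ACTION[trial_type]
    return {
        "trial_type": trial_type,
        "pellet": trial_type == "approach" and correct,  # reward earned
        "shock": trial_type == "avoid" and not correct,  # shock not avoided
        "correct": correct,
    }

if __name__ == "__main__":
    rng = random.Random(1)
    for _ in range(5):
        cue = rng.choice(["light", "tone"])
        action = rng.choice(["lever_press", "nose_poke"])
        print(cue, action, score_trial(cue, action))
```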
Finally, we characterized sex differences in reward-motivated tasks that required behavioral inhibition and flexibility. Most disorders with higher prevalence in men than women (autism, ADHD, schizophrenia) involve impaired behavioral inhibition and flexibility. In addition, male predisposition to compulsive behavior may contribute to the higher rates of substance abuse and dependence in men [20]. To investigate these constructs, we used two operant tests of cognitive flexibility: reversal learning and extradimensional shifting. Both tasks required subjects to update behavior in response to changing rules.