Friday, April 26, 2024

Ethics and Human Behavioral Modernity

1. Introduction

It was Friedrich Nietzsche who highlighted that morality itself needs to be subjected to critical scrutiny—his call for a revaluation of all values (Nietzsche, 2013). For such an endeavor, the fundamental question that must be addressed first is this: how did it come to be that there are such things as values? Why do human ethics exist at all?

This is not merely a theoretical question. In Nietzsche's day, the early biological history of humankind was still mostly unknown, and so to begin a genealogy of morals on the basis of ancient Greek and Hebrew culture, as Nietzsche did, must have seemed perfectly reasonable. But today, anthropologists possess a much fuller, and much longer, picture of the hominin timeline, and humans can trace their cultural and ethical sources to times far earlier than the classical civilizations (Quiros, 2012). Indeed, humans can trace their beginnings all the way back to when the species was not yet human in the modern sense of the word, when it was still in fact purely animal (Klein, 2009). And since it is not generally considered a legitimate practice to apply ethical standards to wild animals, there must have been a time in human history when such standards could not have been applied to humans themselves, when there were no values to be valued—or to be revalued. Thus, the question needs to be asked again, non-theoretically: why do human ethics exist at all? What is it about the human transformation that has given birth to the practice of morality?

In attempting to explain the human turn towards behavioral modernity, scientists frequently resort to the notion of evolution—for instance, through such proposals as language genes and neural alterations (Klein, 2002; Pinker, 1994; Zwir et al., 2022). But these evolutionary explanations face serious challenges (Schlinger, 1996), including a lack of specificity and a need to fit a multitude of transformational events into an extremely narrow timeline. This essay will offer a more straightforward account of human behavioral modernity, outlining a depiction of modern humanity that can be directly observed today. This depiction underscores how humans have transitioned from once being purely animal to being still animal, but now with a significant and discernible addendum layered upon the entire species. This additional aspect of modern humanity can be denoted with the word construct, a word intended to consolidate the two indisputable categories of alteration that distinguish modern humans from their former purely animal selves:

  1. The artificial reconstruction of the human environment; and,
  2. The novel behaviors resulting from that artificial reconstruction.

Whereas humans were once animal and only animal, living in an entirely natural setting and displaying nothing but survival-and-procreative behaviors, humans today are both animal and construct, living in an environment that is more artificial than natural and displaying a mixture of behaviors that reflect both biological and synthetic roots. Therefore, modern humans and their activities are dual-origined, a unique occurrence within the animal kingdom and a unique occurrence across evolutionary history.

Although this dual aspect of modern humanity has clearly had a momentous impact, giving rise to the rich cultural tapestry currently experienced throughout twenty-first century civilization, it has also engendered an inevitable conflict, a conflict experienced by humans both interpersonally and internally. The two aspects of modern humanity, animal and construct, are seldom in accord. The animal aspect of humanity is ancient in its origin, is grounded in immediacy, and gets its motivation from the self-serving needs of survival and procreation. The constructed aspect of humanity is extremely recent in its origin, requires delayed gratification, and receives its motivation from the desire for effective creation. The inherent conflict between these two aspects, distinctive to modern humans, is what gives birth to ethics and morality. Every ethical dilemma, at its core, comes down to the incompatibility between the animal and constructed sources of humankind, with each aspect striving to gain ascendancy. Every moral quandary is ultimately a choice as to which influence is to be given the decisive sway, the beast within or the artificial structures all around.

This conflict has grown more intense over time and is now reaching a critical juncture. The constructed aspect of humanity, once nonexistent, has continued to increase in both size and clout, and in many respects has become dominant over the species, including an authoritative claim upon morality itself. But as Nietzsche both recognized and railed against, an excessive suppression of humanity's animal nature brings unintended and unproductive consequences, including a loss of the vitality that has been spurring humanity along its alternative course. Ultimately, for humans to continue to make progress on their unique biological path, they must find a way to reconcile and to transcend the inherent conflict between their two defining aspects, avoiding a return to complete animality and eschewing any acceptance of a complete artificiality.

 

2. Three Scenarios

First Scenario. A wild animal enters the territory of a mating pair of its own species. It surprises the male of the pair and attacks it, eventually killing it. The attacking animal then forces itself upon the female. It remains in the territory for the rest of the day and helps itself to the stashes of food the mating pair had gathered.

Second Scenario. A man breaks into a couple's apartment. He uses a baseball bat to stun and then kill the man of the couple. He then forces himself upon the woman. He lingers in the apartment for the remainder of the afternoon, eating the couple's food and eventually stealing the woman's jewelry.

Third Scenario. Around a half million years ago, before the beginning of the human turn towards behavioral modernity, a lone male hominin enters the territory of a hominin tribe. He surprises an isolated couple and uses a rock to stun and then kill the male. He then forces himself upon the female. He searches the outskirts of the territory and helps himself to any stashes of food he finds.

From a factual standpoint, these three scenarios appear to be almost entirely identical, and yet from an ethical standpoint, they seem to be considerably different. The second scenario crystallizes the ethics potentially at stake here, because its modern human setting removes any uncertainty as to whether an ethical standard can be applied. The intruder has committed murder, assault and robbery, three of the main classes of proscribed activity to be found in almost any modern ethical or criminal standard. Few people would attempt to justify the man's actions, and almost everyone would agree that if such activities were not regularly and severely punished, civilization as humans currently know it would soon be in danger of collapse. But if this second scenario is such a straightforward and obvious case of an ethical violation, why do the first and third scenarios seem more ambiguous? Is it simply the context of a modern human setting that makes the critical difference, and if this is so, what does this imply about ethics in general?

The first scenario contains no nuance—the event is immediately rewarding to the attacking animal and is no doubt a terrible experience for the victims. If there were ever a case in which an ethical precept could be applied to a wild animal, this instance would surely qualify. But that statement already betrays how easily humans can inject an anthropocentric bias into the consideration of such matters. From the biological and evolutionary point of view, the attacking animal's activities are not only advantageous to itself and its genes but also might be advantageous to the entire species, possibly even essential (Gómez et al., 2021). If it were known that, unless certain members of the species engaged in such activities, the species would eventually weaken genetically and go extinct, would these activities still be judged as bad? Would they not instead be good? It can be reasonably argued that in nature, viability is the sole arbiter of what qualifies as "ethical," and at any rate, it never appears to be a matter of moral choice. In nature, what a wild animal should do is exactly what a wild animal is driven to do, and what a wild animal is driven to do is that which is most promising and satisfying in terms of increasing survival-and-procreative success.

So if ethical standards are not to be applied to the first scenario, a case of wild animals, and if ethical standards must be applied to the second scenario, a case of modern humans, then what is to be said about the third scenario, a case of humans just before they became modern humans? Many people today seem scarcely aware that humans were once—and not that long ago—purely animal, no different in nature and behavior from all the other wild animal species. But this fact is a critical input into understanding how humans have arrived at the circumstances they find themselves in today. Around a half million years ago, it would have made no more sense to apply an ethical standard to the interloping hominin than to apply that standard to the wild animal of the first scenario—indeed, the first and third scenarios could be different descriptions of the same event. Therefore, humans were not always ethical creatures. But this then raises the question of when the transition took place: when did human activities become subject to ethical scrutiny? Was it around two hundred thousand years ago, when the features of behavioral modernity were still inchoate? Or was it around fifty thousand years ago, when the human turn had become more distinctive but still quite primitive? Or was it closer to ten thousand years ago, when agriculture and civilizations were on the verge of getting started? And was this transition sudden or gradual, and when did it apply effectively to the entire population? And finally, given this transitioning history, is it still reasonable to think that human ethical behavior can be attributed to human neurons and genes (Killen & Smetana, 2007)?

There are two further points to be gleaned from these three scenarios. First, the main classes of proscription to be found in modern ethical and criminal standards—for instance, murder, assault and robbery—are clearly not arbitrarily chosen. They seem to be targeted exactly against the type of biologically self-advantageous behavior epitomized by the attacking animal of the first scenario. It is as though humans have come to realize that they have a compelling interest in suppressing the beast within.

And second, to turn this motivation entirely around, modern humans also appear to have an inherent and nearly compulsive fascination with instances of raw ethical violation, such as the one outlined in the second scenario above (Binik, 2020). True crime podcasts, heist movies, rape fantasies, mob dramas on TV—there seems to be something fundamentally exciting and irresistible about the breaching of ethical laws. It is as though humans possess a deep-seated urge to unleash the beast within.

 

3. Modern Humans as Both Animal and Construct

Humanity's current circumstances are biologically unprecedented. Until now, every animal species has remained purely animal, with its behavioral characteristics fixed exclusively upon survival-and-procreative demands. In fact, so rigid and so predetermined have been animal behaviors across all species and across all time that they can be effectively summarized with nothing more than a simple phrase: eating and drinking, fighting and fleeing, mating and rearing. Hominins were like this too, for millions of years. But of late—and, evolutionarily speaking, over an extremely short period of time—humans have undergone a stunning transformation (Henshilwood & Marean, 2003). While retaining every animal characteristic with which they were originally endowed, humans have added an impressively broad range of behaviors heretofore unwitnessed and unexperienced upon the planet Earth—for instance, language, experimentation, art, and of course ethics.

The conventional explanation for the human transformation centers on the concept of evolution (Brown et al., 2011). For instance, maybe it was the emergence of a language gene or a significant neural alteration (Neubauer et al., 2018) that launched humans onto their more modern path. But these evolution-inspired explanations face some serious challenges. For one, they lack specificity. No identified language gene or detailed neural alteration has ever been put forth with any consistency or cogency, and even if one had been, no comprehensive or compelling description has ever been offered that would connect particular genes and neurons to actual human behaviors (Schaffner, 2016). The purveyors of such theories certainly have faith that such details will eventually be uncovered (Goetz & Shackelford, 2006), but they do not currently possess any direct evidence.

And perhaps even more troubling is the fact that evolutionary explanations of the human transformation must somehow be made to fit within an extremely narrow timeline. The turn towards human behavioral modernity appears to have begun no more than a few hundred thousand years ago, and its initial impact must have been minimal for quite some time. By fifty thousand years ago, although the evidence of the human turn was now unmistakable—control of fire, structured tools and weapons, cave paintings, etc.—human behavior was still quite primitive, resembling hardly at all any of the modern forms of human experience (Christian, 2018). Indeed, the majority of the current behaviors arising from the human transformation—for instance, driving, flying, long-distance communication, effective surgeries, etc.—have appeared only within the last century or two, suggesting that the human behavioral transformation is still an ongoing and accelerating process. These rapid and accretive behavioral dynamics would be difficult to explain with just language genes and neural alterations—the dynamics do not fit the usually slow-moving contours of a biological and evolutionary process.

Since there is little actual evidence to indicate that modern humans have undergone any kind of significant physical or biological change—including genetic or neurological change—a more effective approach to explaining the human transformation is to concentrate on those human features that demonstrably have changed. Since the beginning of the turn towards behavioral modernity, there have been two major categories of human transition for which there is now an overflowing abundance of observable evidence. The first category of indisputable human change is the amount of artificial construction that has been accruing within the human environment. Before the turn towards behavioral modernity, humans—just like all pure animals—lived within an entirely natural setting. But at the beginning of the transformation, several unusual artifacts started making a more or less permanent appearance within the human environment: structured tools and weapons, fire pits, animal skin clothing, ornamental jewelry, abstract gestures and sounds. And as the turn towards behavioral modernity progressed, the amount and types of these constructed artifacts continued to increase at an accelerating pace. Around ten thousand years ago, with the advent of agriculture and civilizations, humans experienced a massive surge in this reconstruction of their surrounding environment: houses, roads, ships, papyrus scrolls, gigantic monuments, etc. And around four hundred years ago, with the widespread introduction of scientific and industrial techniques, humans experienced yet one more leap in this rebuilding of their experienced world: trains, factories, skyscrapers, computers, and so much more. So pervasive has been the artificial reconstruction of the human environment that today nearly every human lives in a setting in which nature has been mostly, if not entirely, eclipsed from view.

The second category of indisputable human change is the enormous number of novel behaviors that have been engendered in direct response to this artificial reconstruction of the human environment. Every alteration to the human surroundings provokes a corresponding change in human behavior. Clothing alters where humans migrate and live (Gilligan, 2010), controlled fire alters what humans eat (Scott et al., 2016), structured weapons alter what humans hunt (Ben-Dor & Barkai, 2023), and so on. And in the modern world, the catalog of human behaviors developed in direct response to the environment's many constructed artifacts has become so extensive and so all-encompassing as to be almost overlooked: humans drive because there are cars on the street, humans read because there are books on the shelf, humans shave because there are razors in the cabinet, etc. Human behavioral modernity did not arise within a vacuum; it arose instead in direct response to those many artificial constructions now saturating the human environment.

Nonetheless, and quite remarkably, in the midst of all this artificial reconstruction and its resulting behavioral novelty, humans have also retained the entirety of their former animal nature. Humans must still eat and drink, humans must still avoid danger, and humans must still procreate and rear their young. Humans have retained their communal instincts and still give evidence of their tendency towards gregariousness, with many of society's current activities and operations hearkening back to a more kindred time. Soap operas, org charts, crosstown sports rivalries—if one knows how to look carefully, one can still see the contours of a more tribal existence. And although humans are no longer raised to be self-sufficient hunter-gatherers within the natural surroundings of the African plains, humans still possess all the biological characteristics to do so. Humans carry with them today the same animal traits as they did several hundred thousand years ago.

Therefore, to characterize the human turn towards behavioral modernity, it is necessary to bridge the gap from humans as pure animal to humans as still animal but no longer purely so. A way to accomplish this feat would be to depict modern humans with the phrase animal and construct, a phrase meant to highlight the dual source of modern human behavior. The word animal of course needs no further justification. The word construct is being used to denote, as outlined above, the two categories of indisputable human change:

  1. The artificial reconstruction of the human environment; and,
  2. The novel behaviors resulting from that artificial reconstruction.

The word construct, with its two-category meaning, emphasizes how the newer aspects of humanity have been built into the species, forged tangibly into the human environment and fashioned perceptibly into human behavior. Thus, it is not really necessary to search for these new characteristics inside human neurons and genes, because these new characteristics can be observed directly right before one's very eyes. Furthermore, the word construct captures precisely the total amount of change that has been layered onto the species over the course of the human transformation, for if one were to remove every artificial feature that now exists within the human environment, and if one were to suppress every human behavior that can trace its origin back to those removed artifacts, all that would then remain would be the biological and evolutionary organism that once defined Homo sapiens. All that would then remain would be the pure animal humans used to be.

 

4. The Inherent Conflict

The dual-origined nature of modern humanity—animal on the one hand and construct on the other—gives rise to an inevitable tension. These two aspects differ greatly in their history, in their relationship with space and time, in their motivations, and in their ultimate goal. The animal aspect of humanity tends to pull the species backwards in time, towards the natural days of pure survival and procreation. The constructed aspect of humanity tends to push the species in a new direction, towards greater creation and towards a purpose that remains mostly unknown. This push-and-pull battle impacts the entire population and gives birth to ethical conflict, the species caught between the demands of its two competing interests. And this push-and-pull battle impacts each individual, now with the freedom of moral choice but also with no clear indication as to which influence is to be given the greater authority—the animal instincts within or the structured conditions all around.

It can be difficult to remember, amidst all the artificial construction humans find themselves immersed within today, that a person's most fundamental and deep-rooted nature is still that of a biological creature (Winston, 2003). And yet, humans are born, humans die, humans delight in their sexual congress, humans nurture their children towards adulthood, humans suffer through fear and pain, and humans experience every event of their entire existence in the immediacy of the here and now, just as was the case on the African plains several hundred thousand years ago. The most pressing of human needs are still those which are self-preserving, and the next most pressing of needs are those associated with family, betraying the continuing genetic favoritism of human evolutionary drive. Most humans still desire the comforts of close communal belonging, and many still cling to the security associated with tribal hierarchy. And although humans have learned they can suppress and assuage such needs and interests in favor of alternative goals, humans seldom do so with a feeling of unmitigated joy. Humans can sense instinctively that there is a sacrifice involved with taking the constructed path, the sacrifice of denying one's more natural wants and needs. The question is always lingering in the air: is the sacrificial benefit worth the cost? An observation of modern human behavior, in which the breaking of the rules is celebrated almost as frequently as the following of the rules (Morrall et al., 2018), would suggest the answer is still frequently no.

Therefore, the constructed aspect of humanity faces a daunting task. Having arisen from nothing and needing to establish an expanding foothold on the human scene, the constructed aspect of humanity must convince its subjects to forgo their immediate desires in favor of a promise for something better later on. Admittedly, artificial construction has frequently been able to deliver on this promise. From animal skin clothing and structured tools and weapons to the immense power of modern medicines and electricity, the built-up innovations of humankind have benefited the species to such an extent that there are now eight billion people living on the planet. But each new promise and each new construction requires a mastery of, and a patience with, time and space, a nod towards delayed gratification over more immediate alternatives. Not every human is willing to wait that long, and not every human foresees the personal benefit behind the promise. Human change is made in the face of a constant resistance, the resistance against doing what one is not naturally inclined to do.

It is to overcome this resistance that ethical precepts are formed. An ethical precept is much like other human-built artifacts—similar to language, to music, to agriculture, and to all the rest. But an ethical precept differs in this one important respect: it does not of itself serve any directly constructive purpose; it is instead meta-constructive, making room for other constructions to take place. An ethical precept accomplishes this task by confronting a stubborn obstacle, by cajoling, threatening, shaming, and otherwise convincing humans into giving up some aspect of their animal nature. A later reward over immediate pleasure. Civility as opposed to conquest. Cooperation instead of appropriation. Humanity's animal nature must be subdued in this manner because it is fundamentally opposed to humanity's more artificial alternative. Animal nature is often destructive instead of constructive. Animal nature is concerned only with the immediacy of the here and now, never with the expansiveness of time and space. Animal nature is motivated by the particular, the individual, the concrete, the familial, and remains oblivious to the abstract, the symmetrical, the numerical, the universal. Almost every concept upon which artificial construction can thrive is contravened by humanity's instincts, and thus there can be no human transformation without holding this deep-seated bestial drive significantly in abeyance.

At the beginning of the turn towards behavioral modernity, humanity's animal aspect would have been dominant, with only a few sporadic instances of artificial construction to be found anywhere within the human surroundings, generating only the barest of need for any form of non-biological proscription. By around fifty thousand years ago, at the beginning of the last migration out of Africa, humanity's environment would have become more cluttered with newly developed artifacts—clothing, spears, hooks, jewelry, body painting, abstract gestures and sounds—with the impact of these artifacts nudging human activity onto alternative paths, creating a greater requirement for interactive structure and corporeal restraint, even if the balance at that time still stood in favor of the more primitive. By around ten thousand years ago, with the development of agriculture, permanent abodes, methods of transportation, and larger communities, the balance between animal and construct would have been shifting rapidly towards the latter, resulting in more multiplicity in human behavior and creating a burgeoning need to restrict instinctive conduct, leading to codified bodies of law, formalized means of enforcement, and more frequent entreaties towards habits of self-control.

Thus, as the human turn towards behavioral modernity has progressed, and as the amount of artificial construction within the human environment has continued to accrue, and as the influence of that construction upon human behavior has become more impactful, the need for ethical machinery has grown ever more intense. Ethical precepts have been combined into ethical systems; ethical systems have sought justification (deity, rationality, utilitarian principles, etc.) (Griffiths, 1957); and justification has brought stricter prosecution throughout the human surroundings. Reflecting the complexity of modern human circumstances, the ethical and moral systems of today are comprehensive, intricate, filled with nuance, and sometimes even contradictory (Francot, 2014), but at their core, all ethical systems still state the same basic tenet: humans must in some respect suppress the immediacy of their animal instincts in favor of more expansive, more distant, and more artificial goals. And at their perimeter, all ethical systems still encounter the same rudimentary defiance, the deep-seated human unwillingness to let go of the species' biological prerogative. Fundamentally, an ethical struggle is not a battle between good and bad, not a decision between right and wrong. Fundamentally, an ethical struggle is the expression of the inherent human conflict between animal and construct.

 

5. Consequences

Whereas the animal aspect of humanity would have been dominant at the beginning of the human turn towards behavioral modernity, today the circumstances have nearly reversed. Most humans today live in settings, such as large modern cities, in which nature has been almost entirely eclipsed from view, replaced everywhere by an assembled infrastructure that has become staggering in the degree of its depth and breadth (Guidotti, 2015). Human behavior, guided at every turn by the environment's many constructed artifacts, resembles hardly at all that of the other animal species, and resembles hardly at all that of hominins from a few hundred thousand years ago. Even the most elemental of human events—eating, drinking, sex, childbirth—are accomplished today with the support of an entire host of artificial accoutrements—grocery stores, plumbing, contraception, anesthesia. And if modern humans find they must occasionally give vent to their animal essence, they can usually do so indirectly, through an assortment of vicarious, sublimated and assisted means—sports, beauty pageants, social media, pornography, alcohol, etc. Humans today expend as much effort assuaging the beast within as they do expressing the beast within; indeed, most people today fail to recognize that they are beasts at all.

Because of this near dominance of humanity's constructed aspect, and reflecting that aspect's ongoing effort to maintain a tight control over a large and potentially unruly animal population, ethics today is almost always presented as a one-sided argument. The conflict between animal and construct is framed as a battle of evil versus good, wrong opposed to right, devil contra savior, with these pronouncements backed by an assortment of doctrinal and rational justifications, such as the Decalogue, Kant's categorical imperative, and utilitarian formulas. These days, to label someone as an animal is to effectively insult them, to describe someone as a renegade is to attempt to shame them, and to cast someone as self-serving is to place them under the deepest of suspicion. Humans today expend as much effort burying the beast within as they do expressing the beast within; indeed, most people today refuse to admit that they are beasts at all.

Nietzsche's insight was to recognize the potentially debilitating impact of this stifling dynamic, arguing that the wholesale suppression of humanity's ingrained animal nature removes too much vitality from the quest towards human progress, and creates so much pent-up longing for zoic release that it manifests in unhealthy and unproductive ways. Despite their oppositional differences, humanity's animal and constructed aspects have managed to share a mutually supportive relationship, a relationship held together mainly by the species' biological impetus towards self-preservation and self-advantage. For instance, most of the constructed artifacts added over the years to the human surroundings have been targeted explicitly towards increasing the survival-and-procreative success of Homo sapiens and towards easing the more burdensome challenges of a biological existence, and it is the recognition and appreciation of these ecological benefits that motivates many humans to make the necessary sacrifices to give artificial construction an opportunity to grow, a motivation far more effective than any logical or theological justification. At the same time, it is often through an individual's desire for selfish gain that he or she will craft the next invention, formulate a novel idea, or build the newest towering structure (Weitzel et al., 2010). How many innovative projects have been launched by the egoistic actions of some person in search of greater power, wider fame and more lavish riches, and how many of these self-centered attempts have resulted in the advancement of circumstances for the population as a whole?

In humanity's better and more productive moments, there has always been a balance, a degree of equilibrium, between the animal and the constructed aspects of the species, with each aspect contributing its particular form of benefit to the cause of transformation. A complete dominance by either aspect would be of doubtful merit. For instance, a complete dominance by animality—such as might easily be experienced in a civilizational collapse—would mean at best a return to the species' former biological regimen, confining humans to the harsh and static realities of a survival-and-procreative existence, forgoing whatever unique opportunities and potential destiny behavioral modernity might have happened to bring. Similarly, a complete dominance by artificiality—conceivable these days with the advent of genetic engineering, robotics, artificial intelligence, and the like—would mean an absence of vitality in the shaping of future events, leading perhaps to an entirely fabricated existence, one that could easily turn out to be mechanical, predictable, stale, cold.

Foreshadowing these potential outcomes for the species as a whole are the consequences experienced today by the species' individual members, who find themselves confronted on an ongoing basis by these same animal-versus-construct choices. Those individuals whose concerns reach no further than the contingencies of the present moment, who seek no advantage beyond that which can be gained out of immediate circumstances, and who find their motivations only in what is self-serving and self-preserving, run the danger of forging an experience that is narrow, calamitous, nasty, and Darwinian. And those individuals whose concerns look only towards the promise of a distant future, who seek no activity beyond that which can be described as righteously ascetic, and who find their motivations in the conformity underlying every widely proclaimed rule, run the danger of forging an experience that is rigid, stagnant, joyless, and unnatural. The task of modern humanity is to traverse a precarious course between the two abysses of animal and construct, with the immediate goal to keep from falling to either side. The ultimate goal—the ultimate human goal—is to transcend the inherent conflict between the two.

 

References

Ben-Dor, M., & Barkai, R. (2023). The Evolution of Paleolithic Hunting Weapons: A Response to Declining Prey Size. Quaternary, 6, 46. https://doi.org/10.3390/quat6030046

Binik, O. (2020). The fascination with violence in contemporary society. Springer International Publishing.

Brown, G. R., Dickins, T. E., Sear, R., & Laland, K. N. (2011). Evolutionary accounts of human behavioural diversity. Philosophical transactions of the Royal Society of London. Series B, Biological sciences, 366(1563), 313-324. https://doi.org/10.1098/rstb.2010.0267

Christian, D. (2018). Origin story: A big history of everything. Little, Brown and Company.

Francot, L. M. A. (2014). Dealing with complexity, facing uncertainty: Morality and ethics in a complex society. Archiv für Rechts- und Sozialphilosophie, 100(2), 201-218. https://www.jstor.org/stable/24756800

Gilligan, I. (2010). The Prehistoric Development of Clothing: Archaeological Implications of a Thermal Model. Journal of Archaeological Method and Theory, 17(1), 15-80. https://www.jstor.org/stable/25653129

Goetz, A. T., & Shackelford, T. K. (2006). Modern Application of Evolutionary Theory to Psychology: Key Concepts and Clarifications. The American Journal of Psychology, 119(4), 567-584. https://doi.org/10.2307/20445364

Gómez, J. M., Verdú, M., & González-Megías, A. (2021). Killing conspecific adults in mammals. Proceedings of the Royal Society B, 288.

Griffiths, A. P. (1957). Justifying Moral Principles. Proceedings of the Aristotelian Society, 58, 103-124. https://www.jstor.org/stable/4544591

Guidotti, T. L. (2015). Artificial ecosystems. In Health and sustainability: An introduction. Oxford University Press. https://doi.org/10.1093/acprof:oso/9780199325337.003.0009

Henshilwood, C. S., & Marean, C. W. (2003). The origin of modern human behavior. Current anthropology, 44(5), 627-651. https://doi.org/10.1086/377665

Killen, M., & Smetana, J. (2007). The biology of morality: Human development and moral neuroscience [Editorial]. Human Development, 50(5), 241-243. https://doi.org/10.1159/000106413

Klein, R. (2002). The dawn of human culture. Wiley.

Klein, R. G. (2009). The human career: Human biological and cultural origins. University of Chicago Press.

Morrall, P., Worton, K., & Antony, D. (2018). Why is murder fascinating and why does it matter to mental health professionals? Mental Health Practice. https://doi.org/10.7748/mhp.2018.e1249

Neubauer, S., Hublin, J. J., & Gunz, P. (2018). The evolution of modern human brain shape. Science advances, 4(1), eaao5961. https://doi.org/10.1126/sciadv.aao5961

Nietzsche, F. (2013). On the genealogy of morals. (R. C. Holub, Ed.; M. A. Scarpitti, Trans.). Penguin Classics.

Pinker, S. (1994). The language instinct. William Morrow & Co.

Quiros, F. (2012). The origin of ethics. Human Evolution, 15, 149-155. https://doi.org/10.1007/BF02436243

Schaffner, K. F. (2016). Behaving: What's genetic, what's not, and why should we care? Oxford University Press.

Schlinger, H. D. (1996). What's wrong with evolutionary explanations of human behavior. Behavior and Social Issues, 6, 35-54. https://doi.org/10.5210/bsi.v6i1.279

Scott, A. C., Chaloner, W. G., Belcher, C. M., & Roos, C. I. (2016). The interaction of fire and mankind: Introduction. Philosophical transactions of the Royal Society of London. Series B, Biological sciences, 371(1696), 20150162. https://doi.org/10.1098/rstb.2015.0162

Weitzel, U., Urbig, D., Desai, S., Sanders, M., & Acs, Z. (2010). The good, the bad, and the talented: Entrepreneurial talent and selfish behavior. Journal of Economic Behavior & Organization, 76(1), 64-81. https://doi.org/10.1016/j.jebo.2010.02.013

Winston, R. (2003). Human instinct: How our primeval impulses shape our modern lives. Bantam Press.

Zwir, I., Del-Val, C., Hintsanen, M., Cloninger, K. M., Romero-Zaliz, R., Mesa, A., Arnedo, J., Salas, R., Poblete, G. F., Raitoharju, E., Raitakari, O., Keltikangas-Järvinen, L., de Erausquin, G. A., Tattersall, I., Lehtimäki, T., & Cloninger, C. R. (2022). Evolution of genetic networks for human creativity. Molecular psychiatry, 27(1), 354-376. https://doi.org/10.1038/s41380-021-01097-y

Tuesday, March 5, 2024

Scientific Revolutions, Abductive Reasoning, and Autism

1. Introduction

Thomas Kuhn's 1962 book The Structure of Scientific Revolutions (Kuhn, 1962) is an interesting example of a self-referencing idea. In the work, Kuhn outlines a description of how scientific frameworks tend to transform over time through a roughly cyclical pattern of paradigm, anomaly, crisis, and then paradigm shift—in essence, through a series of stasis-breaking challenges and radical reformulations. This description runs counter to the then prevailing view that science progresses in a more incremental and accretive fashion, using the tools of verifiability and falsifiability to nudge the scientific community towards consensus in the face of new and/or competing theories. Kuhn's work has received its share of criticism over the years (Masterman, 1970; Sanbonmatsu & Sanbonmatsu, 2017), but there is no questioning that the book has had a profound influence on the history and philosophy of science, its themes now deeply ingrained into the mindsets of both practicing scientists and the general public as they survey how human knowledge has unfolded during the past and continues to develop through the present day (Kaiser, 2012). Which is to say, The Structure of Scientific Revolutions has itself produced a meaningful and persistent paradigm shift.

Kuhn's template for scientific revolution is similar in many respects to the concept known as abductive reasoning. Abductive reasoning was brought into prominence by the nineteenth-century American philosopher Charles Sanders Peirce, who explored the topic frequently throughout his copious writings on logic, scientific classification, semiotics and pragmatism (Peirce, 1992, 1998). Peirce himself sometimes struggled to nail down the exact nature of abductive reasoning, admitting at one point that he had perhaps confused some of its characteristics with those of inductive reasoning during the earlier stages of his career. But Peirce was also the one who crafted, in typical Peircean fashion, the incisive and pithy formula by which abductive reasoning is still commonly articulated today:

The surprising fact, C, is observed.
But if A were true, C would be a matter of course.
Hence, there is reason to suspect that A is true.

Abductive reasoning can be applied across a broad range of circumstances, from personal events to scientific revolutions, but it was for the latter type of application that Peirce stressed the immense importance of abductive reasoning, noting it was the only form of logic by which humans discover and develop anything new.

A distinctive and somewhat enigmatic feature of both scientific revolutions and abductive reasoning is the aha moment, that sudden perception of an effective solution to what had been previously a vexing problem. Kuhn and Peirce say only a little about this epiphanous event, with Kuhn likening it to a change in Gestalt—such as the drawing that transitions suddenly from duck to rabbit—and with Peirce describing the insight as coming to us "like a flash" and as "putting together what we had never before dreamed of putting together." Implicit in these brief portrayals is a corollary also evident from the history of scientific revolutions, namely that these aha moments are exclusively the product of individuals, and never of groups.

This essay will explore the role autism plays in both scientific revolution and abductive reasoning, including the spawning of these aha moments. Autism is usually regarded as a medical condition (Hodges et al., 2020), but here an alternative approach will be given extensive consideration, with an emphasis on how the biological, behavioral and sensory characteristics of autism naturally give rise to an atypical form of human perception. It will be demonstrated that it is this atypical form of perception that catalyzes the abductive reasoning underlying knowledge innovation, and as partial evidence for these assertions, it will be noted how surprisingly often autistic characteristics have made a prominent appearance in the history of scientific revolutions.

Finally, the paradigm under which the scientific community currently operates will be examined with a critical eye. Kuhn's 1962 work can be seen as being highly influenced by the circumstances of the scientific community of that time, and because of this influence, Kuhn misapprehends the state of science as it existed before the beginning of the twentieth century, and also fails to anticipate the predicament into which science would fall by the end of the twentieth century. As scientific work has become more popular, more collaborative and more financially rewarding—and less frequently the domain of unusual and isolated individuals—the scientific community has found itself becoming increasingly stuck inside the regimen Kuhn labels as normal science. In the present day, normal science is producing a particularly deleterious effect: it is systematically suppressing the revolutionary impact of atypical autistic perception.


2. Scientific Revolutions

Kuhn is best known for his introduction of the concept of paradigm shift, but paradigm shift is only one aspect—and often too narrowly understood—of Kuhn's more encompassing description of a cycle of stasis and upheaval underlying historical scientific change. The word paradigm for Kuhn is a convenient label for the circumstances of a mostly stable and generally agreed-upon scientific practice, as embodied by the scientific community in the form of textbooks, journals, conferences, constructive collaboration, and so on. During this period of what Kuhn calls normal science, the scientific community's efforts are directed almost entirely towards the confirmation and shoring up of the sanctioned framework, with little to no endeavor directed towards overthrow. What eventually disturbs a paradigm is the accumulation and/or significance of anomalies, problems that stubbornly defy all effort to be resolved within the context of prevailing theories. These anomalies foment a state of crisis within the community, with the crisis prone to being answered by the introduction of an entirely new framework, one often incommensurable with the old way of seeing things. If this new framework proves to be effective at both resolving the anomalies and clearing the landscape for future progress, the scientific community will gradually abandon the old framework and adopt the new, establishing the next paradigm for ongoing scientific practice. Thus, paradigm shift can be seen as having two different but related meanings. One, paradigm shift can refer to the adoption of the new paradigm over the old one, a process that is often slow moving and happens under the reluctant sway of the scientific community. And two, paradigm shift can refer to the insightful perception of a new and effective framework, an event that can occur suddenly and remains the province of just one individual.

Perhaps the quintessential example of these concepts is Einstein's introduction of special relativity (Einstein, 1905). The prevailing paradigm leading up to that occasion was still mostly that of Newtonian mechanics, buttressed by additional features to accommodate Maxwell's already anomalous field theory of electromagnetic waves. One of these additional features was the luminiferous ether, the hypothesized medium through which light, electricity and magnetism could propagate, but efforts to detect motion through this ether, including the famous Michelson-Morley experiment (Michelson & Morley, 1887), had instead produced an incongruous result, namely that the speed of light remained the same in every direction measured, no matter the motion of the apparatus through the supposed ether. Several attempts were made—for instance, by Lorentz and Poincaré (Lorentz, 1904; Poincaré, 1900)—to reconcile this outcome with the prevailing framework, but because these efforts still clung to the existing paradigm, they failed to provide the necessary clarification. That task fell to the young Einstein, still a patent office clerk, who, after several years of grappling with the problem, found sudden inspiration in the early summer of 1905 and completed his famous paper on the electrodynamics of moving bodies in a mere matter of weeks. That paper did not cling to the existing paradigm but instead boldly defied it, proclaiming the ether to be superfluous and postulating an entirely new conception of space, time, matter and energy.

As is often the case, Einstein's revolutionary ideas, despite resolving the anomalies concisely and clearing the ground for future progress, did not meet with immediate acceptance from the scientific community; nearly two decades would pass before relativity became firmly established as the basis for the next paradigm (Goldberg, 1970). Many of Kuhn's other examples of scientific revolution follow a similar course: Copernicus's heliocentric model of cosmology, Newton's laws of motion and gravity, Dalton's atomic theory of chemistry, Darwin's description of natural selection—all these innovations were the inspiration of an individual, and all were met with initial resistance by the larger group (Barber, 1961). There exists an inherent tension in each case of scientific revolution, the tension between the scientific community's intrinsic adherence to the familiar way of seeing things and an individual's disruptive introduction of an atypical counter-perception (Kuhn, 1978).

Kuhn applies his ideas almost exclusively to the domain of the natural sciences, but in a broader sense, science is simply a term for the pursuit of greater understanding, and thus Kuhn's scheme can just as effectively be applied to knowledge acquisition in general. The first control of fire, the first use of abstract language, the first mathematical concept—these moments are lost to prehistory, but there is no reason to expect they were not the inspiration of uncommon individuals, and were met with initial resistance by the guardians of the then current conventional wisdom. This pattern of human knowledge advancement, accretive in its totality but reconstructive at its core, is in many respects the primary distinguishing feature of the modern form of the human species (Griswold, 2023a). Ever since the turn towards behavioral modernity, humans have been increasingly distancing themselves from their purely animal past by reassessing and reconstructing their surrounding environment, and this activity has not been accomplished in a sociable, gradual and piecemeal fashion, but instead has been accomplished via dissension and upheaval, via the constant tearing down of the old paradigm and the rebuilding of the new. The great scientific discoveries of the last several centuries are simply recent examples of what has actually been a long-running human process, a process that, not coincidentally, is both unprecedented within the biological kingdom and is also powered by the fuel of atypical perception.


3. Abductive Reasoning

Over the past decade or so, abductive reasoning has experienced a surge in scholarly interest, so much so that the topic has become something of an academic cottage industry: classifications of abductive patterns (Park, 2015; Schurz, 2008), competing analyses of underlying logical schemas (Lycke, 2012; Urbański, 2022), endless battles over whether inference to the best explanation is the same thing as abduction (Campos, 2011; Mcauliffe, 2015), etc. To sidestep some of this noise, the focus here will remain on Peirce's original three-line formula, with an italicized emphasis on those phrases that appear to be the most under-appreciated within the academic community:

The surprising fact, C, is observed.
But if A were true, C would be a matter of course.
Hence, there is reason to suspect that A is true.

The observed fact needs to be surprising because abduction begins when something appears to be amiss or inadequate with the contextual framework. New facts, or facts that can be easily assimilated to what is already well understood, do not stimulate the kind of perturbation that comes with abduction—a surprising fact is provocative, a soon-to-be-explained fact is not. Furthermore, the real sting in Peirce's formula is in the transformation C undergoes from being a surprising fact to being a matter of course. That is no small leap. If C is originally a surprising fact—indicating trouble with the contextual framework—then almost by necessity the fact transitions to being a matter of course only via a radical change to the contextual framework, a change sometimes so sweeping as to render the new framework incommensurable with the old. Contextual frameworks can run the gamut from personal worldviews to the shared paradigms of the natural sciences, but in each instance the framework's purpose is to provide clarification and orientation, and when it fails to do so, it needs to be discarded and rebuilt anew. Thus, the A of Peirce's formula is often much more than just an explanatory hypothesis; the A of Peirce's formula is what people now commonly call a paradigm shift.
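
Stated compactly, Peirce's schema can be displayed as an inference rule; the rendering below is an added illustration in standard logical notation, not Peirce's own formatting:

\[
\frac{C \qquad A \rightarrow C}{\therefore\ \text{there is reason to suspect that } A \text{ is true}}
\]

Read as a deduction, this pattern would be the fallacy of affirming the consequent; abduction instead treats the inference as a license to suspect and pursue A, a hypothesis adopted on probation rather than a conclusion that follows with necessity.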

Let us consider some examples. The first example is the already mentioned introduction of special relativity. Just about any instance of scientific revolution could serve as illustration for abductive reasoning—special relativity happens to be particularly thematic. There were two major anomalies, or surprising facts, that provoked Einstein's scrutiny. One, there was the unexpectedness of the Michelson-Morley result, doggedly indicating no detectable motion through the luminiferous ether. And two, no one, including Einstein himself, seemed to be able to adjust Maxwell's electromagnetic equations to make them conform to the Galilean relativity principle (Earman et al., 1982). Einstein's solution to these challenges, simple in conception but monumental in its consequence, did indeed transform both of these anomalies into a matter of course. The first anomaly was resolved by raising the constancy of the speed of light in every inertial frame to the level of postulate, rendering the Michelson-Morley outcome straightforward and trivial. This also cleared up the second anomaly, by allowing Einstein to demonstrate that his inability to make Maxwell's equations conform to the relativity principle was ironically correct: no adjustment to the equations was needed, because once the Galilean rules for relating moving frames were replaced by the new kinematics, the equations already conformed as they were.

Here, the A of Peirce's formula was nothing short of the overthrow of the contextual framework of physics, a complete reconceptualization of space, time, matter and energy. What was gained by this disruption was clarification, a clearing of what had been previously a problematic landscape, a reorientation allowing scientists to proceed. Compare this outcome to the approach taken by Hendrik Lorentz. Lorentz, prior to Einstein, had already developed much of the mathematics describing relativity, but had done so through a strained effort to accommodate the perceived anomalies to the prevailing Newtonian/Maxwellian framework, and the strain shows. Time dilation for Lorentz was in essence a mathematical trick, a kludge to force the equations to conform to the relativity principle. And length contraction was a mysterious property imposed upon moving bodies by the luminiferous ether, calibrated precisely to the Michelson-Morley result. These interpretations, even if they were true, would not provide clarification, but would instead simply shift the venue of the anomalies. A mathematical trick that seems to work with time is itself anomalous; compression of moving bodies by a massless ether is itself a surprising fact. Abduction—especially ampliative abduction, the kind that produces new understanding—is less about the search for plausible hypotheses than it is about the quest for clarification. Both Einstein and Lorentz had offered plausible hypotheses, but Einstein's paradigm shift produced clarification; Lorentz's strained fit to the old paradigm did not.
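
For reference, the mathematics at issue is compact and is shared by both accounts; what differs is the interpretation. A standard textbook rendering, added here only for orientation (v is the relative speed between observer and object, c the speed of light, Δτ and L₀ the time interval and length measured in the object's own rest frame, Δt and L the values the observer measures):

\[
\gamma = \frac{1}{\sqrt{1 - v^2/c^2}}, \qquad
\Delta t = \gamma\,\Delta\tau \ \ (\text{time dilation}), \qquad
L = \frac{L_0}{\gamma} \ \ (\text{length contraction}).
\]

Lorentz read these relations as distortions imposed on moving rods and clocks by the ether; Einstein derived the very same relations from his two postulates, with no ether required, which is exactly the difference in clarification described above.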

To take a more everyday example, consider the following scenario. A man wakes up on what he takes to be a Friday morning, showers, dresses for work, has breakfast, then walks to the bus stop and waits for the 8:30 bus. But the bus does not arrive. The man is perplexed—this has never happened before, and he begins to get a vague sense that something is wrong. Maybe the bus has broken down, he thinks, and he will need to find an alternative means to get to work, but nothing about that explanation, even if it were true, seems satisfying to him. Then suddenly it hits him—today is not Friday, today is Saturday! Of course the bus has not arrived! The man also now recognizes the source of his vague sense that something was wrong—no one else is at the bus stop and there is less traffic on the road. Everything has become clear to him now and he walks home to begin his Saturday chores.

The surprising fact in this scenario is that the bus does not arrive, and as is often the case, many explanations can be offered to account for this surprising fact. But explanations are not the goal here; clarification is the goal. The hypothesis that the bus has broken down is perfectly reasonable, probably even the most likely, but it does not do anything to clarify this man's situation; in fact, it leaves it messier than before. Will the bus service send a backup? Should he call for a taxi? Do taxes need to be raised in this city to promote better vehicle maintenance? And so on. Of course, reality is often like that: the facts do turn out to be messy sometimes, and humans must learn to deal with those situations too. But contextual frameworks do not have the luxury of being messy—their sole purpose is to provide clarification and orientation, and when one can successfully make use of them, they are the most advantageous of tools. Thus, when the man suddenly realizes that today is actually Saturday and not Friday—that is, when he swaps out one contextual framework for another—his world transitions immediately from being problematic to being crystal clear. He knows how to proceed because he has been afforded the gift of a useful abduction.

As a final example, let us consider a present day anomaly that appears to be in need of a scientific revolution—the Flynn effect. It was early in the twentieth century when IQ exams were first created and administered, and as that century progressed, it was observed that the raw scores on these exams were significantly increasing over time (Pietschnig & Voracek, 2015; Trahan et al., 2014). James Flynn in the 1980s documented, with large amounts of data, that this phenomenon was essentially universal, and the phenomenon soon thereafter would be dubbed the Flynn effect (Flynn, 1984, 1987). The prevailing paradigm regarding human intelligence is that it is a product of the human brain—that is, somewhere within the cerebral mesh of neurons, synapses and biochemical activity, the mechanisms of intelligence make their biological home (Jung & Haier, 2007). But given this contextual framework for intelligence, the Flynn effect emerges as a surprising fact. Evolutionary principles generally preclude such a rapid and population-wide improvement in a biological capacity—the expected outcome is that the average level of human intelligence would remain stable over time.

There have been countless explanations offered for the Flynn effect. For instance, it has been suggested that such factors as better nutrition (Lynn, 1989), greater access to formal education (Baker et al., 2015), increased exposure to video games and puzzles (Clark et al., 2016), etc.—or various combinations of the above (Jensen, 1998)—have contributed to an overall increase in the efficiency of human brains. In addition, several comprehensive models have been proposed hypothesizing a combined genetic and ecological causality for changing levels of human intelligence, intricate formulations such as the Dickens-Flynn model (Dickens & Flynn, 2001) and Woodley's theory of fast and slow life histories (Woodley, 2012). These explanations all have two characteristics in common. One, each explanation adheres to the prevailing paradigm of a brain-centric mechanism for human intelligence, framing its proposed cause as something that acts upon the effectiveness of the human brain. And two, each explanation, even if it were true, would provide little in the way of clarification. For instance, it would remain entirely unspecified how better nutrition, greater access to formal education, or increased exposure to video games and puzzles would induce the type of intense biological and neurological impact required to boost intelligence scores universally. And formulations such as the Dickens-Flynn model and Woodley's theory of fast and slow life histories are themselves more labyrinthine and more underdetermined than the anomaly they are meant to explain (contrast these formulations, for instance, to Einstein's two-postulate model of relativity).

The odd thing is, the current situation regarding the Flynn effect would seem to provide the ideal backdrop for a Kuhnian crisis, and yet the intelligence research community shows no indication of being flummoxed at all. Its relentless adherence to the existing paradigm and its continuing pursuit of non-clarifying hypotheses suggest this community will remain in its current state for quite some time, and this raises a further question of whether something about Kuhn's description of scientific revolution has itself become anomalous in the twenty-first century (more on this topic later). Nonetheless, whether the scientific community is aware of this crisis or not, abductive reasoning would indicate that the most promising path forward with regard to the Flynn effect would be to transform the contextual framework, to shift the prevailing paradigm, to reconceptualize human intelligence (Griswold, 2023b).

The first two examples—special relativity and the non-arriving bus—each contain an aha moment: in his later years, Einstein recounted how a casual conversation on a beautiful Bern day gave him a sudden insight into the nature of his relativity problem, opening the pathway to his famous paper (Stachel, 2002), and of course in the example of the non-arriving bus, the aha moment comes with the sudden realization that the day is Saturday. These aha moments, even when connected to widely shared paradigms, are almost always personal and solitary in nature—the history of science is chock-full of such epiphanies, but they are the epiphanies of individuals, never the epiphany of an entire group. And indeed, as can be seen in the case of the Flynn effect, the scientific community is actually inclined towards the opposite of the aha moment, is inclined towards a mutual and fixed regard for the prevailing paradigm. Thus, there appear to be two types of perception at work within the human population, each antipodally aligned with respect to abductive reasoning and scientific revolutions. One type of perception is prone to being communal and conservative, inherently friendly towards conventional wisdom and the favored paradigm, and could be fairly labeled as typical perception. The other type of perception is prone to being idiosyncratic and iconoclastic, naturally distrustful of the popular perspective, and could be fairly labeled as atypical perception. Both types of perception play important and reciprocal roles in the maintenance and reconstruction of human knowledge, and there is value to be gained in understanding more fully the distinction between them. To that end, the discussion now turns to the concept known as autism.


4. Autism

Autism was first recognized and described in the mid-twentieth century, particularly with the publication in the 1940s of case studies by psychiatrist Leo Kanner (Kanner, 1943) and pediatrician Hans Asperger (Asperger, 1944), studies that highlighted the defining behavioral characteristics of the autistic condition—namely, social difficulties, language peculiarities, and an intense focus on circumscribed interests. In the decades that immediately followed these publications, autism was regarded almost invariably as a dire medical condition, exceedingly rare and almost inevitably leading to poor outcomes (Evans, 2013). However, the current view regarding autism has changed enormously from those earlier times, with two primary developments triggering the transformation (O'Reilly et al., 2020). First, the prevalence of autism has turned out to be much greater than was originally assumed, increasing roughly forty-fold from initial estimates of around 1 in 2000 (0.05%) to current estimates of around 1 in 50 (2.0%) (Ballan & Hyk, 2019). And second, along with this recognition of significantly greater numbers of autistic individuals has come the parallel realization that only a small percentage of their outcomes turn out to be anything resembling the word dire. In actuality, autistic outcomes constitute an extremely broad range, with indeed some individuals experiencing serious developmental difficulties and requiring lifetime assistance and care, but with many others leading lives of almost indistinguishable normalcy, and some attaining lives of exceptional achievement (Reis et al., 2022). The word spectrum is now frequently employed to depict the wide variability in both autistic presentation and autistic outcomes, and although the word is apt to be misused at times, spectrum does capture an aspect of how autism is generally regarded today.

Nonetheless, the lingering stigma from the earlier views regarding autism does continue to have some unfortunate consequences, the most troubling being the long-lasting impact upon the autism research community. That community still studies autism primarily as a medical condition, focusing nearly all of its efforts and resources on discovering causes and cures. For many decades now, autism research has been directed towards finding the genetic defect that underlies autism (Reiss et al., 1986; Rylaarsdam & Guemez-Gamboa, 2019), towards describing the neurological aberration that explains autism (Haas et al., 1996; Pan et al., 2021), and towards uncovering the environmental insult that produces autism (Cattane et al., 2020; Kern & Jones, 2006), frequently with the stated goal of eradicating, or at least ameliorating, the features of the condition. But these many decades of research have produced remarkably little in the way of conclusive results: there is no known genetic defect underlying autism, there is no known neurological aberration explaining autism, and there is no known environmental insult producing autism (Hodges et al., 2020). When it comes to advancing a medical understanding of autism, the scientific community stands no differently today than it did decades ago, and indeed the verdict remains entirely open as to whether autism should be regarded as a medical condition at all.

This essay will examine an alternative description of autism, one that takes into full account the biological, behavioral and sensory characteristics that define autism, but that also looks beyond the narrow restriction of perceiving autism as just a medical condition. This alternative description of autism begins with an account of non-autism—that is to say, what it means to be biologically typical—with an emphasis on those perceptual characteristics that delineate non-autism. This includes a focus on the biological and evolutionary importance of conspecific perception, the innate tendency to perceive first and foremost the other members of one's own species (Buxton et al., 2020). Autism is then contrastingly described as a significant lack of this conspecific perception, a lack that both produces the observable characteristics of the condition and also leads directly to a compensatory and divergent form of perception. These two types of perception—non-autistic typical perception and autistic atypical perception—are then seen as producing in tandem a revolutionary impact upon the entire human species, including being the source of the typical/atypical perceptual divide that characterizes the essential tension at the core of abductive reasoning and scientific revolutions.

In outline form, this alternative description of autism can be presented as follows (a more thorough account can be found in other writings (Griswold, 2007, 2023a)):

  1. Non-autistic, or biologically typical, humans possess fully those behavioral and perceptual characteristics that have carried forward from humanity's not-so-distant animal past. Until recently in their evolutionary history, humans were still pure animals, with their behaviors and perceptions centered exclusively around survival-and-procreative demand—food, water, danger, sex, etc. (Klein, 2009). Not until the turn towards behavioral modernity, starting around a few hundred thousand years ago, did humans begin to add the other behaviors and perceptions that now distinguish the species from the remainder of the animal kingdom (Klein, 2002). Nonetheless, the influence of those animal-origined behaviors and perceptions still remains strong today. Most members of the current population, despite living in artificially constructed environments and despite having most of their biological needs easily met, continue to give a great deal of attention and effort to those familiar targets—food, water, danger, sex, etc.—and much of current human activity is still guided by a shared interest in these familiar themes.

  2. Among the carryovers from humanity's animal past, conspecific perception plays a central role in determining the social and behavioral characteristics of the population. Conspecific perception is the innate tendency to perceive first and foremost the other members of one's own species, a tendency apparent in essentially all animal species: lions perceive first and foremost other lions, honeybees perceive first and foremost other honeybees, etc. Conspecific perception foregrounds intra-species sensory experience against a less distinct sensory background, and this tendency is evolutionarily crucial for allowing mates to discover mates, parents to focus on their offspring, offspring to follow their parents, members of a pack to track one another, and so on. Conspecific perception is quite strong within the human species, as it would be for any species considered to be highly social, and it has the impact of drawing the human population together, because most humans possess a natural and shared interest in observing other humans and in mimicking what other humans do.

  3. Conspecific perception also plays a critical role in the sensory and developmental progress of human individuals. When a human child enters this world, he or she must first achieve a sensory grounding; otherwise, the sensory impressions the child experiences would remain chaotic and unorganized. As is evident from the rapt, natural and delighted attention most children give to other humans and to human activities, conspecific perception is one of the primary means by which human children attain their sensory grounding. From the manifold of impressions that arises in the sensory field, what emerges most predominantly are human sights, human voices, human smells, human activities, and so on. A human child then uses this human-forward sensory grounding to pursue further developmental progress, including first steps into the leveraging world of human language. Thus, most children today owe their perceptual and developmental start primarily to the species' shared and natural interest in all things human.

  4. Biological perception in general, and conspecific perception in particular, has the persistent impact of locking a species into a behavioral and perceptual stasis. Animal behaviors and perceptions are remarkably stable, both across species and across time. Nearly all wild animal species today live lives that are essentially the same as they were hundreds of thousands of years ago, lives similar to those of the other animal species, lives intensely focused on survival-and-procreative demand—food, water, danger, sex, etc. Even evolutionary change does not alter this pattern—the resultant species will live the same biologically and conspecifically focused life as did the predecessor species. With the turn towards behavioral modernity, the human species has clearly broken out of this rigid pattern, with its members living lives today that are much different than they were in prior times. But it is important to recognize that until quite recently in their evolutionary history, humans were just as locked into the confining consequences of biological and conspecific perception as were all the other animals, raising the question of exactly how it came to be that this lock was broken.

  5. Autism can be characterized as a significant diminution of conspecific perception. In marked contrast to biologically typical individuals, autistic individuals can be seen as displaying a diminished awareness of and attention to other human beings. Young autistic children do not engage as readily or willingly with other people as non-autistic children generally do, and autistic children appear to be much less interested in observing or participating in human-related activities (Hedger & Chakrabarti, 2021). These behaviors are frequently characterized as social difficulties, but in a sense that phrase mischaracterizes the trait. The so-called social difficulties of autistic children are not the result of a particular social defect so much as they are the result of a substantial perceptual distancing from the species itself. That is to say, the social difficulties of autistic children are the most clear-cut evidence of their significant lack of conspecific perception.

  6. The diminution of conspecific perception in autistic children thwarts their attainment of a sensory grounding by the typical means. The degree to which conspecific perception is diminished in autistic individuals can vary, and this may explain in part the spectrum-like nature of autistic presentation and outcomes, but the diminishment is always significant in the following sense: autistic children, unlike biologically typical children, cannot organize their sensory experience around a natural predominance of human-centric features. Almost every autistic child experiences sensory issues (Hazen et al., 2014), issues that range all the way from hypersensitivity to hyposensitivity to synesthesia, and the motleyness of these sensory symptoms suggests they are not the product of a particular physical cause so much as they are the consequence of a generalized difficulty in organizing sensory experience. From the manifold of impressions that arises in the autistic child's sensory field, human sights, human voices, human smells, etc. do not emerge predominantly from the sensory background. This leaves the autistic child without a sensory grounding, navigating what must seem to be the near equivalent of a sensory chaos, and if these circumstances are not resolved, the child can be expected to encounter nearly insurmountable developmental challenges.

  7. To attain their sensory grounding, most autistic children adopt an alternative form of perception, one that can be characterized as a heightened attention to and awareness of the inherent structural features that stand out from the surrounding environment. Although usually delayed compared to their non-autistic peers, most autistic children do overcome their developmental challenges, and this developmental progress indicates that most autistic children do attain a sensory grounding, a result evidenced also by the fact that their sensory issues tend to ease over time (Kern et al., 2006). But since an autistic child's overcoming of a potential sensory chaos is not achieved through the predominant influence of conspecific perception, it must be achieved by some other means. Chaos as a term denotes a lack of structure, and chaos is generally dissolved by the emergence of structural features—symmetry, repetition, pattern, number, form. Autistic children provide abundant evidence that they overcome their sensory chaos by focusing not on other people, but instead on the structural elements to be found in their surrounding environment. Ceiling fans, spinning wheels, light switches, the shapes of letters, sports statistics, dinosaur taxonomy: the structure-suffused interests and activities of autistic individuals form a lengthy list. This is a core and defining characteristic of autism, and is often referenced by the phrase restricted and repetitive behaviors, a phrase that mostly misjudges the critical necessity of those behaviors. Whereas non-autistic children can gain their sensory grounding through an interest in all things human—that is, via conspecific perception—autistic children must gain their sensory grounding through an intense focus on the non-biological structural features that stand out from the surrounding environment. Thus, most autistic children today owe their developmental start primarily to an alternative form of perception.

  8. The significant presence of autistic individuals within the human population modifies the perceptual characteristics of the population as a whole, thereby breaking the stasis imposed by biological and conspecific perception. Through their repeated efforts to mirror and to reconstruct the contextual patterns they perceive, autistic individuals bring to the foreground the structural elements and structural potential to be found in the surrounding environment. Non-autistic individuals, previously blinded to these structural elements by the constrictions of biological and conspecific perception, yet keenly aware of what other humans do, begin to notice these autistically inspired patterns and behaviors, and begin to adopt them for themselves. In this fashion, the entire human species begins to perceive and to interact with the surrounding environment in a manner that goes beyond just biological and evolutionary need, thereby opening the door to behaviors unique within the animal kingdom and unprecedented over the course of biological history.

  9. The human turn towards behavioral modernity, including the revolutionary advancement in human knowledge, has been catalyzed by the ongoing symbiosis between the autistic and non-autistic forms of perception. As humans have gained a growing awareness of the structural potential contained within their surrounding environment, they have increasingly reconstructed that environment in countless and complex ways. These artificial innovations embody the advancements in human understanding and carry forward their structural underpinnings to future generations, leading to the multi-faceted and intricate surroundings in which people live today. Human experience now reflects a thorough blending of its two major sources of influence: one, the social, biological and communal aspect that arises out of humanity's animal past, and two, the artificial, structural and revolutionary aspect that has been introduced via the presence of the autistic population.

This description of autism illuminates the essential tension underlying scientific revolutions, with each pole of that tension corresponding to a particular perceptual type. Biologically typical perception underlies the communal and conservative qualities that define the normal science of stable paradigms, and autistic perception sparks the idiosyncratic and iconoclastic inspirations that drive abductive-style paradigm shifts in scientific revolutions. Both poles of this tension play a critical role in the maintenance and advancement of human understanding, with the non-autistic tendencies of normal science serving to buttress and to promulgate knowledge already gained, and with the autistic tendency towards atypical perception serving to demolish and to reconstruct knowledge in need of transformation.


5. The Atypical Individuals of Scientific Revolutions

It is important to recognize that in the modern world there is really no such thing as a purely non-autistic or purely autistic adult individual. Each person has a natural preference—determined mostly by how that person first achieved a sensory grounding—but as each individual matures, he or she will be exposed to a human environment thoroughly suffused with both biological/social influences and also with artificial/structural influences, and will through this exposure gain increasing familiarity and dexterity with both the non-autistic and autistic perceptual traits. This is why a non-autistic individual can become extremely fluent in all manner of artificial and structural endeavor, and it is also why an autistic individual can achieve closer connection to the human species and become accomplished within the social realm. And in scientific practice, no individual is precluded from either of the counterbalancing roles—each individual is capable of engaging in normal science or in scientific revolution, or in both. The distinction is at the perceptual level and not at the level of the individual.

Nonetheless, it can be expected as a general rule that each individual will gravitate more frequently to his or her natural perceptual stance. For instance, the non-autistic individual is more likely to feel at home in the presence of other people, and the non-autistic scientist is more likely to be drawn to the communal and conservative aspects of normal science. At the same time, the autistic individual is more apt to take solitary comfort in the regularity of structured surroundings, and the autistic scientist is more apt to be drawn to the worldview-altering potential of abductive reasoning. Thus, it can also be expected that over the course of scientific history, the aha moments of scientific revolution will have been generated more frequently by individuals giving evidence of possessing autistic-like traits, and indeed, scientific history gives abundant evidence that this is in fact the case.

Newton, Einstein, Darwin, Cavendish, Dalton, Dirac, the Curies—the personalities that emerge from the biographies of such individuals stand out in several respects, including being remarkably similar to one another and being classifiable by a telltale collection of traits: shy, taciturn, socially awkward, intensely focused, late talking, habitual in routine, echolalic, etc. (James, 2003). Indeed, there is not one social butterfly to be found anywhere upon this list. Autism was not a known concept when these individuals lived, but if they were among the population today, their spectrum-like characteristics would be difficult to ignore. This is not a definitive proof that autism can be directly applied to such individuals or that autism was solely responsible for their innovative feats—retrospective application of autism should always be approached with caution and care. But the consistency in the atypical traits of so many individuals has to be more than mere coincidence, and at any rate, the hypothesis can still be put to a future test. There will be future aha moments, and there will be future knowledge revolutions, and with autism now more recognizable within the population, it will be worth some effort to observe how many of these future cases of knowledge revolution come with autism conspicuously nearby.

Although it has become customary to explain the atypical characteristics of history's scientific icons as the by-product of their prodigious genius, there is in fact no reason not to consider the opposing interpretation, that the cause and effect at work here actually runs in reverse: it is the atypical, autistic-like perception that makes the revolutionary insight possible in the first place.


6. The Structure of Scientific Stagnation

The normal science depicted in Kuhn's 1962 work reflects a remarkably keen eye for the scientific practice of Kuhn's day. Having originally studied to be a practicing scientist himself, Kuhn manages to capture accurately the many mechanisms helping to form and to maintain the scientific community of the 1950s and 1960s: conferences, textbooks, journals, academic associations, specialty groups, and so on. Unfortunately, Kuhn then seems to apply this milieu to much earlier times, with an intimation that Newton, Darwin, Einstein and others performed their work under similar circumstances. This is an anachronism.

Before the twentieth century, the term scientific community had a much different meaning than it had for Kuhn, or than it has today. During those earlier times, scientists worked almost exclusively as individuals, and sometimes in great isolation. Textbooks were essentially nonexistent, and journals were used not for publication acclaim but instead as a more efficient means of sharing results and ideas than could be had through the redundancy of multiple correspondences. Scientific associations, such as the Royal Society, were relatively few in number, and by and large they kept their doors open to the public, serving as an opportunity for both enthusiasts and dabblers to come together (Schofield, 1963). Science was not then a lucrative profession; in fact, quite the opposite. The biographies from those earlier times are filled with anecdotes about struggling to make ends meet and about entering the profession against the express wishes of family members who favored the financial security to be had in something like business or law. To be a scientist back then was to be literally not normal, and thus it would not have been surprising to find science's ranks permeated with a fair number of atypical individuals.

Those circumstances began to change during the nineteenth century, and that change accelerated rapidly at the beginning of the twentieth century. Spurred by the needs of both war and commerce, governments and businesses alike began putting much greater value on scientific work, elevating the profession to both higher status and higher pay (Agar, 2012). This attracted a different kind of scientist, one who would not have been comfortable at all within a neglected isolation, but who was perfectly at home inside a lauded and burgeoning crowd. Scientists now worked less frequently as individuals and began forming into ever enlarging teams. Scientific method transitioned into codified standards of practice. And scientific associations became more insular and more elite. This was the scientific community Kuhn was intimately familiar with, still with a hint of maverickness from its former days of revolutionary glory, but also settling rapidly into the large and regulated practice Kuhn identified as normal science. What Kuhn did not seem to appreciate was that this particular form of normal science was only recent in its origin and was not applicable to earlier times; as a consequence, Kuhn also failed to anticipate that this form of normal science would become increasingly entrenched, rigid and homogeneous by the end of the twentieth century.

Whereas science prior to the 1900s was receptive to an autistic-like individual, the science of the 2000s has become a hegemony of the biologically typical. Its ranks are now overflowing with the decidedly non-autistic, and almost every aspect of modern science serves to foster the communal and the conservative: attachment to a research team is currently de rigueur, publication has become a mass competition for citations, and success is measured primarily in the size of research grants. In such a system, there is no place for an autistically minded individual to find a productive or comfortable home. Not in the selection of the most well-connected mentor upon entering graduate school. Not in the paying of homage to one's superiors through a stream of obsequious references. Not in the groupthink sessions of one's ever-present team. The autistically minded individual, the one who has a natural proclivity for those individualistic aha moments of abductive reasoning, that individual has been systematically elbowed out from the community, or else has been forced to subsume his or her tendencies under a mountain of normative rules. Try to imagine a young patent office clerk with a nonconforming notion about space and about time; try to imagine that individual getting published today, or even being noticed by the scientific community at all.

The consequence of this transformation is of course inevitable—the notion of scientific revolution has almost entirely disappeared. It is not apparent that there has been any notable knowledge innovation over the last seventy-five years, and it seems every current form of scientific endeavor is in a state of being perpetually stuck. Consider human intelligence research and its continuing befuddlement over the Flynn effect. Consider autism science and its ongoing obsession with medical cause. Consider that king of the sciences, the domain of physics, and its endless dysfunctional marriage with string theory. Or try this: browse the historical list of Nobel prizes, a list that in the early 1900s was marked with the individual names of Planck, Bohr, Rutherford, de Broglie and Einstein, and in the early 2000s has turned into an annual celebration of research teams and academic press releases.

Fortunately, humanity need not despair over these circumstances. There will still be knowledge revolutions and there will still be constructive advancement in human understanding, even if those revolutions and that advancement must come from someplace else than the scientific community. Or perhaps that community will come to recognize its current state of crisis and will begin the search for a self-correcting paradigm shift. The exact details of such a shift remain uncertain, but its general outcome can be anticipated: a return to something more akin to former productive times, when there was still the essential tension between science's two counterbalancing poles, when there was still a symbiotic relationship between the non-autistic and autistic forms of perception.




References

Agar, J. (2012). Science in the Twentieth Century and Beyond. Cambridge: Polity.

Asperger, H. (1944). Die "Autistischen Psychopathen" im Kindesalter [The "Autistic Psychopaths" in Childhood]. Archiv für Psychiatrie und Nervenkrankheiten, 117, 76–136. https://doi.org/10.1007/BF01837709

Baker, D. P., Eslinger, P. J., Benavides, M., Peters, E., Dieckmann, N. F., & Leon, J. (2015). The cognitive impact of the education revolution: A possible cause of the Flynn Effect on population IQ. Intelligence, 49, 144–158. https://doi.org/10.1016/j.intell.2015.01.003

Ballan, M. S., & Hyk, J. C. (2019). Autism spectrum disorders. In J. C. Raines (Ed.), Evidence-based practice in school mental health: Addressing DSM-5 disorders in schools (pp. 91–130). Oxford University Press. https://doi.org/10.1093/oso/9780190886578.003.0003

Barber, B. (1961). Resistance by scientists to scientific discovery. Science (New York, N.Y.), 134(3479), 596–602. https://doi.org/10.1126/science.134.3479.596

Buxton, V. L., Enos, J. K., Sperry, J. H., & Ward, M. P. (2020). A review of conspecific attraction for habitat selection across taxa. Ecology and evolution, 10(23), 12690–12699. https://doi.org/10.1002/ece3.6922

Campos, D. G. (2011). On the distinction between Peirce's abduction and Lipton's Inference to the best explanation. Synthese, 180(3), 419–442. https://www.jstor.org/stable/41477565

Cattane, N., Richetto, J., & Cattaneo, A. (2020). Prenatal exposure to environmental insults and enhanced risk of developing Schizophrenia and Autism Spectrum Disorder: focus on biological pathways and epigenetic mechanisms. Neuroscience & Biobehavioral Reviews, 117, 253-278. https://doi.org/10.1016/j.neubiorev.2018.07.001

Clark, C. M., Lawlor-Savage, L., & Goghari, V. M. (2016). The Flynn effect: A quantitative commentary on modernity and human intelligence. Measurement: Interdisciplinary Research and Perspectives, 14(2), 39–53. https://doi.org/10.1080/15366367.2016.1156910

Dickens, W. T., & Flynn, J. R. (2001). Heritability estimates versus large environmental effects: the IQ paradox resolved. Psychological review, 108(2), 346–369. https://doi.org/10.1037/0033-295x.108.2.346

Earman, J., Glymour, C., & Rynasiewicz, R. (1982). On Writing the History of Special Relativity. PSA: Proceedings of the Biennial Meeting of the Philosophy of Science Association, 1982(2), 403–416. https://doi.org/10.1086/psaprocbienmeetp.1982.2.192433

Einstein, A. (1905). On the Electrodynamics of Moving Bodies. Annalen der Physik, 17, 891-921.

Evans, B. (2013). How autism became autism: The radical transformation of a central concept of child development in Britain. History of the Human Sciences, 26(3), 3–31. https://doi.org/10.1177/0952695113484320

Flynn, J. R. (1984). The mean IQ of Americans: Massive gains 1932 to 1978. Psychological Bulletin, 95(1), 29–51. https://doi.org/10.1037/0033-2909.95.1.29

Flynn, J. R. (1987). Massive IQ gains in 14 nations: What IQ tests really measure. Psychological Bulletin, 101(2), 171–191. https://doi.org/10.1037/0033-2909.101.2.171

Goldberg, S. (1970). In Defense of Ether: The British Response to Einstein's Special Theory of Relativity, 1905-1911. Historical Studies in the Physical Sciences, 2, 89–125. https://doi.org/10.2307/27757305

Griswold, A. (2007). Autistic Symphony. iUniverse. (accessible online: https://www.grizzalan.com/autisticsymphony )

Griswold, A. (2023a). Autistic Rhapsody. iUniverse. (accessible online: https://www.grizzalan.com/autisticrhapsody )

Griswold, A. (2023b). A Field Theory of Human Intelligence. https://doi.org/10.31234/osf.io/vhwfm

Haas, R. H., Townsend, J., Courchesne, E., Lincoln, A. J., Schreibman, L., & Yeung-Courchesne, R. (1996). Neurologic abnormalities in infantile autism. Journal of child neurology, 11(2), 84–92. https://doi.org/10.1177/088307389601100204

Hazen, E. P., Stornelli, J. L., O'Rourke, J. A., Koesterer, K., & McDougle, C. J. (2014). Sensory symptoms in autism spectrum disorders. Harvard review of psychiatry, 22(2), 112–124. https://doi.org/10.1097/01.HRP.0000445143.08773.58

Hedger, N., & Chakrabarti, B. (2021). Autistic differences in the temporal dynamics of social attention. Autism : the international journal of research and practice, 25(6), 1615–1626. https://doi.org/10.1177/1362361321998573

Hodges, H., Fealko, C., & Soares, N. (2020). Autism spectrum disorder: definition, epidemiology, causes, and clinical evaluation. Translational pediatrics, 9(Suppl 1), S55–S65. https://doi.org/10.21037/tp.2019.09.09

James, I. (2003). Singular scientists. Journal of the Royal Society of Medicine, 96(1), 36–39. https://doi.org/10.1177/014107680309600112

Jensen, A. R. (1998). The g factor: The science of mental ability. Praeger Publishers/Greenwood Publishing Group.

Jung, R. E., & Haier, R. J. (2007). The Parieto-Frontal Integration Theory (P-FIT) of intelligence: converging neuroimaging evidence. The Behavioral and brain sciences, 30(2), 135–187. https://doi.org/10.1017/S0140525X07001185

Kaiser, D. (2012). In retrospect: The Structure of Scientific Revolutions. Nature, 484, 164–165. https://doi.org/10.1038/484164a

Kanner, L. (1943). Autistic disturbances of affective contact. Nervous Child, 2, 217–250.

Kern, J. K., & Jones, A. M. (2006). Evidence of toxicity, oxidative stress, and neuronal insult in autism. Journal of toxicology and environmental health. Part B, Critical reviews, 9(6), 485–499. https://doi.org/10.1080/10937400600882079

Kern, J. K., Trivedi, M. H., Garver, C. R., Grannemann, B. D., Andrews, A. A., Savla, J. S., Johnson, D. G., Mehta, J. A., & Schroeder, J. L. (2006). The pattern of sensory processing abnormalities in autism. Autism : the international journal of research and practice, 10(5), 480–494. https://doi.org/10.1177/1362361306066564

Klein, R. (2002). The Dawn of Human Culture. New York: Wiley.

Klein, R. G. (2009). The human career: Human biological and cultural origins. University of Chicago Press.

Kuhn, T. S. (1962). The structure of scientific revolutions. Chicago: University of Chicago Press.

Kuhn, T. S. (1978). The essential tension. Philosophy of Science, 45(4), 649–652.

Lorentz, H. A. (1904). Electromagnetic phenomena in a system moving with any velocity less than that of light. Proc. Acad. Science Amsterdam, IV, 669–78.

Lycke, H. (2012). A formal explication of the search for explanations: The adaptive logics approach to abductive reasoning. Logic Journal of the IGPL, 20(2), 497–516.

Lynn, R. (1989). A nutrition theory of the secular increases in intelligence; positive correlations between height, head size and IQ. British Journal of Educational Psychology, 59(3), 372–377. https://doi.org/10.1111/j.2044-8279.1989.tb03112.x

Masterman, M. (1970). The Nature of a Paradigm. In Lakatos, I. & Musgrave, A. (Eds.) Criticism and the Growth of Knowledge. London, Cambridge University Press.

McAuliffe, W. H. B. (2015). How did abduction get confused with inference to the best explanation? Transactions of the Charles S. Peirce Society, 51(3), 300–319. https://doi.org/10.2979/trancharpeirsoc.51.3.300

Michelson, A. A., & Morley, E. W. (1887). On the relative motion of the Earth and the luminiferous ether. American Journal of Science, 34(203), 333–345.

O'Reilly, M., Lester, J. N., & Kiyimba, N. (2020). Autism in the twentieth century: An evolution of a controversial condition. In S. J. Taylor & A. Brumby (Eds.), Healthy minds in the twentieth century. Mental Health in Historical Perspective. Palgrave Macmillan, Cham. https://doi.org/10.1007/978-3-030-27275-3_7

Pan, P. Y., Bölte, S., Kaur, P., Jamil, S., & Jonsson, U. (2021). Neurological disorders in autism: A systematic review and meta-analysis. Autism : the international journal of research and practice, 25(3), 812–830. https://doi.org/10.1177/1362361320951370

Park, W. (2015). On classifying abduction. Journal of Applied Logic, 13(3), 215–238. https://doi.org/10.1016/j.jal.2015.04.001

Peirce, C. S. (1992). The Essential Peirce, Volume 1: Selected Philosophical Writings (1867–1893). Indiana University Press. https://doi.org/10.2307/j.ctvpwhg1z

Peirce, C. S. (1998). The Essential Peirce, Volume 2: Selected Philosophical Writings (1893-1913). Indiana University Press. https://www.jstor.org/stable/j.ctt16gz4vr

Pietschnig, J., & Voracek, M. (2015). One Century of Global IQ Gains: A Formal Meta-Analysis of the Flynn Effect (1909–2013). Perspectives on Psychological Science, 10(3), 282-306. https://doi.org/10.1177/1745691615577701

Poincaré, H. (1900). The theory of Lorentz and the principle of reaction. Archives neerlandaises, V, 252-78.

Reis, S. M., Gelbar, N. W., & Madaus, J. W. (2022). Understanding the Academic Success of Academically Talented College Students with Autism Spectrum Disorders. Journal of autism and developmental disorders, 52(10), 4426–4439. https://doi.org/10.1007/s10803-021-05290-4

Reiss, A. L., Feinstein, C., & Rosenbaum, K. N. (1986). Autism and genetic disorders. Schizophrenia bulletin, 12(4), 724–738. https://doi.org/10.1093/schbul/12.4.724

Rylaarsdam, L., & Guemez-Gamboa, A. (2019). Genetic Causes and Modifiers of Autism Spectrum Disorder. Frontiers in cellular neuroscience, 13, 385. https://doi.org/10.3389/fncel.2019.00385

Sanbonmatsu, D. M., & Sanbonmatsu, K. K. (2017). The Structure of Scientific Revolutions: Kuhn's misconceptions of (normal) science. Journal of Theoretical and Philosophical Psychology, 37(3), 133–151. https://doi.org/10.1037/teo0000059

Schofield, R. E. (1963). Histories of Scientific Societies: Needs and Opportunities for Research. History of Science, 2(1), 70-83. https://doi.org/10.1177/007327536300200104

Schurz, G. (2008). Patterns of Abduction. Synthese, 164(2), 201–234. http://www.jstor.org/stable/40271057

Stachel, J. (2002). Einstein from "B" to "Z". Birkhäuser.

Trahan, L. H., Stuebing, K. K., Fletcher, J. M., & Hiscock, M. (2014). The Flynn effect: a meta-analysis. Psychological bulletin, 140(5), 1332–1360. https://doi.org/10.1037/a0037173

Urbański, M. (2022). Evaluation of Abductive Hypotheses: A Logical Perspective. In: Magnani, L. (eds) Handbook of Abductive Cognition. Springer, Cham. https://doi.org/10.1007/978-3-030-68436-5_23-1

Woodley, M. A. (2012). A life history model of the Lynn-Flynn effect. Personality and Individual Differences, 53, 152-156. https://doi.org/10.1016/j.paid.2011.03.028