Wednesday, December 10, 2008

Kierkegaard as Educator

For that schoolgirl clique of oh-so-cutting-edge atheists (Dawkins, Hitchens, Harris, Dennett, and the remainder of their huddling masses)—I would have it take note of Kierkegaard, who, standing all by himself, did not choose the cowardly path of abolishing religious thought, but bravely took aim at what religion had become.

I too—I have not the slightest desire to put an end to scientific practice, I merely draw a bead on what science has become.

Thursday, December 4, 2008

Autism as a Puzzle Piece

Autism advocacy and charity groups (such as Autism Speaks) like to depict autism as a puzzle piece. Oddly enough, so do I.

When I first learned about autism in detail, I was already forty-five years old and the subject had barely crossed my mind before that time. What had been on my mind over the years were questions regarding humanity and its universe, for you see I am one of those poor souls prone to ruminate over such mysteries as language, intelligence, science, psychology and history. Those ruminations had taken me not all that far; it seemed I had barely formed the frame of some gigantic puzzle—one of utmost importance to be sure, but one also quite empty near its middle, a crucial hole right there at its center.

Then without warning, autism knocked at my door.

Like Suzanne Wright, I could have said autism had knocked on the wrong door, for given the way it was being introduced to me—as mental illness, as medical catastrophe, as a burden only autism advocacy and charity groups could offer to remove—viewed like that, autism clearly had no bearing on my unfinished puzzle at all. But at forty-five years of age, I had no preconceived notions, I had no existing stake in autism’s conventional wisdom; so I felt at liberty to examine the subject from all sides, felt audacious enough to dare to flip it over. And lo and behold—now no longer seen as mental illness, no longer viewed as medical catastrophe, no longer taken as the pity fodder for autism advocacy and charity groups—autism snapped cleanly into place, the snuggest puzzle piece I could ever hope to find.

Sunday, November 23, 2008

The Simple Greeting Exchange

“How are you?”

“Fine, thank you.”

The simple greeting exchange—which technically does not constitute a meaningful use of language—serves as fertile ground for understanding some fundamental differences between autistic and non-autistic perspectives, and also serves as an occasion for respecting and valuing those differences.

The simple greeting exchange, especially as practiced by non-autistic individuals, is the quintessence of biological immediacy. If the verbal aspect of the exchange can be said to be about anything, it is about what is already biologically present—two members of the species Homo sapiens interacting within the same immediate time and space constraints as the conversation itself. That is why it can be said that technically speaking, the exchange is not a meaningful use of language. Language is ordinarily the use of a biologically immediate artifact (spoken sounds, for instance) to represent something that is not biologically present (an event removed in space or time, for instance). In the simple greeting exchange, everything that needs to be conveyed is already present, so of what purpose is the language?

In the simple greeting exchange—with the exception of the words—everything is ancient and complex. Such exchanges have been taking place on this planet from almost the beginning of biology itself, and although each exchange happens in no more than an instant, each occurrence also conveys a cornucopia of species-driven information forged from the long-burning furnace of evolutionary time. Observe two members of almost any animal species as they come together—ants along the trail, lions in their den, barracuda on the prowl—each exchange is just as eloquent, just as informational, as any end-of-the-month business transaction. In mammals—and in primates especially—this exchange is brought about by means of a precise set of sensory-based conventions—eye contact, body posture, clucks and coos, nuzzles, sniffs, licks, etc. Not a word ever needs to be spoken; and in all the other animal species, not a word is spoken. The other animal species have no conception of a yesterday or a tomorrow, no conception of a mile to the east or a mile to the west; so the other animal species have no need for language. And least of all do they have need for language during the simple greeting exchange—the epitome of conversational biological immediacy.

Humans were like this once too; and if we remove their words, we see that they still are.

Language was a late arrival on the human scene, and its purpose is far removed from being an aid to conversational biological immediacy. If the words of the simple greeting exchange can be said to have meaning beyond just their immediate occasion (an occasion that needs no words), it might be said that they serve as a marker, an indicator—they mark the occasion of a biologically immediate conversation. Humans—and in particular non-autistic humans—take the words of the simple greeting exchange as the signal that the intricate dance of eye contact, body posture, etc. has begun. Or to give the notion more sophistication, humans might be said to be using the simple greeting exchange as an indicator that this is not an instance of biological immediacy qua animal, but instead an instance of biological immediacy in the context of greater civility. (But then again, each participant was already aware of that.)

In the simple greeting exchange, the text dissolves to nothing, and the subtext expands to incorporate everything.

Autistic individuals often find themselves discomfited by the simple greeting exchange. One of the reasons for this is that autistic individuals will often take the simple greeting exchange quite literally—that is, they will take it as an expression of language. For instance, they understand “fine” to be an accurate report of the other person’s status, only to discover perhaps that circumstances are otherwise (“Couldn’t you see the anger in my face?” the other will say). In response to “How are you?” or “What’s been happening?”, autistic individuals will often provide a detailed and factual account of their recent situation—spatially, temporally and logically arranged—and will find themselves bewildered when they realize the other person is bewildered by the reply.

Language, by its original intent, bridges the gap between biological immediacy and the more remote realms of space, time and non-biological structure and pattern. Autistic individuals intuitively understand this, because cognitively speaking, they exist far more comfortably inside those remote realms.

“How many ceiling fans do you have in your house?”

“What an interesting question! I have three ceiling fans in my house.”

“What rooms are they in?”

“Let’s see…there is one in the living room, one in the bedroom, and another one on the porch.”

“Is the ceiling fan on the porch spinning?”

The autistic greeting exchange is a work of art, although it is seldom recognized as such. It is a work of art primarily because it uses language almost exclusively in its original and creative form—as a biologically immediate artifact intended to represent, or to inquire about, an event spatially, temporally or biologically removed. When the autistic greeting exchange goes well, an autistic participant feels informed, and thus also feels comforted and welcomed—the same feelings a non-autistic individual receives upon a successful simple greeting exchange.

In the autistic greeting exchange, the text encompasses everything, and the subtext disappears.

Imagine a behavioral speech therapist trying to teach the simple greeting exchange to a young autistic child—employing countless discrete trials, wondering why progress is so painfully slow, perplexed by how the skill does not transfer outside the training room. But if the therapist were instead to teach the child the autistic greeting exchange—incorporating ceiling fans, light switches, Thomas the Tank Engine, or whatever else might be of interest to this particular child—would not the child’s attention perk up almost immediately? Would not progress be considerably faster and the skills more widely practiced? “But that is not the goal,” the therapist will object. “The goal is not to have the child greet people with inappropriate banter about ceiling fans, light switches or Thomas the Tank Engine. The goal is to have the child greet people with the simple greeting exchange.”

“Inappropriate” is indeed the correct adjective here, but not applied to the banter. Is the aim of this therapy to teach the child the use of language, or to make the child indistinguishable from all his peers?

Autistic individuals will often bemoan the pettiness and insincerity of the simple greeting exchange, but that is a misunderstanding—they are overlooking billions of years of intricate and essential biology.

Non-autistic individuals will often decry the inappropriateness of the autistic greeting exchange, but that is also a misunderstanding—they are overlooking the glory, and the origin, of a creative use of words.

(Thanks to Bev at Asperger Square 8 for the inspiration.)

Wednesday, November 19, 2008

Good Housekeeping

I can hear a clamoring for more detail, for more evidence: outline your experiment, enunciate your theory, where is your data? But I cannot scurry fast enough from all that, I cannot step far enough away.

Thousands of researchers cook up millions of details each and every day, but who bothers to arrange the shelves, and who will tidy up the mess?

Sunday, November 16, 2008

Diagnosis is the Wrong Word

We say that autism is diagnosed, but the word diagnosis rests on a shaky foundation, the danger of which is never more apparent than when we promote early diagnosis for young children.

To employ the word diagnosis means to accept the assumption that autism is a medical condition or a mental illness, but this is precisely the assumption most in doubt. The medical community, which on one hand acts with unbridled certainty that autism is indeed a mental disorder, a brain-based illness, on the other hand admits that the cause of autism remains entirely unknown and that there is no known effective cure. So where does the certainty come from in the first place?

What if autism were not a medical condition, not a mental illness? Would we be able to discern that possibility, given that we have put the blinders on?

Those who speak favorably of early diagnosis for young children inevitably follow with the phrase early intervention. But how can early intervention be considered safe and effective if there is no known cause? How can early intervention be deemed appropriate if we remain uncertain of what autism is? The treatment of a non-existent illness is not necessarily benign.

Not that long ago the diagnosis of autism was an extremely rare event. Are we certain that those who went undiagnosed—and who therefore went untreated—are we certain their outcomes were inevitably tragic?

The correct word is recognition. We now know enough about autism to recognize it in certain individuals, including some who are very young. But recognition is all we have.

Wednesday, November 12, 2008


But we had already seen this medicine show! Hegel’s buffoonery once pulled the wool over the eyes of an entire academic generation and sent countless grandstudents into a Rube Goldberg slumber. Note the reaction of both Schopenhauer and Kierkegaard—cogent, inspired, extraordinarily well written, as though the point to be demonstrated above all else is that when one has something worthy to say, one takes the trouble to say it well.

Sunday, November 9, 2008

Universal Grammar

The term is correct, but universal grammar has been given a meaning that does not match what those two words actually say.

Noam Chomsky’s early work in linguistics deserves the highest praise. Before such efforts as Syntactic Structures and Aspects of the Theory of Syntax, linguistics was stuck in a quagmire of piecemeal analysis—an irrelevant quicksand of phonemes, morphemes and dead-end semantics. Chomsky unearthed language’s structural essence and gave it prominence and value, and his clever introduction of the tools of logic and recursive mathematics furnished linguistics with a language of its own, one that remains useful to the present day.

But Chomsky badly misguessed the source of language’s structural underpinning, and in fact it is a bit of a puzzle he had to make a guess at all. Having spent countless obsessive hours working out the many transformational rules of verbal syntax (his complete Logical Structure of Linguistic Theory is so massive I believe it has yet to be published in its entirety), Chomsky had ample opportunity to recognize that his linguistic schemas had much in common with the formal rules of physics, mathematics, logic, chemistry, digital electronics and so many other non-biological disciplines. Space, time, proof, natural law, formal syntax—these concepts are in many ways so structurally similar they border on being isomorphic; so how could Chomsky have failed to recognize that as he was sketching out the structure of human language, he was also sketching out the basic structure of the perceived world? But such was the allure of brain science dogma even some forty odd years ago—Chomsky turned to biology instead and posited an instinct for human language.

To be fair, Chomsky was handicapped by two critical pieces of evidence: one piece of evidence, in plain sight, proved to be an over-enticing red herring, and the other piece of evidence, far more useful and productive, was alas not available to Chomsky at all.

The red herring of course was the speed and seeming ease with which most children acquire a spoken language. That an instinct was at the heart of this process was undeniable, but first to be determined was whether an already known instinct could adequately account for the phenomenon. The young of many species pass through a relatively brief period of rapid assimilation of the species’ behaviors—learning to hunt, to find shelter, to meld into the group’s social dynamics, etc.—all leading very quickly to adult-level skills in the areas of survival and procreation. What these maturational activities make clear is that it is not so much the activities themselves that are instinctual as the pedigree of being intensely species aware and species assimilative. Acute species recognition, to the degree of nearly complete perceptual exclusion of all other sensory input, is the common evolutionary thread explaining how the young of so many species rapidly transform into nearly exact behavioral copies of all the other members of the population. And of course humans have been no different. When humans were once silent hunter-gatherers, their children rapidly matured into being exactly the same, taking on many fully developed roles by the age of puberty. And when humans swiftly transformed into being verbal and more civilized, the children did not skip a maturational beat, just as rapidly assuming the new set of common behaviors, and at a very early age. What most children have an instinct for is to do what other humans do; it has been that way for a very long time, and it still is.

But Chomsky was convinced language had to be something different. With page after page of formulas and recursions laid out before him, aware more than anyone of the complexity running throughout the entirety of semantics and syntax, Chomsky found it inconceivable so much surface variation and core structural similarity could be acquired quickly by species assimilation alone. In this, he was being hurt by his failure to see that it was not just language that was being acquired, but also all the behaviors, conventions and perceptions mirrored inside of language, along with their corresponding degrees of structure and complexity. For Chomsky, language seemed to be monolithic, independent of the other new aspects of human behavior—independent of the changing human environment. Furthermore, language seemed to be something that had to be inherently unique to this one species alone. Hunting behaviors, sheltering behaviors, hierarchical behaviors—these too are extraordinarily complex and their quick absorption is no less amazing than the absorption of language; but with thousands of other species serving as example, and with the long reach of evolutionary time helping to soothe any concerns over how such behavioral complexity might be taken on, a scientist is less apt to doubt the species assimilative forces when applied to such time-honored and widely distributed skills. Not so language, the late-arriving exemplar nonpareil. And even supposing Chomsky could have brought himself to accept that language might be absorbed by the usual species assimilative means, this would only have raised a much larger question in his head: where did language come from in the first place? Having appeared quite suddenly on this planet, and having arrived nearly full blown as it were, like Athena from Zeus’s head, language could not chalk up its origin, at the very least, to some typical hand-me-down inter-generational event.

Faced with language’s seemingly unique standing in the biological world, and hampered by the unanswered question regarding its origin, Chomsky resorted finally to some scientific magic and proposed an entirely separate instinct for human language. Thus in one fell swoop Chomsky turned language into something biological, genetic, neural, evolutionary, and above all else, restrictively human. The term universal grammar debuted as an ironic phrase, for now there was nothing universal in the concept at all.

The more productive piece of evidence Chomsky did not have access to was an accurate description of the condition known as autism. Autism of course was known in the 1960s and 1970s, but at that time was regarded as little more than a medical catastrophe, its gravity compensated for only by its extreme rarity. Those few autistic individuals who were recognized in Chomsky’s day, both from the acuteness of their condition and from the cruelties likely being perpetrated upon them, would not have been able to provide many useful clues in a study of general linguistics. It would take at least another twenty years before the medical community would begin to realize autism was a condition not necessarily so devastating—and not coincidentally, not all that uncommon—and of course even to the present day the medical community continues to struggle under the delusions from that misguided past.

Autism, when more accurately described, tells a much broader story than has been heretofore considered—a story touching directly upon, among many other things, the history and construction of human language.

Fundamentally, autistic individuals possess a significantly lesser degree of species recognition and species assimilative capacity than do most other humans (and indeed, than do most organisms). For as yet unknown reasons, autistic humans do not readily perceive the human-specific features of their sensory environment, and in consequence they do not easily assimilate to the species itself. Therefore, initial autistic sensory perception goes forth mostly ungrounded, and early autistic cognitive development must run a gauntlet of a nearly overwhelming sensory chaos. In compensation and in varying measure, most autistic individuals form their cognitive grounding instead out of the non-biological features that inherently stand out from the surrounding environment—perceptions based upon symmetry, pattern, structure, detail, repetition, and the like. The unusual early behaviors of autistic children are chock-full of the consequences from these unique forms of perception and cognition, and the ongoing behaviors of nearly all autistic individuals—from childhood through maturity—show a marked preference for the more orderly, non-biological aspects of the objective world over the social, biologically-based features preferred by the human population at large.

Space, time, logic, mathematics—these concepts, representing the structural framework of the objective world, were first introduced into the human species through the medium of autistic perception, and they are the by-product of a compensatory form of autistic cognition that finds its essential grounding in the symmetries and patterns to be found in the surrounding environment.

Autistic individuals, however, despite their non-biological cognitive grounding, are biological creatures themselves and are therefore subject to the same experiential restrictions as any other organism. Space, time, logic, mathematics—these concepts cannot be directly grasped by immediate perception alone, they are not inherently part of immediate biological experience. To bring non-biologically based perception into the realm of immediate biological experience requires the aid of an intermediary; it requires the use of an artifact that can be immediately perceived but which also serves the purpose of representing something not biologically present. This intermediary is precisely that object we call language, and if autistic individuals have been responsible for introducing the realm of non-biological pattern and structure into the human species, they have also been responsible for bringing along its essential companion—they have been responsible for the introduction of language.

If Chomsky had been able to surmise this autism-inspired origin of human language, then perhaps he would have been less mystified by language’s ubiquitous structure.

As the early artifacts of human language—abstract gestures to some degree, but primarily spoken sounds—as these began to circulate around the globe, they quickly diverged in both vocabulary and surface form. But as Chomsky has rightly noted, the underlying structure of human language changed hardly at all, never varied in any appreciable degree from tribe to tribe, place to place, generation to generation. This split between language’s surface presentation and its underlying structural form captures exactly that distinction between the arbitrary nature of the object doing the representing, and the far more determinate nature of the object being represented. Only the artifacts of language can be indeterminate, only they can take on a nearly unlimited variety of form: hundreds of spoken languages, thousands of individual dialects, written and encoded extensions (shuffling language across the expanses of space and time), signs and symbols, charts and schematics. As humans have so amply demonstrated, almost any sense-perceptible item can serve the purpose of conveying a language—all that is required is some degree of convention—but if the artifacts of language can come from almost any perceivable source, what language represents is something entirely different. What language represents, by necessity and original intent, is something already perceptually determined.

Space, time, logic, mathematics, pattern, symmetry—these concepts, representing the form of the objective world, are precisely those concepts that must be reflected inside language’s foundational structure. Object and concept, noun and verb, temporal tenses, spatial adjectives, all manner of nuanced prepositional form—as autistics brought to humanity the patterned structures from the surrounding world, they also brought to humanity the conveying mechanism that by necessity had to assume that world’s inherent organizational form. There is no need to posit a genetic, biological or neural instinct to explain language’s ubiquitous structure: one need only look to the pattern and symmetry of the perceived world and realize that language has no choice but to be its mirror. And one need not confine language to the human species alone: any life-form open to the non-biological patterns of the surrounding environment will by necessity find itself relying upon the mechanisms of a deeply foundational language, because biologically speaking, there are no alternative means.

And so indeed, human language has been framed by a universal grammar—far more universal than Chomsky ever managed to conceive.

Wednesday, November 5, 2008

Music and Raven’s Progressive Matrices

Elsewhere I have noted that the Raven’s Progressive Matrices intelligence test may be the nearest thing humans have to a pure time and space test. The Raven’s temporal/spatial problem domain, natural to autistic perception and cognition, helps explain why autistic individuals perform so differentially well on that particular test.

Yet now that I think about it, there is at least one other human endeavor that shares a similar structure with the Raven’s test—namely music. The roles played by rhythm, melody and harmony are quite analogous to the roles played by time, space and conceptual pattern in Raven’s, and the well-documented affinity and natural ability for music that many autistic individuals display is further suggestive of an underlying connection.

Time, space, geometry, arithmetic, logic, melody, harmony, rhythm, games, rules, syntax—these concepts run along the contours and fault lines of autistic cognition, and understanding their connection helps to highlight the nature of autistic perception and to characterize its contribution to human endeavor. Music and the Raven’s intelligence test are not random gewgaws from the stream of life—their prominence derives from a correlation to the forces driving the human species forward.

Sunday, November 2, 2008

Health Coverage

Many efforts are currently underway to mandate insurance and state coverage for the various types of autism medical treatment. These efforts are premature.

Not in any way certain of the condition it is dealing with, and ignoring its charge to first do no harm, the medical community is offering a scattershot and dangerous approach to autism treatment—oppressive applied behavioral analysis, overpowering pharmacology, wide-ranging interventive therapy, and a few doses of biomedical quackery thrown in for good measure. What these treatments have in common is that none are designed to promote autistic capacity; all are designed to shut it down.

When attitudes have changed, when humanity has examined autism for what it truly is, when medical efforts have turned from suppression and cure—and towards autistic achievement—only then will autism coverage become a worthwhile investment, and not the drain of resources and human dignity that it currently is.

Wednesday, October 29, 2008

When Science Goes a Whoring

It is not for idle purpose I shine light on the community of autism research journals. The attitudes backing practices of piled-on authorship, sycophantic peer review, citation back-scratching and editorial board nepotism are the very antithesis of science. (And I have yet to mention the harm being done to autistic individuals.)

Sunday, October 26, 2008

The Dual Root of Human Language

Human language has shot forth from two distinct roots, from two different sources of fundamental influence.

The first source has been the temporal, spatial and logical pattern that constitutes the order of the surrounding, non-biological world. This influence is seen most clearly in language’s underlying structure: object and concept, noun and verb, temporal tenses, spatial adjectives, and all manner of nuanced prepositional form. This particular aspect of language did not arise from humanity’s biological and evolutionary past, but instead came from out of the struggles of an unusual interloper; it came from the autistic perceptions and cognitions that have gained a foothold within the human population. Autistic individuals, who by definition are not cognitively grounded by the usual species-aware perceptions, have created a cognitive grounding instead out of the patterns and symmetries to be found in the surrounding environment. But as biological creatures themselves, and needing to convey this unique form of perception both to themselves and to others, autistics have uncovered also the only biologically immediate means by which a non-biologically immediate perception can be represented—they have uncovered the essential accompaniment called language.

However, as has happened on many other occasions of autistic discovery and introduction, language was quickly adopted, transformed and widely spread by the more plentiful non-autistic members of the human population, and thus human language soon acquired a significant second root. This influence appears most noticeably in the core vocabularies of the world’s languages, which are dominated by words, phrases and metaphors derived out of humanity’s evolutionary and animal past, revealing that as the majority of humans were introduced to the possibilities of language, they quickly adapted its content (autistics might say they corrupted its content) to reflect those features of existence more natural and essential to them—the businesses of eating, excreting, tribalizing and procreating. Furthermore, the core content of human language is also that aspect most often accompanied by non-verbal cues and subtext, suggesting that not that long ago mankind conducted its survival and procreative business without the aid of abstract language, conducted it entirely through the medium of immediate gesture and inter-social behavior (much as the other animals do). Abstract language was the latecomer, and upon its introduction it was layered on top of an already extant form of biological communication, and both layers now play a crucial role in the immediate conversations of today. (And of course it is worth noting that it is the non-verbal, instinctively perceived aspect of human communication that autistic individuals have the most difficulty learning and mastering.)

In the early twenty-first century, human language remains a blend of its two sources of influence, and appears to resist most pulls towards becoming a homogenized mixture. The evolutionary, biological source of influence continues to hold prominence as the most frequently employed aspect of human language—from small talk to international diplomacy—and thus continues to serve its purpose of being the linguistic glue that helps hold the species together. But an examination of language’s accelerating number of changes and additions, especially those introduced over the last several hundred years, reveals how the autistic root of language has become increasingly influential over time, threatening to regain once more what might be described as its birthright. The periodic table, representational painting, box scores, blueprints—one does not have to look far to recognize how the non-biological, non-evolutionary aspect of human language has been rapidly transforming the behavior of the species and reorganizing the manner in which it communicates. And consider the education of children—the majority of whom can pick up the core, biological aspects of language by the time they are five—but who require with each new generation more and more time, and a much greater variety of instructional technique, to absorb just a fraction of the new language and new communicative structure that has been added in recent years.

There is much that can be learned about the current status of our own humanity by teasing out the various structures and contents of the expanding forms of our human language, an analysis that becomes fruitful only after recognizing that human language did not spring from a single source alone.

Wednesday, October 22, 2008

Lost and Found

Professor Simon Baron-Cohen and his colleagues have written about today’s “lost generation” of adults who are currently being diagnosed, long after their childhood has ended, with various forms of autism (usually Asperger Syndrome):

But what of the generation who were born before 1980, who may have had Asperger Syndrome but for whom there was no diagnosis available? No specialist clinical teams, not even the concept of Asperger Syndrome. How did they fare? The answer is that they were overlooked, and struggled through their school years. And the reason we run a clinic for the very late diagnosis of Asperger Syndrome is because these are the lost generation: those who today would receive their diagnosis by 6 or 8 years old, if they were a 21st century child. They come to our clinic in young adulthood or even middle age, and they tell us a now-familiar story.

All through their school years they had trouble making friends or fitting in. Many were bullied by the other children, both physically and verbally. Many felt, in Claire Sainsbury’s chilling words, like “an alien in the playground”. (This is the title of her excellent book). The lucky ones managed to stay in school long enough to get their SATs, and some got to university. But not without feeling their teens were an uphill struggle. By young adulthood many had suffered clinical depression and even felt suicidal. All because their underlying condition of Asperger Syndrome had gone unrecognized and therefore unsupported. Some of them had enjoyed the closeness of an intimate relationship only for this to break down. Some had found employment only for them to run into problems in the work place through not understanding what the employer and other staff might expect of them, or through getting into conflict, or being passed over for promotion because of their lack of team skills.

But only a moment’s reflection reveals something grossly amiss in Baron-Cohen’s description. The above paragraphs imply there has been only one lost generation of autistic adults—the generation born before 1980—but how can that possibly be? What about the previous generation, say the ones who were born in the 1940s and 1950s? Or those born in the early 1900s? Or what about those born in the 1800s, and even much earlier than that? Does Baron-Cohen mean to suggest autism descended en masse upon humanity sometime in the late twentieth century, and are we meant to understand there were no autistic adults living among us until a few began showing up outside the door of his clinic, desperate for relief (so he says) from the wretched circumstances he and his colleagues would have thrust upon them?

This generation of autistic adults Baron-Cohen is attempting to describe is only the latest in an impressively long line of autistic generations, reaching far back into humanity’s ancient past. That autism has gone unrecognized for so long suggests exactly the opposite of what Baron-Cohen is trying to say—it suggests there has been nothing lost, or even wretched, about any of these prior generations. Far from being lost, the current generation of autistic adults is indeed the first to be found, found living quietly and productively among us, as autistic generations always have. But if Baron-Cohen and his colleagues must insist on hunting for those who are supposedly gone astray, might I suggest they expand their search beyond their clinic door. For out in nearly every street can be found a generation of autistic adults much larger than even Baron-Cohen has managed to conceive, a generation doing so well it would never think to bother with his self-deluded clinic.

As I have written elsewhere, autism has had a significant presence within the human population for a very long time—a presence mostly silent, but not without consequence.

Sunday, October 19, 2008

What If the Flynn Effect Has Ended?

Some recent studies have suggested the Flynn effect might soon be ending (at least in the Scandinavian countries), and several authors, including Professor Flynn himself, have latched onto this possibility as if it were a lifeline tossed to a drowning man. Those who have regarded the Flynn effect as mostly a twentieth century phenomenon—an historical anomaly, as it were—are experiencing near-palpable relief at this hint the anomaly might soon be going away. Such a disappearance would end their fright, would end their gnawing fear that mankind is indeed growing ruthlessly more intelligent in a damn near inexplicable way.

But some of us are not so easily frightened. Those who have contemplated the entire history of mankind—from its animal-like existence not that long ago, through the sudden sprouting of complex civilizations beginning around six thousand years ago, to the frantically paced modern efforts to transform nearly every square inch of this entire planet (and those who have seen the imprint of the Flynn effect throughout that blazing history)—for us, any suggestion the Flynn effect might soon be ending (and coincidentally just now, right at the very moment of its discovery)—well, how are we to choke back our reaction without offending those who have become so terribly frightened, for really such suggestions are little more than laughable.

To be sure, over the past fifty thousand years the Flynn effect has certainly gone through some surges and ebbs. To focus on Western Civilization alone, the era of Ancient Greece, along with its enduring aftermath, would undoubtedly have been one of those periods during which the Flynn effect ran at peak. Just one perusal of the physical constructions of that age—the buildings, the written mathematics, the crafted and portrayed arts—and one sees an embodiment of pattern and form running far beyond anything mankind had ever experienced before. That embodiment, suffused throughout the populace and down through the generations, showered its re-creating and foundational intelligence across the Roman era and well into the first millennium A.D. It was not until the stretch of the Dark and Middle Ages that we discern a slowing down of this ordered construction—and only then a slowing down of the Flynn effect—but not a complete halting, mind you: Western Civilization at 1500 A.D. was still more rapid and complex than Western Civilization at 500 A.D. And elsewhere of course—in Byzantium, India and China—we find yet more examples of the Flynn effect crescendoing into various bursts of sudden and local bloom.

Beginning with the Renaissance, the pace of this structural change embodied into the human environment resumed once more its rapid acceleration, and over the last five hundred years, and across all manner of civilization, man’s temporal, spatial, and non-biological abilities and patterned formulations have been increasing at a nearly mind-breaking speed. The I.Q. tests of the twentieth century have captured only the most recent period of this ongoing phenomenon; if there had been intelligence tests available in all the previous centuries, the Flynn effect could have been discovered long before now. Professor Flynn has not stumbled onto anything new. He has stumbled only onto the most recent residue of a process that has been profoundly reshaping the human landscape from the time of the great leap forward, and if the impressive Flynn effect statistics from the twentieth century are to be telling us anything at all, it is that from sheer momentum alone the Flynn effect can be expected to remain with us, and sustain us, for a considerable time to come.

But allow me to offer a moment’s respite to those who are so terribly frightened. Let us consider for speculation’s sake what might happen if indeed the Flynn effect has ended. What this implies of course is that in theory any one of us would be able to score as well on any future-offered intelligence test (say, those being sat two hundred years hence) as we might tally on any of the currently offered standardized forms. This feat, we realize in retrospect, would not have been possible for those poor souls who lived in the early nineteenth century. Try imagining a man from the early 1800s nabbed suddenly by the scruff of his neck, hustled forward a couple hundred years or so, whisked by plane, taxi and elevator to a brightly lit, sharply cornered examination room, placed before a typed-out pamphlet of the strangest looking shapes and most oddly worded phrases—pen and stopwatch waiting impatiently there on the table beside him. What answers do we anticipate from such a man? What brilliance might we soon be expecting to hear, beyond, that is, his repeated stammering, “But dear sir, what exactly do you propose I do?”

Respite over. For honestly, how can we believe, frightened or not, that we ourselves would escape a similar fate? Try imagining yourself now, suddenly bolted forward towards the twenty-third century, hastened to an examination hall by powered means you cannot begin to describe, then suddenly strapped to a contraption of all manner of knobs and wires and switches (at least, those are the only words you can think of to describe such an oddly constructed panel), and now with flashes of three-dimensional light dancing all about you, with rapid questions poured upon you in a grammar you have never considered before, and all the while accompanied by frantic demands to respond with a quick jab of finger, a flicker of eyelid once or twice, or at least a simple grunt or two. And just about the time you manage to catch your breath, just about the time you gather enough wits to offer at least one feeble attempt, the lights of the examination hall suddenly darken, the rapid stream of questions comes to a jarring halt, and from out of the walls a stentorian, twenty-third century form of a tut-tut voice announces that your time is over and that your score has failed to register, at least on any significant range. Some intellect you turned out to be.

If the Flynn effect has ended, then so has the course of human progress. To embrace this absurdity would be to misperceive what the Flynn effect has been trying to tell us; it would be to misconceive the question, what is intelligence? There are no paradoxes to be explained away from increasing intelligence scores; there is only a befuddlement from brain science dogmas that have been turning our inquiries outside in. The Flynn effect compels us to remove intelligence from out of our heads and place it in the surroundings where it more rightly belongs, in the structured landscapes we humans have been building all around us, and will continue to build for a considerable time to come. A world increasingly more spatial, more temporal; an environment always more patterned, more frenzied—and what wonder can there be that we require newer generations to absorb each change afresh, and leave all the ancestors behind?

The Flynn effect has been shadowing the path of our human journey; it has been marking the pace of our considerable advancement, the one taking us all the way from savannah-bound primate to questing knight of a massive universe. Intelligence is not to be found in packets of scores alone: cast an eye wider, cast an eye across the twentieth century’s entire vista—from horse-drawn buggy to rocket in sky, from ground-hugging hovels to skyscrapers knifing air, from shovels and guns and axes to computers and networks and drones. Cast an eye across that entire scene, then say with conviction that the show is about to end. The sudden halt would jolt us right out of our skin; the end of the Flynn effect could only mean the death knell of all mankind. Allow me to save my fright for that possibility alone—for humanity’s darkest age indeed.

Wednesday, October 15, 2008

The Telltale Origin of Time and Space

Recognition of non-biological pattern and symmetry is the basis of our notions of time and space, and this recognition is not biologically natural. Its first appearance on this planet occurred mere centuries ago, and there is no indication of such recognition in the cognition and behaviors of the other animal species, or in the cognition and behaviors of early humans.

Thus the word normalcy does not attach to temporal and spatial recognition. To witness a spontaneous, biological occurrence of such recognition, we would need to look more carefully at the cognition and behaviors of those children we have ironically labeled as disordered.

Sunday, October 12, 2008

Six Thousand Years Old

If we are speaking of the age of the world as mankind currently perceives it—with all our science and mathematics and literature and art and engineering and astronomy and so on—then the creationists are certainly being far more accurate in their dating techniques than the evolutionists are. Six thousand years ago, this species had made considerable progress stepping in off the hunter-gatherer’s grassy plain, but still lived much nearer to that plain than to the current forms of modern civilization. In Mesopotamia, in Egypt, in India, in China, dazzling flowers were on the verge of bloom—transformations our scientists still have not accurately characterized, their means having been overwhelmed. To say that the world was created almost miraculously starting around six thousand years ago would not be pressing credulity to any appreciable degree—not if we are regarding that world through the gaze of our own eyes.

If the scientists can take any consolation from their miscalculation, it is that only they recognize flowers must have a context in which to bloom: to deny the steadfast background of the soil is no better than to suppress wonderment at the germination of the seed. The mistake of the creationists has been to ignore the very real context upon which this planet’s miraculous transformation took place, and the mistake of the scientists has been to remain dogmatically blind to the fact an anti-evolutionary transformation occurred at all.

Wednesday, October 8, 2008

The Common Wisdom

The theme binding these posts together is an unrelenting scrutiny of autism’s common wisdom.

You might think autism does not have a common wisdom, given the multitude of causation theories: genetic defects, brain abnormalities, environmental toxins, refrigerator mothers, vaccines, various combinations of all the above. You might think autism does not have a common wisdom, given the plethora of treatments: applied behavioral analysis, litanies of drugs, endless varieties of therapy, sensory integrations, biomedical concoctions, eugenics. You might think autism does not have a common wisdom, seeing as how we are as in the dark today as we were at the moment of autism’s discovery.

But autism does have a common wisdom. There is one assumption, one unquestioned belief underlying all these diverse theories and treatments, all this lack of progress. The one certitude so widely held is that autism is an indication of something gone medically wrong—an illness, a defect, a disorder, a blight mankind would be much better off without. That is the first step that altogether escapes notice. That is the initial turn in the wrong direction, the one leading right over the edge of a cliff.

In each entry here, you will find yet another attempt to shine light on that common wisdom—probing, exposing, holding the assumption up to the scrutiny of actual vision, holding the assumption up to the current status of our own humanity. With autism, we have talked ourselves into a pseudo-illness, and what a shame that must be for all, and what a lost opportunity for self-understanding; for under the spell of common wisdom, we have failed to recognize how the current status of our own humanity has been built upon the back of that pseudo-illness.

I will be saying this again and again: autism’s common wisdom has been blind from its very beginning, and we will gain no understanding of this condition—and no understanding of ourselves—until we step back from the darkness of that abyss.

Sunday, October 5, 2008

Language and Biological Immediacy

Language is the use of a biologically immediate artifact to represent something that is not biologically immediate.

All organisms, including humans, exist only within biological immediacy. That which is removed in space, that which is remote in the past or future, that which is not graspable through innate biological capacity—such events cannot be directly engaged or experienced by any organism. Every sensation, every urge, every action must transpire in the biological here and now, and there is no alternative—that is the essence of biological experience. And indeed, until quite recently on this planet, that has been the limit of biological experience.

Although an organism has no means to step outside of its biological immediacy, humans have demonstrated that organisms can use the material of biological immediacy to represent something else, to represent that which is not biologically immediate. It is in this way that language serves as a bridge to previously unknown and always unreachable realms: using language, an organism remains within the required confines of its biological immediacy while using something inside that biological immediacy—the material of language—to represent something beyond its biological immediacy.

The material of language can be almost anything—humans began with abstract gestures and vocalizations, and have recently adopted materials that can be touched and more widely seen or heard. What is transformational and important about language is not the material itself but the elements and structure being represented—events of time, space and other non-biological conceptualizations. Although an organism can never directly engage such elements, it can use its belief in the value and accuracy of the language representing such elements to change the course of its own biological immediacy. And thus it is that humans have in large part separated themselves from their evolutionary animal past.

If one wants to understand the origins and structure of language, one does not focus on the material of language, since that material itself is almost completely arbitrary. (And one goes even further afield to focus on mentalizations or brain processes, since mentalizations are little more than the re-creation of the arbitrary materials of language.) If one wants to understand the origins and structure of language, one focuses instead on the non-biological elements being represented, and wonders about their sudden recognition and their peculiar form. After all, this planet passed more than four billion years without any species ever considering time, space or any other non-biologically immediate concept. And humans too, they passed hundreds of thousands of years completely oblivious to anything outside their biological immediacy.

Keep in mind: that which is far removed from biological immediacy is also far removed from normal biological perception.

Saturday, September 6, 2008

Brief Respite

This blog will be on break for about a month. Posts will resume around the first or second week of October.

Tuesday, September 2, 2008

The Blind Fighting the Blind

I have little to say about vaccines: this blog is about the subject of autism, and vaccines have nothing to do with the subject of autism. But unfortunately, the society around me cannot seem to stop talking about vaccines and autism, so perhaps out of distress at the level of noise, I am going to make a candid observation about the two main parties to this so-called debate.

The current focal point of the autism-vaccine wars is the Autism Omnibus Proceeding, a set of hearings in which a rabid mob of charlatans, shrews, shriekers and legal bottom feeders have engaged themselves in an everything-but-the-kitchen-sink effort to extract a large sum of settlement money from the United States Treasury, and in which a team of respondent attorneys representing the Department of Health and Human Services—and by extension much of the established scientific community—has taken on the unenviable task of fending off this mob. But as it is, I can find almost no one to cheer for in this battle royal, although if pressed I might be willing to express a little sympathy for the government lawyers. Faced with an onslaught of crackpot theories, off-the-cuff evidence and emotional appeals to everything except the facts, the government legal team finds itself forced into adopting a strategy that at first glance might seem to be the most prudent, the most promising, and indeed, in a practical sense, is most likely the best. They counter this torrent of irrationality by responding with the established tenets and practices arising out of the autism research and medical community—they call on that community’s experts to offer a reasoned defense.

Ah, but there is the rub.

If this were a question of calling on scientific experts with an established track record of aid and understanding for the autistic individuals in their care, then indeed a reasoned defense would have a most salutary effect. But the established track record of the autism research and medical community—more than sixty years in its making—is now one of the most abysmal in the land. Indeed, the very concepts of aid and understanding for autistic individuals seem to have barely scratched the surface for this community, except perhaps as occasional lip service, for on its juggernaut path to becoming the medical profession’s preeminent growth industry, the autism research and medical community has shown little patience for anything, or anyone, hampering the course of its rapid expansion. Thus it is we see that community amassing positions, grants, chairs, journals, expensive equipment and massive hospital wings, while for the autistic individuals in its care we see it amassing overpowering drugs, invasive therapies, haphazard genetic testing, and a relentless confirmation of the words illness, defect and burden. In this age of heightened awareness of autism, in this era of expanding diagnoses stoking the growth industry’s flames, we see the autism research and medical community reaping all the rewards of the added attention and additional resources, while the autistic individuals themselves are reaping the back of that community’s hand.

To employ an example from the omnibus hearings themselves, we have the case of Dr. Bennett Leventhal, an expert witness brought forth by the government to counter the now stale argument that thimerosal in vaccines has spawned an epidemic of regressive autism (whatever that may be). To his credit, Dr. Leventhal manages to swat away the thimerosal hypothesis with the greatest of ease—for after all, a six year old child can swat away a fantasy. But before passing out handshakes, cigars and pats on the back all around, let us consider what else Dr. Leventhal manages to accomplish during his self-assured time upon the stand. He manages to accomplish a good deal of myth promulgation (mental retardation is co-morbid with autism 70-80% of the time; parents with autistic children have unusually high divorce rates). He manages to accomplish the all-too-common disparagement of autistic children as burdens to bear and a gigantic stress upon the family. He manages to accomplish the promotion of his profession’s superior “gold standard” tools and brand new centers of excellence—despite his profession’s dreadful history in exactly these areas—and he manages to accomplish a subtle plug for the use of risperidone in treatment. Perhaps it is just me, but more than anything during his brief and authoritative time upon the stand, Dr. Leventhal manages to accomplish—do you get this whiff too?—an air of arrogance for himself and a massive dose of condescension for all his patients. And of course it goes without saying that in the autism research and medical community, there are literally thousands and thousands of Dr. Leventhals.

The question must be asked: how are the subtly malignant efforts of such so-called experts any the less harmful to autistic individuals than the more obvious quackery of, say, Messieurs Geier and Geier?

In this battle of the vaccine-hating militia versus the scientific establishment, we have a perfect example of the blind fighting the blind. If we must insist upon supplying uniforms to tell the two sides apart, then I suppose we can characterize the vaccine haters as the willfully blind, and the autism research and medical community as the ignorantly blind, but the added distinction does little to make the battle any the more compelling.

Wednesday, August 27, 2008

Evolutionary Theory versus Creationism

When I happen upon two third-graders engaged in a heated debate about whose dad is the best, I walk on past. I’m an adult now.

Saturday, August 23, 2008

Faint Praise for the Brain

A healthy human brain is essential for the manifestation of human intelligence. But then again, a working ignition system is essential for a drive to the grocery store.

Today’s neuroscientists, cognitive researchers, evolutionary psychologists, etc.—they need re-schooling in the concepts of necessity and sufficiency.

Monday, August 18, 2008

The Many Autisms

The research community’s trend du jour is to characterize autism not as a single condition, but instead as a term embodying many distinct conditions, each with separate etiology (albeit unknown). Ami Klin, David Amaral, and Francesca Happé, for instance, have lent their support to such percolations.

Fracturing autism into more pieces might indeed prove useful in expanding the number of grant applications and research chairs, but if we hope to expand our understanding of autistic individuals, attention should be turned in the opposite direction. We already have too many autism diagnoses at our disposal—Autistic Disorder, Asperger’s syndrome, pervasive developmental disorder (and all the informal variations thereupon). What have these distinctions gained for us other than increased confusion?

If instead, autism’s umbrella were put back together and expanded to include similar conditions such as schizophrenia and bipolar disorder, we might eventually get around to asking some important questions, such as why do certain individuals perceive their world differently from the norm, and what are the consequences?

In retrospect, this fashionable period of the “many autisms” will be seen not as a time when we advanced our understanding of the condition, but as a tacit admission we did not possess the first clue of what we were dealing with.

Friday, August 15, 2008

Fountain of Information

My son has taught me about autism—certainly not the other way around. Why do researchers assume they understand their topic better than the subjects of their investigation?

Tuesday, August 12, 2008

The Distinguished Professor of Philosophy

I could spend the remainder of my days railing against that absurdity known as modern academic philosophy. Let me save us all some time and concentrate instead on one of its more fatuous examples—Professor A. C. Grayling.

I first stumbled upon Professor Grayling while he was still hacking out a career at the expense of a great man:

Once one has sifted his texts and has ceased to be dazzled by the brilliance of metaphor and the poetical quality, one finds much less argument, and very much less definiteness in the crucial conceptions, than is expected in and demanded from philosophical enquiry. This is disappointing.

I hold little hope for the present age, but I trust history will forever enjoy the irony of that assured lecture—Professor A. C. Grayling passing eternal judgment upon Ludwig Wittgenstein.

And from this noble launching pad, Professor Grayling has embarked upon a two-decade quest to define the very attributes of the word philosopher for my generation: university chairs, societal fellowships, a trenchant volume or two each year, a pleasant abode or so in the country, good food and good wine—lots and lots of good food and good wine. Do not get me wrong—it is not that Professor Grayling has been a renegade in this particular form of philosophical pursuit. Far from it—there are literally throngs just like him scratching out a similar existence in all the collegial wings. But Professor Grayling has established himself at the forefront of this knowledgeable horde, primarily by means of a considerable marketing talent. For not only has Professor Grayling proven remarkably successful in bringing his message to the masses, he has indeed brought the very essence of himself to the masses, and has thereby convinced a weekend breakfast audience that the trappings of a philosophy professor’s life constitute the good life of modern perspicacity. From editorial boards to off-Broadway theater, Professor Grayling has rubbed a hair-draped shoulder against nearly every intelligentsia-favored artifact from this all too leisurely age, and the Sunday supplement public has eaten it up. Ask anyone in the know: Professor A. C. Grayling has garnered quite the following.

Well, of course he has garnered a following.

And the definiteness in crucial conception propping this mass appeal? The brilliance of metaphor and poetical qualities tugging at the heartstrings of his admiring audience? Let us sift through Professor Grayling’s dazzling arguments on the subject of death:

The fundamental question is how to deal with others’ deaths. We grieve the loss of an element in what made our world meaningful. There is an unavoidable process of healing—of making whole—to be endured, marked in many societies by formal periods of mourning, between one and three years long. But the world is never again entire after bereavement. We do not get over losses; we merely learn to live with them.

But there is a great consolation. Two facts—that the dead once lived; and that one loved them and mourned their loss—are inexpungeably part of the world’s history. So the presence of those who lived can never be removed from time, which is to say that there is a kind of eternity after all.

I admit freely to my bias: I do not belong to the Sunday Times intelligentsia, I am not one of those who are in the know, and I am not a member of Professor Grayling’s admiring crowd. For me this excerpt, along with all the rest, is pabulum I could forgive only coming from a pre-pubescent child; how am I to tolerate it off the pen of a man trumpeting his abilities to think for himself? Elsewhere in his remarks upon Wittgenstein, Professor Grayling holds forth that Wittgenstein has the distinction of being the last of a breed—history’s final example of a non-academically trained philosopher. I cannot say for certain whether this assertion of Professor Grayling’s might indeed be true, but if it is, I would note it also marks the end of an entire era—the end of all that has been creative, useful and eye-opening in the realm of philosophical thought. For there has never been, and there never will be, a true philosopher of the academic kind. When I consider the example of Wittgenstein, and the others much like him—such as Thoreau, Schopenhauer, Kierkegaard and Nietzsche—I realize a truism is at work here that would be a danger to overlook: a philosopher for the ages cannot possibly be the philosopher of his day. And, of course, vice versa.

Listen. I am just a simple man from Indiana. I cannot distinguish the good life from a good swig of beer. I have not the slightest idea what it takes to be a philosopher. But I do know exactly what it takes not to be a philosopher, and if I could just get Professor A. C. Grayling’s fat ass up on a pedestal, I could put it on display for everyone to see.

Oh, wait—he has already done it for me.

Sunday, August 10, 2008

Two Sides of an Expressive Coin

Which came first for humans: 1) language, or 2) temporal, spatial, and other conceptual patterns?

Try imagining one without the other.

Thursday, August 7, 2008

Fractions Are Hard

I see that the autism research community has vigorously taken up my challenge to publish an article in which the list of authors exceeds the length of the article’s content. The latest attempt, although not quite adequate, is nonetheless impressive: Autism symptoms in Attention-Deficit/Hyperactivity Disorder: A Familial trait which Correlates with Conduct, Oppositional Defiant, Language and Motor Disorders (A. Mulligan, R. J. L. Anney, M. O’Regan, W. Chen, L. Butler, M. Fitzgerald, J. Buitelaar, H. Steinhausen, A. Rothenberger, R. Minderaa, J. Nijmeijer, P. J. Hoekstra, R. D. Oades, H. Roeyers, C. Buschgens, H. Christiansen, B. Franke, I. Gabriels, C. Hartman, J. Kuntsi, R. Marco, S. Meidad, U. Mueller, L. Psychogiou, N. Rommelse, M. Thompson, H. Uebel, T. Banaschewski, R. Ebstein, J. Eisenberg, I. Manor, A. Miranda, F. Mulas, J. Sergeant, E. Sonuga-Barke, P. Asherson, S. V. Faraone, M. Gill, 2008).

However, as an outsider, I am perplexed about one particular aspect of this practice. In the reckoning of publishing credit—the kind that can be cashed in for tenure, grants, editorial board appointments, offices with a nice view and so forth (the crucial matters in the field of autism research)—does an author get one full credit for having his or her name attached to such a lengthy list, or does the publishing credit get divided among the list’s members? Because if the latter, I am having a hard time understanding how 1/38th of a publishing credit can go very far. Speaking for myself, if I had to participate in thirty-eight such enterprises to earn one full credit, I might find it easier just to do some original work and write it up on my own.
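If the credit truly does get divided, the accounting is simple enough to hand over to a machine. A minimal Java sketch (the class and its even-split scheme are my own hypothetical bookkeeping, not any journal’s actual policy):

```java
public class PublishingCredit {

    // hypothetical scheme: one credit per article,
    // divided evenly among all listed authors
    public static double creditPerAuthor(int authorCount) {
        return 1.0 / authorCount;
    }

    // articles an author must co-sign to accumulate
    // one full credit under the even-split scheme
    public static int articlesForFullCredit(int authorCount) {
        return authorCount;
    }

    public static void main(String[] args) {
        // thirty-eight authors on the cited paper
        System.out.println(creditPerAuthor(38));       // about 0.0263
        System.out.println(articlesForFullCredit(38)); // 38
    }
}
```

At thirty-eight authors, each name on the list is worth less than three hundredths of a credit.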

Or am I being too naïve?

Monday, August 4, 2008

Autism and Disability

The statement that autism is a disability can stir controversy, all the more so because the grammar of the word disability is itself imprecise. That imprecision notwithstanding, let me offer two observations I think shed some light on the subject:

  • There are many autistic individuals who are not, have not been, and will not ever be disabled.
  • There are many autistic individuals who are currently, have been in the past, and/or will be in the future disabled.

Those two assertions suggest that discussing the concept of autism alongside the concept of disability makes perfectly good sense, but that predicating the concept of autism with the concept of disability is far less precise—if not downright fallacious. We do not know what produces disability in certain autistic individuals—attributing that outcome to autism itself is only an assumption, and in my eyes, not a very good one.

Friday, August 1, 2008

Bicycle Built for Three

Autism, schizophrenia and bipolar disorder (manic depression) are listed separately in the Diagnostic and Statistical Manual of Mental Disorders, but a discerning eye will recognize the commonality among these three conditions:

  • Each is classified as a mental illness without any evidence of sickness.
  • Each is diagnosed through peripheral characteristics instead of direct etiology.
  • Each manifests as a cognitive perception distinct from the species-driven norm.

These three conditions are eerily similar, and for a misunderstanding humanity, they ride past as triplets in crime.

Tuesday, July 29, 2008

Dawkins, Hitchens, Harris, and Dennett

Short-sighted science, intellectual fatuousness, a simpleton’s atheism—who needs to believe in a Hell’s afterlife when we have the tortures themselves right here?

Sunday, July 27, 2008

Choosing Up Sides

My stand is always with the autistic individual, never with the relative of any autistic individual.

That is not to say a parent of an autistic child cannot win me over. The ones who argue doggedly for their child’s abilities, who fight and scrap for environments always affirmative, who celebrate autism along with every other feature—my stand is also with them, not because of their status as parent, but because they have chosen the constructive course.

And those parents who make themselves heard by disparaging their autistic child? Who clamor ad nauseam for their child’s limitations, and insist limitations alone be heard? I oppose these with bitter force, not because I have ignored their standing as parent, but because I am resolved to human decency—it is there I draw the line.

Thursday, July 24, 2008

The Prerequisites of Language

Nearly all animal species have sufficient biological equipment for producing an abstract language—they can make sounds, they can gesture, they can rub against one another. With access to some mud, most organisms could write things down.

Almost any physical artifact can serve the purpose of conveying a language, as we humans have by now so ably demonstrated. What the other animals lack is what language represents. For what good is an abstract language when one’s entire world is already present, always in the here and now?

And we humans too—we had nothing to talk about until just so recently.

Tuesday, July 22, 2008

Evidence Against the Flynn Effect

In this era of Google maps and GPS systems, it seems most delivery drivers still need to call and ask for directions.

Sunday, July 20, 2008

Control Group

For clinical trials of various autism treatment options, why does no researcher ever think to make use of the most obvious control group—the autistic children who have escaped diagnosis? It should be a simple matter to round them up from the institutions, graveyards and other dumping grounds of irretrievably broken lives.

Or is it not that easy to track them down?

Thursday, July 17, 2008

The Two Forms of Human Logic

Humanity currently perceives its world with the aid of two distinct forms of logic.

The first logic derives from the evolutionary inheritance of our animal past. Its goal is survival and procreation of the species, and its impact has been to conceptualize the sensory world into food sources, danger, sex, shelter, and so on. Darwin’s genius was to expose the structure of this biological logic and to lay out its influence as seen from the outside and experienced from within. Non-autistics are born naturally into this form of logic—it is the other logic they must learn to acquire.

The second logic has been extremely recent in its genesis. Its goal remains unclear, but its impact has been to conceptualize the sensory world into pattern, shape, space, time, and the like. The genius of the Greeks and the fruit of the Renaissance were to expose the structure of this non-biological logic and to lay out its influence as seen from the outside and experienced from within. Autistics are born naturally into this form of logic—it is the other logic they must learn to acquire.

Neither form of logic by itself appears to have transcendent power, but combined, they have rapidly transformed a species and its world. Combined, they have transfigured individuals.

An Aside to the Two Forms of Human Logic

Wittgenstein would have employed the word grammar instead of the word logic, and in many ways, grammar is the more illuminating choice.

Wednesday, July 16, 2008

The Printing Press and the Internet

Each quantum leap in language dissemination breaks the stranglehold of a fossilized institution—the Church formerly, academia now.

Medieval and mediocrity share a similar root.

Sunday, July 13, 2008


Someone should take the time to perform an MRI study on personal computers—say Intel-based machines versus some Macs. Each comparison group could be resonance photographed while performing the same task, for instance the monthly payroll. I suspect there will be some differences.

Since the Intel-based machines are in more widespread use than the Macs, their images could be taken as the healthy ones. The areas of highest concentrated glow might be described as the presumptive location for a monthly payroll module, and the authors of the study could claim they have greatly advanced our insight into the concept of computing. By comparison, the electronic flows of the Macs could be described as disordered, and various treatment plans—such as battery boost, a well-placed bobby pin, or just a good shake—might be suggested.

What today’s brain imaging studies show most clearly is our own muddled thinking.

Friday, July 11, 2008

Phantom of the Pinker

Is our tool use mental? Do we toil with hammers, wrenches, knives and awls because we thought them up at one point? Is there a brain module for each tool—a saw module, a lathe module, etc.—or is there instead an all-purpose module for tool use?

Or does this talk of tool modules sound silly?

Then why do we treat language differently? Humans now have many different forms of language, but each form manifests as a physical and immediate fact—sound vibrations, movements of fingers, marks on a page, flips in computer memory, etc. I can imagine a spoken conversation, but only because there have been such conversations, they have actually existed in physical and immediate reality—just like hammers, wrenches and saws.

Mentalizing language distracts us from what language actually is. And dreaming up language modules for the brain sounds like the work of someone confused by what exists right before his eyes.

Tuesday, July 8, 2008

The Crucial Importance of Early Diagnosis

It seems the world is now awash in paeans to the importance of early diagnosis of autism in young children, and I guess looking back on the experience with my own son, I would have to go along. In our case, early recognition kept our family from accidentally stumbling into such traps as ABA, Risperdal, or chelation therapy, and it is hard to imagine what horrors we might be dealing with around here had we inadvertently gone down one of those roads.

Sunday, July 6, 2008

Autistic Children Grow Up

Harold Doherty has written recently about one of the irrepressible aspects of his autism reality—namely that autistic children grow up—and I would like to take a moment to wholly concur with that observation.

Autistic children do indeed grow up. In the case of my son, I find him fast approaching a milestone I can only describe as filling me with a type of very real fear. You see, he is nearly 48 inches tall now, and around these parts that means he can soon ride on the biggest rollercoasters—accompanied by an adult, I am sorry to say. Up to this point, I have managed to control this rollercoaster riding behavior through the use of many carefully arranged discrete trials, dutifully noting all the height restriction signs around the amusement park and accurately pointing out the clear difference between the top of Brian’s head and the bottom of the delimiting line. But that clear difference is not so clear anymore, and besides, my rapidly aging arm can only take so many more tugs upon its fast lengthening sleeve, so I am afraid I must face up to the obvious fact—that inevitable day is about to arrive. Then up up up we must ruthlessly go, and if I dare to look I know I will see Brian there on the seat beside me—hands clasped, rubbing and flapping, his laughs much like a kind of madness, the shrieks a little too loud, his giggles far too inappropriate (well, how can they be described as anything but inappropriate when I myself have a death grip on the restraining bar and my mouth formed into the shape of a giant O?)—and then down down down, rushing, twisting and bashing about, the perfect metaphor for those chaotic experiences we as parents of autistic children know all too well. And then the briefest interlude near the very end, the tiniest respite before the terror-inducing words are spoken once more—“I want to go again.”

But perhaps Mr. Doherty would rather I wax more serious on his chosen topic, for indeed, as he has rightly noted, autistic children do grow up. So let me speak of another reality my son will soon be facing, that of entering school—entering school, that is, if my wife and I can ever come to a decision about where that experience should best be had. On the one hand, the possibilities seem far too numerous—public school, private school, home school, special needs, Montessori—but on the other hand the choices seem not nearly adequate enough, for what educational setting can possibly meet our son’s many divergent needs? What school system is going to remain flexible enough to accept him pacing the halls when the urge so urgently strikes, and also allow him to drill deeply into a set of encyclopedias when a particular topic has caught his fancy? And how to avoid the bullies? And how to encourage the making of friends? Heck, how to trick him into eating a cafeteria lunch, considering how stark his diet currently is? The challenges are certainly going to be many. The potential problems will undoubtedly be troubling. The world is a daunting place, my wife and I well know it, and launching our son onto life’s expansive path fills us both with a kind of awe and dread. But then we recall how our young traveler is of the category autistic—and thus how his potential is endlessly surprising and creative—and then we relax and smile just a little, for really, how can we ask for anything more?

But I suspect that vision will remain much too short-sighted for Mr. Doherty’s taste, who having perceived that autistic children grow up, has contemplated the consequences all the way to their very end—all the way to the doors of institutions. Well, why not? We too have institutions hereabouts, some of them particularly well suited for handling the troubles of a child such as my own. After all, what else can I be expected to do with a son endlessly obsessed with ceiling fans, a son constantly fussing about with knobs and buttons and switches? Unless experts recommend otherwise and insist he be committed to a more specialized place of residence—such as M.I.T.—then I will have to make the call to Purdue, I think, it is there I will have him placed. (My wife, however, focused on the issues of Brian’s perfect pitch and rhythmic singing, suggests the wards of Juilliard would make a better home.) It is possible I am being much too pessimistic, of course—perhaps Brian will surprise us all and defy our most carefully crafted expectations, striving first to do some simple, honest work and only later discovering his more expanded calling (such as Ms. Dawson and Ms. Harp seem to have so ably done). What fills me with the greatest emotion, however—and here I think Mr. Doherty could hasten to agree—is thinking about what might happen if I were not there to help Brian make the most difficult decisions. What would happen if, God forbid, both I and my wife were irretrievably gone? What if Brian were somehow forced to rely upon his own unique perspective—along with whatever meager tools his parents had managed to instill? What if he had to decide for himself what indelible marks to cast upon his world? What if he had to go forth as an individual? Have I fully considered those possibilities, have I contemplated the consequences all the way to their very end?

Yes, Mr. Doherty, autistic children grow up—what an inspiring thought that is! What incredible terror, awe and joy!

Friday, July 4, 2008

Fresh Air

When nearly everyone has become lost examining the details on the barks of all the trees, the one who maps the forest performs a great and thankless task. And the one who charts the country surrounding the forest—he does an even greater and still more thankless service. And the one who suggests that insight is to be gained in the forest and in the country—and not on the barks of all the trees—he gets to play the role of today’s pariah, and tomorrow’s savior.

As always, when the problem has become intractable, the way out is to examine the context. Forever digging deeper into details only clouds the landscape with dust.

Wednesday, July 2, 2008


The modern scientist, with a hint of superiority, will often extol the steady and methodical pace of scientific progress. But consider the work of Newton, Darwin and Einstein—what was steady and methodical about that?

The systematic advance of science is the smell of science gone bad.

Sunday, June 29, 2008

Abusing the Source – An Example

For each person inspired by Wittgenstein, there are a thousand academicians explicating Wittgenstein.

Abusing the Source – The Program

public class AbusingTheSource {

    public static void main(String[] args) {

        // inheritance from our past
        java.util.Random generator = new java.util.Random();
        String[] visionaries = {"Wittgenstein", "Beethoven",
            "Kierkegaard", "Einstein", "Nietzsche"}; // etc.

        // current reality
        String[] academicians = newBatchOfAcademicians();
        String person = newPerson();
        boolean isGriswoldHeard = false;

        // the foreseeable future
        do {

            // the source of academic wealth
            String visionary =
                visionaries[generator.nextInt(visionaries.length)];

            // demonstrate one-to-one correspondence
            // of something to nothing
            System.out.println("\n" + person +
                " inspired by " + visionary);
            for (String academician : academicians) {
                System.out.println(academician +
                    " explicating " + visionary);
            }

            // prepare for tomorrow
            academicians = newBatchOfAcademicians();
            person = newPerson();
            if (isGriswoldHeard) {
                academicians = removeOneAcademician(academicians);
            }
            if (academicians.length < 1000) {
                isGriswoldHeard = true;
            }

        } while (academicians.length > 999);
    }

    private static String[] newBatchOfAcademicians() {

        // factory work -- suitable for a machine
        String[] newBatch = new String[1000];
        for (int i = 0; i < 1000; i++) {
            newBatch[i] = "Academician " + (i + 1);
        }
        return newBatch;
    }

    private static String newPerson() {

        // simplicity personified
        return "Person";
    }

    private static String[] removeOneAcademician(
            String[] academicians) {

        // an arduous task
        // (TO DO: stress test the hardware)
        String[] aMoreQuietBatch =
            new String[academicians.length - 1];
        for (int i = 0; i < aMoreQuietBatch.length; i++) {
            aMoreQuietBatch[i] = academicians[i];
        }
        return aMoreQuietBatch;
    }
}



Abusing the Source – The Precursor

For each person inspired by Christ, there are a thousand theologians explicating Christ.

Sunday, June 22, 2008

Early Warning Signs

The following two assumptions enjoy widespread acceptance within the academic and research communities:

  • Human intelligence is centered within the structures and dynamics of the human brain.
  • The unusual early behaviors of autistic children (e.g., lining up toys, echolalia, spinning and twirling) are an indication of a neurological disorder.

But I would have us consider an alternative view to both intelligence and those early warning signs of autism, for not only are the above assumptions false, their negations directly support each other.

It has become tantamount to dogma within the scientific community that intelligence is centered within the human brain—I am certain I will not make the slightest dent in that conviction anytime soon. Armed with ever more sophisticated brain imaging technology, and fortressed by countless experiments attempting to match neuronal activation to a plethora of human tasks, the world of cognitive research expects soon to discover the exact locations of logic, language and the arts, and hopes not long after to describe the intellectual mechanics pulling together this tangle of synapses, cortexes, and brain matter plasticity. The pictures are indeed vibrant, and the metrics are certainly bountiful; but conceptually, I am convinced all is not well.

Consider the hallmark features of human intelligence—pattern recognition, sophisticated visual-spatial capacity, conceptual logic, mathematical and musical skill, the pragmatic use of abstract language. Does it strike no scientist as even the slightest bit unnerving that these features first made their appearance only quite recently in human history? Almost nothing of what we take and measure for human intelligence today can be found in the behaviors of Homo sapiens from just thirty to fifty thousand years ago, and it is only the slightest hint of such ability that begins to emerge near the dawn of recorded time. We gaze with anticipation into our fMRIs and calculate hopefully our degrees of significance, but we forget we peer not into the brains of just our contemporaries, but also into the brains of our more distant ancestors. And so we must ask—why the sudden and late emergence of all this cranial intelligence for which we so fervently delve? Why could we not have built our modern civilizations way back then, when these same brain structures and capacities already existed? Why should we have tarried until just so recently?

And of course there is that little detail known as the Flynn effect, the observation that intelligence scores have been increasing at roughly three IQ points per decade. What a puzzler that discovery must be for any neuronal-based model of human intelligence, and no wonder those who have staked considerable reputations on such models—including Professor Flynn himself—seem so willing to explain this phenomenon away as mostly a twentieth-century anomaly soon to dissipate (demonstrating again that today’s scientist is far more ready to embrace an unlikely coincidence than question the foundations of an established career).
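Three points per decade is worth a moment of arithmetic, if only for scale. A minimal Java sketch (the linear projection is my own illustrative simplification, not Professor Flynn’s model):

```java
public class FlynnArithmetic {

    // the cited rate: roughly three IQ points per decade
    static final double POINTS_PER_DECADE = 3.0;

    // projected gain over a span of years, assuming the
    // cited rate simply holds linearly
    public static double gain(int years) {
        return POINTS_PER_DECADE * (years / 10.0);
    }

    public static void main(String[] args) {
        // one century at the cited rate: 30 points, two full
        // standard deviations on a scale where sigma is 15
        System.out.println(gain(100)); // prints 30.0
    }
}
```

Whole populations shifted by two standard deviations inside a century—quite a lot to ask of an unchanged brain.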

Me, I would much rather be bold. I would rather negate that first assumption, and place the locus of human intelligence firmly outside the human skull.

If instead of sourcing intelligence within the human brain, we distribute it throughout the environmental structures we humans have been building—and will continue to build—all around us, we arrive at a tangible locale for intelligence more directly fitting what we already know about the subject. Tens of thousands of years ago, the human environment was almost entirely biological, and so were the forms of our intellect. Food acquisition, shelter, warmth, sex, avoidance of deadly enemies—these were the sum cognitive focus for a species whose universe extended no farther than the boundaries of the tribe. No clocks, no yardsticks, no musical notes, no truth functions to decide yea and nay—not on the hunter-gatherer’s grassy plain. Suddenly we no longer live on that plain, our locale has dramatically shifted—growing both larger and more detailed in ways we have scarcely begun to conceive—and it is no coincidence that as our setting has so dramatically transformed, so have the forms and degrees of our intelligence. With our surroundings now filled to the brim with structure, pattern, complexity, repetition, embodied conceptualization—much of it decidedly non-biological—our abilities to navigate and master this strange new world are exactly the skill sets we measure when we place a human being in front of that little booklet called an IQ test.

It is not our brain that has been changing—there has not passed nearly enough evolutionary time. The human brain, plasticity and all, has grown not one iota smarter, for the human brain is not the seat of human intelligence. It is our physical environment—that is the tangible object that has grown so remarkably more brilliant. The form of the physical human surroundings—there can be found the long sought-after location of our increasing cognitive skill.

And the Flynn effect? It resolves into little more than a triviality under this new paradigm of intelligence, for it becomes simply another measure of the increasing amount of pattern and complexity we humans have been embodying into our environment year after year, along with the confirmation that each generation can absorb this new information and its strange, mostly non-biological form. Each generation finds itself born into a set of surroundings more complex, more detailed, more rapid than those perceived by the previous generations, and by necessity learns to navigate and to master, and lo and behold finds itself scoring better than all its progenitors on any test designed to capture intelligence. The Flynn effect is not a twentieth century coincidence; it is not produced by better nutrition, selective breeding or a socially-driven multiplier effect (and Professor Flynn, it most certainly is not produced by proximity to the local college). The Flynn effect has been with us from the time of the great leap forward, and assuming we can learn to embrace this phenomenon instead of so glibly dismiss it, the Flynn effect will remain with us, and sustain us, for a considerable time to come.

But I can hear your objections already ringing in my ears: have I not placed the cart before the horse? The essential question, you say, is not that we humans have been constructing a structurally more complex environment all around us and have been learning to skillfully live within it—anyone can attest to that—the essential question is what produced this remarkable transformation? Are not the splendors of modern civilization the result of human intelligence, the unquestionable evidence for its cranial existence?

Having absorbed enough logic from our current environment to know I would not want to be accused of placing a cart before a horse, let me state unequivocally it has not been pre-existing neuronal intelligence that has prompted the massive environmental transformation. If it is a cause we are seeking, then we must direct our attention outside intelligence and search for the catalyst someplace else. So let me turn this discussion to what must seem to be a completely different topic—the early warning signs of autism.

In the autism research community, the early behaviors of autistic children have been branded with the most miserable of reputations—I am certain I will not prompt the slightest halt in that practice anytime soon. Emblazoned with such adjectives as obsessive, fruitless, self-stimulating and meaningless, autistic behaviors have been cast into one of the psychiatric community’s most profitable and prolific targets, an abundantly fertile field for the development of eradicating therapies, minimalizing drugs, and the not-so-occasional slur. The grants are indeed impressive, and the size of the research teams has certainly grown massive; but conceptually, I am convinced all is not well.

Consider the form of those early autistic behaviors—lining up objects, repeating passages, fascination with circles, letters and numbers, turning on and off light switches, rocking, humming and all the rest. Does it strike no researcher as even the slightest bit unnerving that these behaviors are in no way random or chaotic, but instead cluster around the concepts of pattern, repetition, symmetry and simple logic? Tens of thousands of years ago such concepts had scarcely begun to scratch the human surface, and it was only quite recently this species dared to embrace such features as the ones making us distinctly noble. And yet we confidently pronounce the autistic behaviors as aberrant, we calculate with smug certainty their substantial difference from the norm, all the while forgetting how we as Homo sapiens have been suddenly jolted into being nothing like our normal animal selves. And so we must ask—why this contemptuous dismissal of behaviors we have not yet begun to fathom? Why, given our lingering uncertainty about the exact location and genesis of human intelligence, are we so intent on demonizing the most spontaneous occurrence of intellect’s fundamental form?

And of course there is that little detail known as the positive outcome, the growing evidence that many, if not most, autistic children can achieve a full potential, display considerable signs of ability, and mature to lives of admirable productivity—all without the benefit of psychotropic drugs, biomedical treatments or behavior-altering therapies (and some—can I even dare to say this—achieving such outcomes despite such interventions). What a puzzler those results must be for any disorder-based model of autism, and no wonder those who have staked considerable livelihoods on such models—almost the entire research community, it would appear—seem hurried to explain such outcomes away as trivial by-products, insignificant anomalies, outliers to be ignored (demonstrating again that today’s scientist stands far more ready to slander an experimental subject than jeopardize the source of any funding).

Me, I would much rather be bold. I would rather negate that second assumption, and place the early behaviors of autistic children firmly outside the category of neurological disorder.

If instead of classifying autism as a type of cognitive damage, we consider it as an alternative and valid form of cognitive perception, we arrive at a catalyst for human intelligence that does not require the magic of sudden neurological or genetic transformation. What better means to jolt a species from its strictly biological gaze than to have placed among it members with a decidedly different perspective, so that in addition to concerns of food, sex and enemies, there appear now those creeping influences of symmetry, pattern and repetition. The behaviors of autistic children are not learned, they are not the result of species imitation—they arise from a spontaneous need to make sense of experience ungrounded by biological form. And thus the autistic perceptions, and the behaviors resulting from them, they open windows onto concepts mankind has never seen before, they open windows onto the myriad examples of non-biological structure. This collision of perspectives has been many times awkward—no one would argue that. The difficulty for autistic individuals has been many times onerous—that cannot be denied. But apologies are not required, and neither are the slanders; the melded results have been outrageously prodigious, or had you failed to notice? For it has been little more than a matter of poof … and now suddenly all of us, autistic and non-autistic alike, we have disappeared from the hunter-gatherer’s grassy plain.

Autistic perception is not an affliction—that dogma has badly missed the mark. The study of autism as mental illness—it has lent not one iota of understanding to this paradoxical condition. Autism itself is the key—it is the key to our physically-constructed intelligence. Autistic perspectives—they are what has prompted the environment’s massive structural change, there can be found the long sought-for genesis of our increasing cognitive skill.

And the positive outcome? It resolves into little more than expectation under this new paradigm of autism, for it resolves into simple confirmation that autistic influences have been very long among us. Before this species “discovered” mental illness, before it began squelching unique perspectives with medications and endless sessions, the vast majority of autistic individuals matured quietly and productively among us, etching their strangely patterned perceptions into mankind’s fast-transfiguring path. As each generation has found itself born into a set of surroundings embodying more and more of these patterned perceptions—and by necessity has been learning to navigate and to master—lo and behold humanity has found itself fast abandoning the strictures of evolutionary past. Autism is not a twentieth century emergence; it is not the product of industrial toxins, genetic defects or brain dysfunctions to be drugged and therapied away. Autism has been with us from the time of the great leap forward, and assuming we can learn to embrace this phenomenon instead of so hastily dismiss it, the benefits of autism will remain with us, and sustain us, for a considerable time to come.

But I know all this must sound so absurdly impossible, and really, I know no one is currently listening.

So for now I will leave the researchers to all their resonance machinery—I understand how giddy they must be with excitement and what a pretty penny someone has paid for the privilege, so I would not dream of distracting them from these momentary pursuits. But after all the picture taking is over, after all the statistical packages have been run, after all the same conclusions have been written to the same assumption-driven experiments, if perhaps a scientist or two should find themselves suddenly grown weary, perhaps a little discouraged, should they think maybe a brief respite or a change of scenery might do them a little good, I would be happy to guide them to the room one over, the one where their experimental subjects bide their time by playing on the floor—lining up toys, echoing the passages they hear and know, spinning and twirling. And I promise I will be gentle with my suggestion, in fact I will just barely whisper it into their ear, that perhaps this is what they have been looking for all along, right here before their eyes—these early warning signs of a dawning human intelligence.