It is not for idle purpose I shine light on the community of autism research journals. The attitudes backing practices of piled-on authorship, sycophantic peer review, citation back-scratching and editorial board nepotism are the very antithesis of science. (And I have yet to mention the harm being done to autistic individuals.)
Sunday, October 26, 2008
Human language has shot forth from two distinct roots, from two different sources of fundamental influence.
The first source has been the temporal, spatial and logistical pattern that constitutes the order of the surrounding, non-biological world. This influence is seen most clearly in language’s underlying structure: object and concept, noun and verb, temporal tenses, spatial adjectives, and all manner of nuanced prepositional form. This particular aspect of language did not arise from humanity’s biological and evolutionary past, but instead came from out of the struggles of an unusual interloper; it came from the autistic perceptions and cognitions that have gained a foothold within the human population. Autistic individuals, who by definition are not cognitively grounded by the usual species-aware perceptions, have created a cognitive grounding instead out of the patterns and symmetries to be found in the surrounding environment. But as biological creatures themselves, and needing to convey this unique form of perception both to themselves and to others, autistics have uncovered also the only biologically immediate means by which a non-biologically immediate perception can be represented—they have uncovered the essential accompaniment called language.
However, as has happened on many other occasions of autistic discovery and introduction, language was quickly adopted, transformed and widely spread by the more plentiful non-autistic members of the human population, and thus human language soon acquired a significant second root. This influence appears most noticeably in the core vocabularies of the world’s languages, which are dominated by words, phrases and metaphors derived out of humanity’s evolutionary and animal past, revealing that as the majority of humans were introduced to the possibilities of language, they quickly adapted its content (autistics might say they corrupted its content) to reflect those features of existence more natural and essential to them—the businesses of eating, excreting, tribalizing and procreating. Furthermore, the core content of human language is also that aspect most often accompanied by non-verbal cues and subtext, suggesting that not that long ago mankind conducted its survival and procreative business without the aid of abstract language, conducted it entirely through the medium of immediate gesture and inter-social behavior (much as the other animals do). Abstract language was the latecomer, and upon its introduction it was layered on top of an already extant form of biological communication, and both layers now play a crucial role in the immediate conversations of today. (And of course it is worth noting that it is the non-verbal, instinctively perceived aspect of human communication that autistic individuals have the most difficulty learning and mastering.)
In the early twenty-first century, human language remains a blend of its two sources of influence, and appears to resist most pulls towards becoming a homogenized mixture. The evolutionary, biological source of influence continues to hold prominence as the most frequently employed aspect of human language—from small talk to international diplomacy—and thus continues to serve its purpose of being the linguistic glue that helps hold the species together. But an examination of language’s accelerating number of changes and additions, especially those introduced over the last several hundred years, reveals how the autistic root of language has become increasingly more influential over time, threatening to regain once more what might be described as its birthright. The periodic table, representational painting, box scores, blueprints—one does not have to look far to recognize how the non-biological, non-evolutionary aspect of human language has been rapidly transforming the behavior of the species and reorganizing the manner in which it communicates. And consider the education of children—the majority of whom can pick up the core, biological aspects of language by the time they are five—but who require with each new generation more and more time, and a much greater variety of instructional technique, to absorb just a fraction of the new language and new communicative structure that has been added in recent years.
There is much that can be learned about the current status of our own humanity by teasing out the various structures and contents of the expanding forms of our human language, an analysis that becomes fruitful only after recognizing that human language did not spring from a single source alone.
Wednesday, October 22, 2008
Professor Simon Baron-Cohen and his colleagues have written about today’s “lost generation” of adults who are currently being diagnosed, long after their childhood has ended, with various forms of autism (usually Asperger Syndrome):
But what of the generation who were born before 1980, who may have had Asperger Syndrome but for whom there was no diagnosis available? No specialist clinical teams, not even the concept of Asperger Syndrome. How did they fare? The answer is that they were overlooked, and struggled through their school years. And the reason we run a clinic for the very late diagnosis of Asperger Syndrome is because these are the lost generation: those who today would receive their diagnosis by 6 or 8 years old, if they were a 21st century child. They come to our clinic in young adulthood or even middle age, and they tell us a now-familiar story.
All through their school years they had trouble making friends or fitting in. Many were bullied by the other children, both physically and verbally. Many felt, in Claire Sainsbury’s chilling words, like “an alien in the playground”. (This is the title of her excellent book). The lucky ones managed to stay in school long enough to get their SATs, and some got to university. But not without feeling their teens were an uphill struggle. By young adulthood many had suffered clinical depression and even felt suicidal. All because their underlying condition of Asperger Syndrome had gone unrecognized and therefore unsupported. Some of them had enjoyed the closeness of an intimate relationship only for this to break down. Some had found employment only for them to run into problems in the work place through not understanding what the employer and other staff might expect of them, or through getting into conflict, or being passed over for promotion because of their lack of team skills.
But only a moment’s reflection reveals something grossly amiss in Baron-Cohen’s description. The above paragraphs imply there has been only one lost generation of autistic adults—the generation born before 1980—but how can that possibly be? What about the previous generation, say the ones who were born in the 1940s and 1950s? Or those born in the early 1900s? Or what about those born in the 1800s, and even much earlier than that? Does Baron-Cohen mean to suggest autism descended en masse upon humanity sometime in the late twentieth century, and are we meant to understand there were no autistic adults living among us until a few began showing up outside the door of his clinic, desperate for relief (so he says) from the wretched circumstances he and his colleagues would have thrust upon them?
This generation of autistic adults Baron-Cohen is attempting to describe is only the latest in an impressively long line of autistic generations, reaching far back into humanity’s ancient past. That autism has gone unrecognized for so long suggests exactly the opposite of what Baron-Cohen is trying to say—it suggests there has been nothing lost, or even wretched, about any of these prior generations. Far from being lost, the current generation of autistic adults is indeed the first to be found, found living quietly and productively among us, as autistic generations always have. But if Baron-Cohen and his colleagues must insist on hunting for those who are supposedly gone astray, might I suggest they expand their search beyond their clinic door. For out in nearly every street can be found a generation of autistic adults much larger than even Baron-Cohen has managed to conceive, a generation doing so well it would never think to bother with his self-deluded clinic.
As I have written elsewhere, autism has had a significant presence within the human population for a very long time—a presence mostly silent, but not without consequence.
Sunday, October 19, 2008
Some recent studies have suggested the Flynn effect might soon be ending (at least in the Scandinavian countries), and several authors, including Professor Flynn himself, have latched onto this possibility as if it were a lifeline being tossed to a drowning man. Those who have regarded the Flynn effect as mostly a twentieth century phenomenon—an historical anomaly, as it were—are experiencing near-palpable relief at this hint the anomaly might soon be going away. Such a disappearance would end their fright, would end their gnawing fear that mankind is indeed growing ruthlessly more intelligent in a damn near inexplicable way.
But some of us are not so easily frightened. Those who have contemplated the entire history of mankind—from its animal-like existence not that long ago, through the sudden sprouting of complex civilizations beginning around six thousand years ago, to the frantically paced modern efforts to transform nearly every square inch of this entire planet (and those who have seen the imprint of the Flynn effect throughout that blazing history)—for us, any suggestion the Flynn effect might soon be ending (and coincidentally just now, right at the very moment of its discovery)—well, how are we to choke back our reaction without offending those who have become so terribly frightened? For really, such suggestions are little more than laughable.
To be sure, over the past fifty thousand years the Flynn effect has certainly gone through some surges and ebbs. To focus on Western Civilization alone, the era of Ancient Greece, along with its enduring aftermath, would undoubtedly have been one of those periods during which the Flynn effect ran at peak. Just one perusal through the physical constructions of that age—the buildings, the written mathematics, the crafted and portrayed arts—and one sees an embodiment of pattern and form running far beyond anything mankind had ever experienced before. That embodiment, suffused throughout the populace and down through the generations, showered its re-creating and foundational intelligence across the Roman era and well into the first millennium A.D. It was not until the stretch of the Dark and Middle Ages that we discern a slowing down of this ordered construction—and only then a slowing down of the Flynn effect—but not a complete halting, mind you: Western Civilization at 1500 A.D. was still more rapid and complex than Western Civilization at 500 A.D. And elsewhere of course—in Byzantium, India and China—we find yet more examples of the Flynn effect crescendoing into various bursts of sudden and local bloom.
Beginning with the Renaissance, the pace of this structural change embodied into the human environment resumed once more its rapid acceleration, and over the last five hundred years, and across all manner of civilization, man’s temporal, spatial, and non-biological abilities and patterned formulations have been increasing at a nearly mind-breaking speed. The I.Q. tests of the twentieth century have captured only the most recent period of this ongoing phenomenon; if there had been intelligence tests available in all the previous centuries, the Flynn effect could have been discovered long before now. Professor Flynn has not stumbled onto anything new. He has stumbled only onto the most recent residue of a process that has been profoundly reshaping the human landscape from the time of the great leap forward, and if the impressive Flynn effect statistics from the twentieth century are to be telling us anything at all, it is that from sheer momentum alone the Flynn effect can be expected to remain with us, and sustain us, for a considerable time to come.
But allow me to offer a moment’s respite to those who are so terribly frightened. Let us consider for speculation’s sake what might happen if indeed the Flynn effect has ended. What this implies of course is that in theory any one of us would be able to score equally well on any future-offered intelligence test (say those being sat for two hundred years hence) as we might tally on any of the currently offered standardized forms. This feat, we realize in retrospect, would not have been possible for those poor souls who lived in the early nineteenth century. Try imagining a man from the early 1800s nabbed suddenly by the scruff of his neck, hustled forward a couple hundred years or so, whisked by plane, taxi and elevator to a brightly lit, sharply cornered examination room, placed before a typed-out pamphlet of the strangest looking shapes and most oddly worded phrases—pen and stopwatch waiting impatiently there on the table beside him. What answers do we anticipate from such a man? What brilliance might we soon be expecting to hear, beyond, that is, his repeated stammering, “But dear sir, what exactly do you propose I do?”
Respite over. For honestly, how can we believe, frightened or not, that we ourselves would escape a similar fate? Try imagining yourself now, suddenly bolted forward towards the twenty-third century, hastened to an examination hall by powered means you cannot begin to describe, then suddenly strapped to a contraption of all manner of knobs and wires and switches (at least, those are the only words you can think of to describe such an oddly constructed panel), and now with flashes of three-dimensional light dancing all about you, with rapid questions poured upon you in a grammar you have never considered before, and all the while accompanied by frantic demands to respond with a quick jab of finger, a flicker of eyelid once or twice, or at least a simple grunt or two. And just about the time you manage to catch your breath, just about the time you gather enough wits to offer at least one feeble attempt, the lights of the examination hall suddenly darken, the rapid stream of questions comes to a jarring halt, and from out of the walls a stentorian, twenty-third century form of a tut-tut voice announces that your time is over and that your score has failed to register, at least on any significant range. Some intellect you turned out to be.
If the Flynn effect has ended, then so has the course of human progress. To embrace this absurdity would be to misperceive what the Flynn effect has been trying to tell us; it would be to misconceive the question, what is intelligence? There are no paradoxes to be explained away from increasing intelligence scores; there is only a befuddlement from brain science dogmas that have been turning our inquiries outside in. The Flynn effect compels us to remove intelligence from out of our heads and place it in the surroundings where it more rightly belongs, in the structured landscapes we humans have been building all around us, and will continue to build for a considerable time to come. A world increasingly more spatial, more temporal; an environment always more patterned, more frenzied—and what wonder can there be that we require newer generations to absorb each change afresh, and leave all the ancestors behind?
The Flynn effect has been shadowing the path of our human journey; it has been marking the pace of our considerable advancement, the one taking us all the way from savannah-bound primate to questing knight of a massive universe. Intelligence is not to be found in packets of scores alone: cast an eye wider, cast an eye across the twentieth century’s entire vista—from horse-drawn buggy to rocket in sky, from ground-hugging hovels to skyscrapers knifing air, from shovels and guns and axes to computers and networks and drones. Cast an eye across that entire scene, then say with conviction that the show is about to end. The sudden halt would jolt us right out of our skin; the end of the Flynn effect could only mean the death knell of all mankind. Allow me to save my fright for that possibility alone—for humanity’s darkest age indeed.
Wednesday, October 15, 2008
Recognition of non-biological pattern and symmetry is the basis of our notions of time and space, and this recognition is not biologically natural. Its first appearance on this planet occurred mere centuries ago, and there is no indication of such recognition in the cognition and behaviors of the other animal species, or in the cognition and behaviors of early humans.
Thus the word normalcy does not attach to temporal and spatial recognition. To witness a spontaneous, biological occurrence of such recognition, we would need to look more carefully at the cognition and behaviors of those children we have ironically labeled as disordered.
Sunday, October 12, 2008
If we are speaking of the age of the world as mankind currently perceives it—with all our science and mathematics and literature and art and engineering and astronomy and so on—then the creationists are certainly being far more accurate in their dating techniques than the evolutionists are. Six thousand years ago, this species had made considerable progress stepping in off the hunter-gatherer’s grassy plain, but still lived much nearer to that plain than to the current forms of modern civilization. In Mesopotamia, in Egypt, in India, in China, dazzling flowers were on the verge of bloom—transformations our scientists still have not characterized accurately, for they have overwhelmed the scientists’ descriptive means. To say that the world was created almost miraculously starting around six thousand years ago would not be pressing credulity to any appreciable degree—not if we are regarding that world through the gaze of our own eyes.
If the scientists can take any consolation from their miscalculation, it is that only they recognize flowers must have a context in which to bloom: to deny the steadfast background of the soil is no better than to suppress wonderment at the germination of the seed. The mistake of the creationists has been to ignore the very real context upon which this planet’s miraculous transformation took place, and the mistake of the scientists has been to remain dogmatically blind to the fact an anti-evolutionary transformation occurred at all.
Wednesday, October 8, 2008
The theme binding these posts together is an unrelenting scrutiny of autism’s common wisdom.
You might think autism does not have a common wisdom, given the multitude of causation theories: genetic defects, brain abnormalities, environmental toxins, refrigerator mothers, vaccines, various combinations of all the above. You might think autism does not have a common wisdom, given the plethora of treatments: applied behavioral analysis, litanies of drugs, endless varieties of therapy, sensory integrations, biomedical concoctions, eugenics. You might think autism does not have a common wisdom, seeing as how we are as in the dark today as we were at the moment of autism’s discovery.
But autism does have a common wisdom. There is one assumption, one unquestioned belief underlying all these diverse theories and treatments, all this lack of progress. The one certitude so widely held is that autism is an indication of something gone medically wrong—an illness, a defect, a disorder, a blight mankind would be much better off without. That is the first step that altogether escapes notice. That is the initial turn in the wrong direction, the one leading right over the edge of a cliff.
In each entry here, you will find yet another attempt to shine light on that common wisdom—probing, exposing, holding the assumption up to the scrutiny of actual vision, holding the assumption up to the current status of our own humanity. With autism, we have talked ourselves into a pseudo-illness, and what a shame that must be for all, and what a lost opportunity for self-understanding; for under the spell of common wisdom, we have failed to recognize how the current status of our own humanity has been built upon the back of that pseudo-illness.
I will be saying this again and again: autism’s common wisdom has been blind from its very beginning, and we will gain no understanding of this condition—and no understanding of ourselves—until we step back from the darkness of that abyss.
Sunday, October 5, 2008
Language is the use of a biologically immediate artifact to represent something that is not biologically immediate.
All organisms, including humans, exist only within biological immediacy. That which is removed in space, that which is remote in the past or future, that which is not graspable through innate biological capacity—such events cannot be directly engaged or experienced by any organism. Every sensation, every urge, every action must transpire in the biological here and now, and there is no alternative—that is the essence of biological experience. And indeed, until quite recently on this planet, that has been the limit of biological experience.
Although an organism has no means to step outside of its biological immediacy, humans have demonstrated that organisms can use the material of biological immediacy to represent something else, to represent that which is not biologically immediate. It is in this way that language serves as a bridge to previously unknown and always unreachable realms: using language, an organism remains within the required confines of its biological immediacy while using something inside that biological immediacy—the material of language—to represent something beyond its biological immediacy.
The material of language can be almost anything—humans began with abstract gestures and vocalizations, and have recently adopted materials that can be touched and more widely seen or heard. What is transformational and important about language is not the material itself but the elements and structure being represented—events of time, space and other non-biological conceptualizations. Although an organism can never directly engage such elements, it can use its belief in the value and accuracy of the language representing such elements to change the course of its own biological immediacy. And thus it is that humans have in large part separated themselves from their evolutionary animal past.
If one wants to understand the origins and structure of language, one does not focus on the material of language, since that material itself is almost completely arbitrary. (And one goes even further afield to focus on mentalizations or brain processes, since mentalizations are little more than the re-creation of the arbitrary materials of language.) If one wants to understand the origins and structure of language, one focuses instead on the non-biological elements being represented, and wonders about their sudden recognition and their peculiar form. After all, this planet passed more than four billion years without any species ever considering time, space or any other non-biologically immediate concept. And humans too, they passed hundreds of thousands of years completely oblivious to anything outside their biological immediacy.
Keep in mind: that which is far removed from biological immediacy is also far removed from normal biological perception.