Sunday, August 21, 2011

Deidre Barrett—Supernormal Stimuli: How Primal Urges Overran Their Original Purpose

“Our modern skulls house a stone age mind,” say psychologists Leda Cosmides and John Tooby, quoted in Supernormal Stimuli (27). The book's title comes from the work of Niko Tinbergen, the Dutch ethologist whose work on animals’ instinctive reactions to external stimuli won him the Nobel in 1973. Tinbergen discovered that much of animal behavior is cued to very particular stimuli—geese are so programmed to sit on eggs with certain markings that they will prefer a volleyball to their own eggs if its markings are enhanced (say, black with bright blue spots rather than grey with pale blue spots). The male stickleback fish is so enraged by the red color of other males’ chests that he will ignore actual male fish to attack a bright red ball, or even a passing red van outside the window.

These enhanced versions of evolutionarily developed triggers are, in Tinbergen’s words, “supernormal stimuli,” and Barrett’s book develops the thesis that we are similarly triggered by certain stimuli, evolved over millennia on the African savannah, and that we remain the victims of our stone-age reactions—particularly since we have used our modern skulls to devise a wide array of supernormal stimuli that end up being actually bad for us. The clearest example is our taste for fatty and sweet foods, which had an obvious logic for hunter-gatherers but becomes a severe liability in a world of McDonald’s cheeseburgers and chocolate shakes.

Less well known is the interplay between television and what Pavlov termed “the orienting response”—the instinct to pay rapt attention to any new aural or visual stimulus. In Barrett’s description: “The orienting person or animal turns eyes and ears in the direction of the stimulus and then freezes while parts of the brain associated with new learning become more active. Blood vessels to the brain dilate, those to the muscles constrict, the heart slows, and alpha waves are blocked. . . . The effect persists for four to six seconds after each stimulus.”

Television’s increasingly rapid-fire technique of quick cuts plays right into this, essentially paralyzing us on the sofa as our stone-age brains try to process the flood of supernormal stimuli. Eventually, however, the body slips into a lower state—hypnotized but no longer alert—and the metabolism actually drops below what it would be if one were simply lying in bed.

The concept of the supernormal stimulus explains a great deal of modern behavior: alcoholism and drug addiction, for instance, as well as our society’s tendency to treat as an addiction any compulsive behavior (gambling, promiscuity, overeating) that exceeds the rational mind’s ability to regulate it.

What Barrett doesn’t engage sufficiently, to my mind, is the role of both technology and corporate capitalism in the development of these supernormal stimuli, and while she urges us to begin to restrain and regulate the most destructive of these stimuli, she doesn’t touch the issue of what this means for the twin sacred cows of free speech and private enterprise—both of which presume a rational mind capable of regulating its stone-age instincts without exterior help. Already in ancient Athens, Aristotle pondered why individuals pursued self-destructive behavior, and that was long before there were billions to be made by marketing products scientifically designed to encourage that behavior. The Athenian vices of excessive wine and the occasional hetaera pale before crack cocaine and 24/7 online streaming pornography, or even the less obviously pernicious burger and fries and video gaming.

Wednesday, April 27, 2011

Twitter vs Facebook

According to a college newspaper I was just browsing, there are two types of students—those who use Facebook, and those who use Twitter—and “when the second type of student communicates with the first type, it’s like a teenager talking with their grandparents.” This suggests that I, who don’t use either medium, am probably somewhere close to death, at least as far as social networks go. Which is fine, really—being archaic is always relative, and blogging from beyond the grave may be one of the last frontiers left to us. I’m imagining a great B-movie about it, but the college editorial was no doubt right: the idea sounds dated already. Much better: To Sleep, Perchance to Tweet, or maybe The Undiscovered Keyboard, or even The Network from Whose Botnet No Twitterer Returns.

Thursday, March 31, 2011

Walker, Texas Ranger — NCIS

I think it’s important to have a TV show that you watch at least semi-regularly, and preferably, one that’s not very good. No high-end Mad Men, Sopranos, or The Wire here, but rather something so basic and primitive that it’s like walking out into the vast cultural wasteland of television, choosing an ordinary, mid-sized rock, and hitting yourself over the head with it.

No program was better in this regard than Walker, Texas Ranger, a show so elemental in its structure that it was virtually indistinguishable from the Lone Ranger reruns I used to watch as a child, and from which it was clearly descended. Walker ran a full hour, but there was still only one plotline per episode, with no subplots or side action to obscure the sharp edges of the story. There was even less moral ambiguity—the good guys were 100% good, the bad guys (insofar as you even saw them) were 100% bad, and the only moment of doubt was whether good was smart enough, quick enough, and strong enough to stop evil before it had made serious inroads into the community. Though, of course, it always was. With his face reflecting a weathered gravitas, Walker looked as if he had seen great evil, but that he could always be counted on to deliver it a roundhouse kick to the chest and march it off to jail. The Rangers fought evil with the implacable resolve of the Norse gods, yet without any hint of that troubling Ragnarök.

About a year ago, sick with the flu, I steeped myself in an NCIS rerun marathon and recognized it as the true inheritor of Walker’s legacy. Despite the larger and more eccentric cast, the show maintains the same simple moral clarity, and manages to do so even when it treads into much more suspect territory (like when everyone leaves Ziva, the Mossad operative, alone in a room with a suspect to extract information in ways unspecified, undocumented, and certainly illegal under American law—it’s extraordinary rendition in miniature). Whatever they’ve glimpsed of the dark side in the course of the investigation falls away, and by the end of the episode the gang is squabbling and teasing each other like tweens on their way home from a field trip. It is, perhaps, the most infantile example of its entire genre, and this combination of serious police procedural and ridiculous adolescent hijinks led me to think I’d really found an obscurity, something that remained stuck to the side of the TV barrel mostly because nothing had been developed yet to take its place. Recently, however, I read on the cover of a supermarket checkout magazine that NCIS is the most popular TV show in America, and this has totally ruined it for me. I’m now in search of another rock to hit myself over the head with.

Thursday, March 24, 2011

David Lewis-Williams & David Pearce—Inside the Neolithic Mind: Consciousness, Cosmos, & the Realm of the Gods

When Czech president Václav Havel spoke to Congress in 1990, his line about how “consciousness precedes being, and not the other way around, as the Marxists claim” garnered great applause from the assembled senators, though I expect almost no comprehension, since probably no more than a handful of them had the slightest grasp of the argument that rages from Hegel through Marx and onward about the true nature of the engine driving history.

I thought of this “which-comes-first-the-chicken-or-the-egg” debate again while reading Inside the Neolithic Mind. On the one hand, the authors seem to be clear materialists: they quote from Marx at several points (though they’re just as clearly not slavish Marxists), and they ground all their assumptions in the limited material record that comes down to us from the Neolithic, supplemented by anthropological and neurological studies from the present.

On the other hand, they stand the usual anthropological understanding of the Neolithic on its head. Most archeologists (I gather from their book) posit that humans built the enormous Neolithic monuments of Europe (Stonehenge, Newgrange, and others) only after they had already developed agriculture and begun settling down. Lewis-Williams and Pearce argue the opposite—that humans built these ritualistic holy sites first, and only as a result of that began to develop settlements and agriculture around them.

This would seem to be along the lines of Havel’s claim that our spiritual lives precede and dictate our material existences, except that Lewis-Williams and Pearce’s claim isn’t a spiritual one, but rather a material one. Humans constructed mound graves and stone circles out of an attempt to recreate and codify a neurological experience—the experience of the mind in altered consciousness. Whether through madness, fasting, sensory deprivation, drugs, or other extreme states, our brains tend toward hallucinations, and these bear a striking similarity across chronological and geographical distances. With a similar universality, we tend to grant these experiences a religious aura.

In short, we’re physically hard-wired for spiritual experience, almost as if we had a Religion-Acquisition Device paralleling Chomsky’s Language-Acquisition Device. Or, to put it another way, we’re chickens who imagine that we’re the egg.