Posts Tagged ‘Ernest Becker’

Run for Your Life

April 24, 2013
"k1f" Kirby Farrell

“k1f” Kirby Farrell

Understanding the fascination of the marathon terror

Why do terrorism and rampage killing exert such hypnotic fascination?

Attacks on schoolchildren and marathon fans arouse “horror.” But the actual carnage (5 deaths, many injuries) is eclipsed by Massachusetts’ 334 routine auto deaths (California and ten-gallon Texas rack up an annual 3,000+ each). The news doesn’t even try to report them all, and they certainly don’t hold viewers spellbound for hours of monotonous, inconclusive TV reporting.

Some answers are in plain sight. Terrorism is spectacle. The attacks are designed to catch maximum attention. Many rampage murders are copycat crimes. The killers are in theory “berserk” and out of control, yet often evidence shows they’re aware of, and competing with, previous sensations.[1] This was true of the Columbine killers, and even Adam Lanza, the Newtown killer, compiled data on past rampage slaughter as you might collect baseball statistics. Terrorists deliberately design displays of spectacular death. How excited bin Laden must have been to know that on US TV “his” airliners were still regularly exploding into the Twin Towers weeks and months after the event. For the terrorist, that film clip powered an ad campaign that changed mental landscapes like the famous McDonald’s jingles that toddlers recite by heart.

In this respect the Tsarnaev brothers were drawn into the creative mania that brought Barnum and Bailey to your town, or planned the Roman amphitheater’s gala gladiatorial combats, criminal executions, battle reenactments, and “hunts” that slaughtered hundreds of wild animals. We regard those Roman crowds as bloodthirsty primitives, yet terrorist spectacle appeals to the same motives. There’s pity and terror galore, spiced by the insinuation that the danger in front of your eyes is unique, unprecedented, possibly out of control. It could happen to me. The violence challenges police, insurance, doctors, criminal justice: all the technologies that we use to protect ourselves from the reality that we too can—in fact definitely will someday—die. In our imaginative participation, the story onscreen is about “trying out” death or playing dead and being rescued back to life. In the safety of your living room the events onscreen are vicarious and only half-real. But the emotions are doing real work in reframing what’s possible in life and reassuring you that you’re safe.

As the drama onscreen unfolds, our daily grind gives way to vicarious shock at the possibility that we too can die in a freakish moment. But the coverage is staged to allow us to participate vicariously in  heroic rescue from death by hunting down the criminal. The hope is that “one of the biggest manhunts in American history” gives our lives memorable significance. State Rep. John Scibak claimed that the attack “was a different act of terrorism than we’ve seen before,” and “literally and legitimately” shut down Boston. Note that “legitimately”: the word tells us that emergency reactions can be routine false alarms.

But the spellbound spectators aren’t “shut down” at all. “We” form a vicarious posse invited to contribute information, but also caught up in the hunt for an enemy who is now our quarry. Pictures of armed paramilitary police fill every screen. “Suspects” come into focus; overnight they’re pursued into a firefight, killing one enemy and one cop and wounding another officer. There’s vindictive satisfaction in this hunt. [2] We want to believe that death and terror make sense: that an evil someone has violated the sense of “what is right” on which we’ve grounded our personalities since infancy. The killers’ outraged uncle tells the media that his nephew “deserved to die.” Vengeance promises to restore “what is right” even as death has injured it. It’s how we’re built. We demand, and get, “reasons” to reinforce our account of reality.

Finally “we” have the surviving criminal holed up in a boat, wounded, and then in custody. The news reports that he was a sociable college kid, perhaps under the influence of a more hostile older brother. He was also apparently flunking out of school, so the fear of failure and social death may have been a cause of, and/or result of, the lure of terrorist ambitions. All told, a pitiful as well as cruel character.

Media carry our curiosity about criminal motives to Chechnya and, gingerly, to the “war on terror.” Op-eds suggest that the torment of Chechen refugees may have resonated with the Tsarnaev brothers. Whatever the motives, we have on our hands now a pathetic kid whose life is over, even if we don’t kill him. He enters the poisonous cloud of self-justification and loathing that surrounds infamy.

So the logic of maiming and death ripples outward into the absurd mysteries of denial. In one direction the absurdity of the Marathon as hunt connects to the Euro-American quest for access to the world’s oil riches implicated in so much 20th century bloodshed and still raising havoc in the Middle East and the Caucasus. About the time the young Tsarnaev was born in Kazakhstan, I was there doing some work for the Peace Corps and overhearing US officials discussing Kazakh sour crude oil. It was just the beginning.

A recent investigative commission reminds us that the Bush-Cheney “war on terror” not only terrorized Iraq on false pretenses, but also took us into the criminal practice of torture. We know that for many in those anguished regions of the world, Americans have a reputation for viciousness, even if we don’t yet know how that reputation may have affected the Tsarnaev brothers.[3]

But there’s another dimension to this absurdity. Pathetic young men ambitious for heroic self-esteem place ridiculous pressure-cooker bombs at the finish line of a race built around the human need to test the ability of our bodies to outrun death. Paleontologists speculate that early humans found an evolutionary advantage in walking upright and in the ability to run long distances as effective hunters. Warrior-hunters have long identified with jaguars and tigers. In this context, the marathon is a survival contest, and wittingly or not, the terrorists were joining in, and exploiting, this atavistic hunt.

Step back from the hypnotic anxiety of exploding pressure-cookers, and you begin to sense the pathetic inadequacy of our public chatter to account for the moral daze and tragic suffering of criminals and their blind victims joined in a moment of contemptible folly that cries out for clarity and compassion.

1. For an account of this mentality, have a look at my Berserk Style in American Culture (2011).

2. In Escape from Evil, Ernest Becker argues that human violence springs from our uniquely human death-anxiety. The Ernest Becker Foundation’s website assembles indispensable resources for thinking about violence: www.ernestbecker.org

3. http://readersupportednews.org/opinion2/357-europe/17036-chechen-terrorists-and-the-neocons

Rise of the Planet of the Apps

March 2, 2013
"Svaardvaard" Bill Bornschein

“Svaardvaard” Bill Bornschein

Our local paper recently featured a story about our zoo’s cutting-edge program that gives iPads to our orangutans, Teak, Bella, Segundo, and Amber. The stated purpose is to provide mental stimulation that helps prevent boredom and depression. Without the slightest sense of irony, the zoo officials go on to explain that “Freedom of choice is critical to their well-being.” Good grief, of course they are bored and depressed. They are prisoners. If the doors were opened I doubt they’d stick around to play with the iPads. A friend suggested that for amusement the orangutans be shown Rise of the Planet of the Apes. Or was that Planet of the Apps?

As I reflected on this my mind turned to my own students, iPads in hand as they chart a brave new world of education. As with the apes, many students have regarded their formal education as a form of imprisonment. No less an authority than Professor A. Cooper gave expression to the general sentiment in his seminal work School’s Out For Summer.  Indeed, like prison, the reward of graduation is to finally “get out.” I suspect this may be as true for doctoral candidates as for high schoolers. Of course, the iPad is being promoted as a form of engagement, which it certainly is. But to what degree is it liberating? We have probably all read critiques of our cyber-age and the anomie it engenders. Are we happier, more fulfilled, living better lives than before? Is cyber community a complement to or a substitute for earlier forms of community?

Framing this in a Beckerian perspective, do the iPads reflect a genuflection at the altar of our new god Technos, who has stepped into the void created by the demise of standard-brand religious and cultural myths? Facing myriad problems, insofar as we place faith in anything, we seem to place our faith in technology, even for our orangutans. One of Becker’s most compelling images is that of the philistine, which he borrowed from Kierkegaard. The modern philistine loses himself in the minutiae of the daily routine and thereby represses existential anxiety. Does our technology allow us to maintain a distracted existence? Watching my students jump around the internet on their down time, that seems to be the case. I’m always mentally comparing these students to their predecessors in terms of their general awareness of the world and the issues we face. I should hasten to add that I am not talking down today’s kids. They are simply what I see before me, and I suspect the same distracted consciousness exists across generations.

I recently read The Lost Art of Reading: Why Books Matter In A Distracted Time by David Ulin. He makes the case for reading as a type of depth experience that leads to both a clearer understanding of self and at the same time a sort of negation of self as we turn the reins of our consciousness over to another. It is this sort of dialectical inner conversation between the author and the reader that produces the consciousness of an avid reader, at once adventurous and discriminating. He goes on to analyze the availability of such an experience on iPads, Kindles, and the like. I return here to the freedom of choice offered to the orangutans on their iPads. What is the difference between amusement and insight, between accessing data and entering the mind of another? Like the orangutans, we are prisoners of a sort, confined by our mortal condition. Unlike the orangutan, our boredom and depression—and I would add here our self-conscious anxiety—are unlikely to be assuaged by the amusement the iPad provides. No, we are a different kind of prisoner and require something deeper. This may seem completely obvious, but I see people losing themselves in the surface components of technology as surely as Kierkegaard observed the philistines of his day. As we chart a course forward, how will we mediate technological change so as to live rich and productive lives? For me, it will entail a self-conscious effort to continue in-depth reading as well as pursuing in-depth conversations that involve eye contact. Without such a self-conscious effort we may find ourselves more and more like the orangutans, prisoners of a lower order. Okay, enough of that. It’s time to check Facebook.

Flu Shot Resistance – A Symptom of Death Denial?

February 14, 2013
TDF Guest Scott Murray

Cornell psychologist Thomas Gilovich has made a career out of studying the cognitive processes that sustain dubious beliefs. There is something about flu season that makes me wish I could combine his work with that of Becker and enlist their aid in a cultural intervention. It seems there is a robust resistance to influenza vaccination despite the potential risks of the flu – a subject rife with extensions into Gilovich’s work. If the flu is so dangerous, what dubious beliefs inspire resistance to vaccination? And a glance at Beckerian thought might raise the question of why the fear of death wouldn’t drive more people to get vaccinated. But a deeper understanding of what death denial actually entails helps inform us why many resist the shot.

We’ve had rather mild flu seasons the last year or so (http://news.yahoo.com/why-years-flu-season-bad-173457272.html), but as statistics readily predict, occasionally influenza strains are more potent and many cases of serious flu infection can occur. The cost in dollars – and lives – can be quite serious. So why do some resist influenza vaccination?

Certainly the media, with its drive to fill air time and sell stories that attract attention, is part of the problem. Flu hype sends millions scrambling for a vaccination each year – but the media is known for its hyperbole, slanted use of facts, and even fear-mongering. It’s no wonder, then, that many cautious, critical individuals would see reports of this year’s raging epidemic and dismiss the need for a flu shot. But as Gilovich is insistent in pointing out, we tend to be most critical of that information which does not agree with the beliefs we already possess. As he puts it: “when the initial evidence supports our preferences, we are generally satisfied and terminate our search; when the initial evidence is hostile … we often dig deeper.” (82) In other words, those who suspect media hype and doubt the need for a flu shot may already believe that a flu shot is unnecessary, and be looking for reasons to support this belief.

Is a distrust of the media, with their tendency to overstate or distort facts, sufficient to explain dismissal of flu season severity? Becker might disagree. Cognitive biases are in one respect no different than elbows and thumbs: they’ve evolved to serve a purpose, and where Gilovich leaves off, Becker nicely steps in. Gilovich is a laboratory psychologist, a statistician; he resists the temptation to speculate too grandly on why the human brain comes hard-wired to accept information that supports existing beliefs while remaining doubtful of information that does not. Becker’s focus on death anxiety provides a compelling furtherance of this line of questioning.

Disease and death are deeply connected concepts. Disease is a limiting of life and brings the carefully repressed reality of our mortality painfully to the foreground of our thoughts. It’s no surprise, then, to discover a deep desire to believe that the flu shot is essentially needless. The CDC, in addressing the most prevalent myths that prevent influenza vaccination, has to deal with the ‘argument’ that influenza is only serious for those who are very young, elderly, pregnant or otherwise already ill. Your average healthy person can deal with influenza using natural means – namely, the immune system (http://www.cnn.com/2013/01/11/health/flu-shot-questions/index.html).

The challenge with this argument is that it isn’t untrue; in fact, the majority of people who come down with influenza nowadays – certainly in the more affluent Western world – survive relatively unscathed. But the problem with the argument is that it isn’t actually an argument against vaccination at all. Just because you can survive the flu does not mean you shouldn’t avoid it, if not for yourself then for the basic public health service of not acting as a transmitter of the virus to someone who is more at risk of hospitalization or even death due to influenza. So the question persists: if vaccination serves yourself and the community, why resist it?

There are other studies being conducted on what is termed ‘naturalness bias’ (like this one at Rutgers: http://mdm.sagepub.com/content/28/4/532.short). The result of the study is essentially that some people make bad decisions because of a cognitive bias towards means that feel more ‘natural’ – in other words, if you tell them a kind of tea is a great counteractive agent to influenza, they will drink it, but if you suggest that they get vaccinated, they will find all kinds of reasons to dispute you. Tea is from nature; vaccines are a mysterious, dubious concoction brought to you by the same science that supported cocaine as medicine and DDT as pesticide.

So all kinds of reasons exist to doubt the flu vaccine. What’s in the flu shot, anyway? Many believe that the flu shot can actually get you sick, which is itself a medical misunderstanding involving a combination of known statistical nemeses. For starters, a large number of influenza strains and influenza-like viruses exist, and the vaccine can only protect you against so many. Some argue that this makes the vaccine worthless, when in fact it means that the vaccine is only as good as the severity of the strains it actually protects against. Those concocting the vaccine work hard to ensure it protects against the most severe strains predicted to be active in any given season. Nevertheless, given a large enough sample it is statistically guaranteed that some who are vaccinated will become sick soon after, either because the vaccination didn’t come in time or because those unlucky few caught something the vaccination could not defend against. To these few (a minority, particularly if you establish a clear window of time after vaccination in which sickness counts as a possible response to vaccination), it is not surprising that they feel cheated and suspect that they’ve been had – not only by flu hype, but by flu shot hype.
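To make that base-rate point concrete, here is a minimal back-of-the-envelope sketch in Python. The population size and the weekly background rate of flu-like illness are invented for illustration (they are not CDC figures); the point is only that, in a large vaccinated group, coincidence alone produces plenty of “I got sick right after my shot” stories.

```python
import random

# Illustrative sketch with made-up numbers: even if the flu shot itself causes
# no illness at all, ordinary background rates of flu-like sickness guarantee
# that some freshly vaccinated people will feel sick within days of the shot.

random.seed(0)

vaccinated = 1_000_000           # hypothetical number of people vaccinated this season
weekly_background_rate = 0.01    # assumed chance of catching any flu-like bug in a given week

expected = vaccinated * weekly_background_rate
simulated = sum(1 for _ in range(vaccinated) if random.random() < weekly_background_rate)

print(f"Expected coincidental 'sick after the shot' cases: about {expected:,.0f}")
print(f"One simulated season: {simulated:,}")
```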

The other factor in play has to do with anecdotal evidence and word-of-mouth narrative; it’s a simple tale of how everyone knows someone (who knows someone?) who has ‘gotten sick from a flu shot.’ The CDC is very candid about why this can seem to happen, for anyone who cares to actually read up on it (http://www.cdc.gov/flu/about/qa/misconceptions.htm).

In light of the cognitive factors that influence these behaviors (influenza vaccination denial and the anti-hype hype), is there a motivational determinant that can be identified? Becker’s theories outline what Gilovich’s work suggests: something at work deeper than cognitive tics, deeper than the prevalence of poor science in pop culture. Isn’t it safer for the psyche to believe that disease is less of a threat? That the body has what it needs to defend itself against death? Don’t flu severity denial – and the cognitive factors that help sustain it – work in the service of protecting the embodied self against the reality of its own fragility?

Sources:

Gilovich, Thomas. How We Know What Isn’t So: The Fallibility of Human Reason in Everyday Life. New York: Free Press, 1991.

Who Needs Humanities?

February 3, 2013
"Leucocephalus" Phil Hansten

“Leucocephalus” Phil Hansten

It is the debate that never dies. With finite (and often dwindling) resources for university and K-12 education, legislators and educators make cuts. Sadly, we invariably round up the usual suspects and haul them off to the gallows: music, art, theater, philosophy, literature, and languages, both ancient and modern. The executioners mean well, but they ignore a basic truth: science and technology without the humanities (i.e., knowledge without wisdom) is a recipe for societal disaster.

The controversy is not new. Over a century ago Ambrose Bierce (1842-1914?) wrote a marvelous essay entitled, “An Historical Monograph Written in 4930.” (free on iBooks, etc.) Bierce is most famous for his acerbic but hilarious book “The Devil’s Dictionary” but in this essay he pretends to be writing in the year 4930 about the demise of “ancient America” which he says fell apart in the 1990s. He may have been a few decades early on the timing, but his analysis of the etiology of America’s final decline is stunningly prescient. Bierce cites two primary causes:

The Politics of Greed. Bierce identifies selfishness as a fundamental motive of human behavior, and describes how it destroyed American politics: “Politics, which may have had something of the contest of principles, becomes a struggle of interests, and its methods are frankly serviceable to personal and class advantage.” This is precisely what is happening in America today, in which the plutocracy is in control, and our political system has degenerated into a dysfunctional charade. There is little nuance… little insightful analysis… little genuine concern for humanity… just predatory self-interest punctuated by paroxysms of self-righteous demagoguery and rhinocerine obstinacy.

Prosperity at All Costs. Bierce then turns to the related issue of our unflagging focus on wealth and prosperity as our raison d’être. In a magnificent passage, he describes what “ancient America” ultimately sacrificed to achieve its mercenary goals:

“It is not to be denied that this unfortunate people was at one time singularly prosperous, in so far as national wealth is a measure and proof of prosperity. Among nations it was the richest nation. But at how great a sacrifice of better things was its wealth obtained! By the neglect of all education except that crude, elementary sort which fits men for the coarse delights of business and affairs but confers no capacity of rational enjoyment; by exalting the worth of wealth and making it the test and touchstone of merit; by ignoring art, scorning literature and despising science, except as these might contribute to the glutting of the purse … by pitilessly crushing out of their natures every sentiment and aspiration unconnected with accumulation of property, these civilized savages and commercial barbarians attained their sordid end.”

This remarkable prophecy so accurately describes our current sorry state that it is almost as though Bierce were a time traveler who observed the America of today, and then traveled back a century to write about it. Bierce was basically arguing that, in order to remain civilized, societies need the humanities as a counterweight to the hegemony of technology. I think he was correct.

This balance is important for individuals as well as societies. One need go no further than Ernest Becker to see what happens when a person with a first-rate mind has a deep, organic understanding of both the sciences and the humanities. If Becker had been “just” a talented anthropologist, or if he had been “just” extremely well read in philosophy, literature, religion, and sociology… he would now be well on his way to intellectual oblivion. Instead, his remarkable synthesis still resonates for us in the 21st century.

Now, one must admit that science and technology have provided humanity with substantial benefits in medicine, agriculture, engineering, and many other fields. So it is not an “either or” proposition; it is not that we need humanities instead of technology. The problem is that we have become seriously out of balance; our power over nature is out of all proportion to our humane tendencies and moral sensibilities. Neil Postman of New York University clearly perceived the Janus-faced nature of technology when he said, “Reason, when unaided and untempered by poetic insight and humane feeling, turns ugly and dangerous.”

So, will preserving the music program at your local high school save humanity? Not likely. Nonetheless, if a critical mass of people around the world recognized that our single-minded worship of technology is on course to destroy civilization, we might be able to clear a bit of room for the humanizing and tempering influences of the humanities. I admit that the truth of our situation is unsettling, but as Emerson said, “God offers to every mind a choice between truth and repose. Take which you please—you can never have both.”

AT&T and the Yang Complex

January 10, 2013
"Svaardvaard" Bill Bornschein

“Svaardvaard” Bill Bornschein

Have you noticed the recent AT&T commercials that feature an adult asking leading questions to children, questions that have “obvious” correct answers? Which is better, big or small? Which is better, fast or slow? These questions serve the purpose of the ad  but also reveal a disturbing aspect of our culture. The immediacy of the children’s answers and the “no kidding, duh” tenor of the commercial reveal a pretty unreflective public, at least in the view of the advertisement’s creators. If we pause for a second to apply the same questions to other subjects  such as cancer or melting glaciers, we get some different answers. Which cancer is better, big or small? What rate of glacial melt is preferable, fast or slow?

Besides revealing a dim view of the public, the ad’s popularity confirms the “truthiness” of that perception. In other words, the knee-jerk response critical for its effectiveness bespeaks our real world perception, our socially constructed reality. This is what I refer to as our “yang complex,” our western preference for the yang elements in the yin/yang dichotomy of Taoism. Yang roughly translates as “the sunny side,” and I am reminded of the American standard Keep on the Sunny Side. Yang qualities include fast, hard, solid, focused, hot, dry, and aggressive. It is associated with the male gender, the sun, sky, fire, and daytime. Yin roughly translates as “the shadow side.” Yin qualities include slow, soft, yielding, diffuse, cold, wet, and passive. It is associated with the female gender, the moon, earth, water, and nighttime.

The yang complex AT&T plays off of reminds me of the critique that  Ernest Becker offers of Norman O. Brown’s unrepressed man, the archetype for many similar New Age visions. AT&T’s approach puts us on the threshold of a very Beckerian question: “Which is better, repression or unrepression?” Rejecting this false dichotomy, Becker maintains that repression is necessary and it is the way in which the repression is managed that is crucial. As it stands, repression is a dirty word in our popular culture. Indeed, unrestraint is the premise of modern consumer culture. The yin qualities we need for balance are present but muted. I maintain that our cultural yang complex puts us in a dangerous position. Bigger, faster, and more more more have become the watchwords for a growth curve that is clearly unsustainable.

Yet, the lack of a yin perspective in our political discourse reveals the depth of the culture of yang. We appear unable to envision anything new, still opting for the obvious answers, just like the kids in the AT&T commercial. We need a new mythology, a new story that admits the insights of Becker, Rank, and Kierkegaard where the answers are not so obvious.

We Give Birth Astride a Grave

November 25, 2012

TDF Guest Kim Pereira

I’ve been fortunate never to have experienced the crippling effects of chronic depression, although I’ve had my share of despair where the world seemed to spin out of control and even sleep, if it could be had, offered no respite. Like so many others, I have known late nights sprawled on a couch, clicking the hours away, remote in hand, finding nothing worth watching, relying on endless checking of meaningless Facebook updates to authenticate the feeling that I’m not alone in this spiraling vortex. Are we that obsessed with distracting ourselves? And if so, from what, I wonder? Being alone? Not alone by ourselves, though that too, but alone with our thoughts. Do we fill our days with sidetracking paraphernalia to avoid the agony of introspection which would inevitably lead to the central question of our lives—our mortality? The consuming question is this: are our lives being driven by the specter of death? Is that behind our attempts to fill every moment with something, anything, just to avoid confronting the only truth of which we are certain? Are we trying to divert ourselves from ourselves? What fears lurk beneath the urgency to keep busy, rushing through manufactured tasks and mundane events, going to church, jamming trivia into our days as though they were somehow meaningful, bestowing significance on them by virtue of our attention, deluding ourselves by clinging to an impuissant work ethic, a hand-me-down from past generations that prized it above all else? We save very little time for vacations, have the least number of public holidays of any country, and when we do take a break we’re lucky if we find time to enjoy a sunset!

When I first read Samuel Beckett’s anguished cry (the title of this essay) from his seminal work, Waiting for Godot, the full implications of it didn’t really strike me, although it vaguely echoed Thoreau’s “lives of quiet desperation.” Since then my fascination with the so-called Theatre of the Absurd has offered insights into the world as I traveled across the globe, making more sense of what I saw than I cared to admit. Looking through literature I found those sentiments reflected in other works—the sacrifices of “scapegoated” heroic figures from Shakespearean and Greek tragedies seek transcendence from the desolation of the human condition; comedic structures attempt to reorder societies by changing the rhythms of the old world to fit the melodies of youth, exchanging a past for a new future that will in time become replete with similar characteristics that made the past so unpalatable! The tortuous cycle continues.

These depictions offer only glimpses of hope as they strain against overwhelming realities. King Lear ends in a world as hopeless and barren as can be imagined, prompting Beckett to use the king and his fool metaphorically in Endgame, that entropic image of a world we have destroyed. For all his eternal optimism, Dickens’ brilliance lies in his excoriating portrayals of industrial London where little orphans were swallowed by rapacious thugs lurking in alleys and the trammels of England’s Courts of Chancery ensnared families for decades! Any journey through his pages cannot ignore the glowering pessimism of the nineteenth century and the crushing weight of factories and machines leading to disenchantment and inertia—Chekhov’s characters are unable to rise from their indolence and go to Moscow, preferring to yearn for what will never come, for longing is their raison d’etre, replacing vibrancy with torpor, banishing the cherry orchard to the lumber yard as commercialism and utility sweep away beauty and grace, false notions anyway, having been acquired on the backs of serfs and the underclass!

The machines have changed, but the effects persist—digital technology, once hailed with the same enthusiasm as the industrial machinery of the nineteenth century, is viewed with growing suspicion as intrusive, leading to questions of privacy and loss of identity, reducing us to disembodied voices on answering machines or Twitter feeds. The ability to stay in contact with a swath of people robs us of the desire to do so; if it’s always available it loses its urgency, without which we drift into isolation, cocooned by the minutiae of every day, paralyzed by the burden of trying to authenticate our existence.

We don’t purchase products any more—they buy us!! Walking through shopping aisles we are attacked by displays that scream for attention; row upon shiny row leer as we walk past, similar to walking through a jail cell corridor showered by abuse from inmates on either side. Under relentless pressure we succumb, reach out and grab one; maybe we even read the specifications on the container to delude ourselves into believing we made a wise choice. But the selection is quite random, for the plethora of available products deadens our ability to choose. With our identities eviscerated by years of slavery to the gods of marketing strategies, we don’t know who we are, much less what we really want. We are now on sale, possessed by our possessions—clothes, household products, food, holiday resorts, TV channels, movies, books, and everything else; we have to take what we are given. They choose us!

There’s a moment in Ionesco’s Bald Soprano where the Smiths are discussing Bobby Watson’s death, reported in the newspaper; as the conversation proceeds we realize every single one of Bobby Watson’s relatives is named Bobby Watson. Of course, the amusing irony is that a couple named Smith is making this discovery. Keeping up with the Joneses (or Smiths) has turned us into Joneses, with the same houses, same clothes, and the same status updates. Look through the myriad photographs on Facebook and after a while the pictures of kids, families, and vacations devolve into endless cycles of sameness. We are all Bobby Watsons, interchangeable and alike, indistinguishable from one another, with nothing meaningful to say. Our political and social parties and FB profiles are vain attempts to distinguish us, but very little about them is unique.

Towards the end of The Bald Soprano the dialogue descends into gibberish, culminating in a show of aggression. Failure to communicate leaves violence as the only option, for we desperately need something to remind us that we’re alive, that we still live in a social world, but all we have are anger, violence, and the instinct to survive—witness global skirmishes and full-fledged wars, ethnic cleansing, government-sponsored torture, terrorism, street violence, political discourse and TV discussions that are shouting matches, mass murders in villages, movie theatres, and even temples! The jungle paths to destruction have always been available, obscured sometimes by the underbrush of forced civility, but easily accessible when survival is at stake.

This is the Age of Saturn, a sullen, scowling time where malevolence permeates the republic and millions of bloggers can pen their unfiltered thoughts; where politicians lie with impunity and truth is lost among thousands of commercials; where we cling to worthless promises because we’re desperate to believe someone cares about us; a time of distrust, skepticism, and fear! Music now finds its greatest audience only through competitions and the hushed loveliness of verse has descended into poetry slams. A hundred years ago Expressionists rebelled against the dehumanizing effects of an industrial age, distorting reality to give vent to passions and feelings, releasing their creative streams unencumbered by constraints of logic or order; they themselves were victims of an emotional angst that settled upon Europe before exploding in a massive conflagration across the continent. It almost seems like we have come full circle ten decades later. In the new reality of this century, language and relationships have been compressed and distorted—140 characters (as random as the work of Dadaists) are enough to say what we feel and the social networks of cyberspace have supplanted front porches, parish halls, and even playgrounds. We can now count the number of “friends” we have right there in the left column and on the right we know exactly what we “like.” And still we are alone. Waiting…

Earlier I alluded to the fear of death being the driving force behind much of what we do, suggesting that much of it is merely distraction designed to keep us from contemplating the void. Perhaps the most successful thing is to make it through each day. When one considers the absolute haphazardness of life (people dying accidentally or being stricken with fatal diseases—who among us doesn’t know someone we love in this situation?) it’s a small miracle we are alive at any moment! Despite the winding down of the world in Endgame they are left with the possibility of tomorrow; Godot never comes but Everyman still waits on the empty road; we persist. The original French title of the play is En Attendant Godot, WHILE waiting for Godot! Like Beckett’s tramps we find ways to amuse ourselves while waiting for the end—we play games, make art, have sex, get drunk, persevere in our jobs, convince ourselves that death is not tomorrow, and do a million things that slap our faces to wake us to the fact that we’re alive.

Dr. Kim Pereira is Professor of Theater and Director of the Honors Program at Illinois State University.

Longevity

October 27, 2012

“The Single Hound” Bruce Floyd

This article fails to understand one of the basic tenets of Becker: increasing the longevity of life will do nothing to ameliorate the terror of the human predicament; in fact, it will exacerbate it, take anxiety to new heights, paralyze the will, make an early death even more “absurd” than it is now. This clamor, all this hue and cry, for the extension of life confirms Becker’s insights. The length of life means nothing to the self-conscious creature. All it knows is that he or she will die. What difference does it make whether it’s seventy years or one hundred and fifty?

The problems that plague humanity, the existential ones, are not to be solved by assuring people they will live longer. No, what will console the mortal is some way of accepting the limitations of life, of coming to terms with life and its limitations. We must give in to, even embrace, our fate. It’s hard to do. Of course it is, and that’s why we swim in an ocean of illusions.

The coming age of longevity will not change everything; it will just make the time-immemorial paradox more acute and baffling.

Peak Complexity, Peak Oil, Peak Terror

October 17, 2012

“Svaardvaard” Bill Bornschein

I was fortunate to hear University of Washington professor Phillip Hansten’s talk at the recent EBF Fall Conference. His topic was what he refers to as “premature factulation,” which he defines as “the process of coming to conclusions without adequate study or contemplation; usually applied to complex concepts or situations.” The diagram below represents the basic dynamic.

[Diagram: Premature Factulation]

Hansten’s book Premature Factulation provides many examples of this process as well as prescriptions for dealing more successfully with complex issues. Peppered with quotes from great thinkers across the ages, this book has the synthetic feel of Becker’s work and I recommend it.

As I looked at the Hansten diagram, M. King Hubbert’s famous oil depletion curve appeared in my mind’s eye. Hubbert’s Peak, as it is called, holds that the rate at which oil can be extracted follows a bell curve and, further, that such a peak has already been reached. Indeed, global production of oil reached its apex in 2005. The economic downturn that followed shortly thereafter reflects our inability to project future growth based on cheap energy. Economic contraction is the new reality, regardless of which political party is victorious or what the consumers want. The laws of physics seem to stand quite independent of human desire.
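For readers who want the quantitative shape behind Hubbert’s Peak, the standard textbook formulation (assumed here; the original post does not spell out the math) models cumulative extraction as a logistic curve, so the production rate, its derivative, traces the familiar symmetric bell:

```latex
% Standard logistic form of the Hubbert curve (textbook formulation, assumed here):
% Q(t)    cumulative oil extracted by time t
% Q_max   ultimately recoverable oil
% k       growth-rate constant
% t_peak  year of peak production
Q(t) = \frac{Q_{\max}}{1 + e^{-k\,(t - t_{\mathrm{peak}})}},
\qquad
\frac{dQ}{dt} = \frac{k\, Q_{\max}\, e^{-k\,(t - t_{\mathrm{peak}})}}
                     {\bigl(1 + e^{-k\,(t - t_{\mathrm{peak}})}\bigr)^{2}}
```

Here Q_max is the ultimately recoverable oil, k sets how steeply extraction grows, and t_peak is the year of maximum production; the production rate is symmetric about t_peak, which is what is meant above by a bell curve.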

As I compared the two models, the first thing that struck me was the role of complexity in the first half of the curve. The dramatic growth curve that has accompanied the age of oil really took off in the post-WWII era, with the Green Revolution quadrupling global population and the advent of modern consumer culture more than quadrupling our, for lack of a better term, stuff. Indeed, our Apollonian ascent, which has provided the basic template for progress and heroism, has been fueled by vast amounts of cheap, easy-to-get oil. As we pass through this gate of history, we bear witness to the consequences of the collapsing hyper-complexity, from our undecipherable financial instruments to our interconnected food and energy delivery systems, to our far-flung military superstructure and beyond. The stop-and-start nature of our economic recovery is evidence of the change as a recalibration ensues and we continually adjust to the new normal, downscaling and localizing. Elsewhere I have suggested the emergent localism as an opportunity for a more practical heroism of scale involving more face-to-face interactions. At the same time, it is potentially an opportunity for panic and terror on a scale that is horrific.

When considering the potential for panic and horror I often direct my students toward reflecting on our biggest symbols, like the cross or the flag, as a way of understanding symbolic immortality and its consequences. Reflecting again, in light of Hansten’s presentation, the very complexity of our symbolic immortality system really comes to the fore. It’s not just the archetypal symbols, it’s all the little stuff we take for granted: the designer label, the exclusive membership, the branded product loyalty. I’m guessing that this is where the early apprehension of panic will register. Ernest Becker’s ideas would be most useful at this point in the process, giving the better angels of our nature, if not ground to stand on, at least ground to hover over. Here, at the tipping point, Becker’s work could paradoxically be a life saver. The question is how to get the word out. I have advocated getting Becker in the hands of artists, the re-mythologizers who could get these ideas to a mass audience.

While I still maintain this view, Hansten’s model provided new insight.

Since encountering Becker I’ve been frustrated, as I’m sure some of you have as well, by the difficulty of getting folks to pick up on his work. The ideas are so important, the argument so well crafted, and with the Mortality Salience Hypothesis so demonstrable, how can people not get it? The voice of Becker whispers, “Yes, Bill, it’s called denial.” Okay, I get that. Still … this is where Hansten’s model is helpful. Upon reaching peak complexity in the problem-solving process, and having done the requisite reflection, an ideational breakthrough occurs, which Hansten calls “enlightened simplification.” As it is applied in the real world, it becomes “authentic simplicity.” What might this mean for our problem of peak oil and peak terror? It indicates that there may be a point at which a significant part of our culture of denial can give way to more authentic realism. One reason that denial is so readily available to moderns is the very complexity of our world. We can readily be the Kierkegaardian philistine, distracted by the trivial. As the nature of our predicament becomes more apparent, we will have less time for trivia and will necessarily be more focused on the basics. Much like the addict who must hit bottom to escape his own denial, so too our culture must sober up to move forward. Becker’s ideas, I believe, will find a more receptive audience as the crisis deepens. The power and simplicity of his core insights will match the “authentic simplicity” our earthier society will require.

Your iPhone, Creaturely Motives, and Prosthetic Identity

September 18, 2012

“k1f” Kirby Farrell

When tech makes you feel superhuman.

Recently psychiatrist John Wynn posted a nifty essay about people’s passionate identification with the late Steve Jobs and the remarkable iPhone.  Here’s an excerpt:

Saying, in essence, “I revere Steve Jobs, therefore I will buy the phone he designed,” can be translated as, “my life feels fuller, more meaningful and secure, because of my affiliation with this powerful figure.” Carrying and using the device we are reminded throughout the day of our seamless participation in a world of brilliant innovation and beauty. Adding apps, chatting with Siri, video-chatting with friends and loved ones all deepens our sense of participation with Mr. Jobs, his beautiful designs, and the infinite future of technological advance and aesthetic refinement.

 The iPhone is a totem, an emblematic object of spiritual significance that conveys power and safety to the bearer. We’ve come a long way since amulets and rabbit feet warded off bad luck; now we have infinite contact with an infinite world of information, creativity and connection. New owners fondle their iPhones, show them to whoever will look, and ponder adding any of over 500,000 apps — to equipment that already just received over 200 enhancements. Perhaps the “i” in iPhone stands for “infinite,” as in the infinite pursuit of technology as an end in itself. . . . [The] passion surrounding the inventor’s death shows us that the phone is invested with much more power: it comforts and reassures us by warding off our own fears of death, and our awareness of our mortality.1

This argument explains the magic of the iPhone as partly an effect of transference—hero-worship. From helpless infancy on, we’re disposed to identify with powerful figures who can protect us and fulfill our needs. In a way, Steve Jobs has joined the “immortal” Albert Einstein as a larger-than-life and ambiguously superhuman hero. His gizmo, the iPhone, has a similar kind of special potency that makes it a “totem” or fetish, ambiguously supernatural. As the psychiatrist reminds us, “Consciously nobody is saying to himself, ‘I bought this thing so I can live forever.’” But nevertheless, the machine can arouse feelings of special powers and confidence that make you feel exceptional.

And exceptional is how you want to feel when you’re one of billions of bipeds under stress and wide open to the infirmities and terrors of flesh you’re heir to.

Suppose we expand on this account.  Suppose we use the iPhone to think about technology in relation to creaturely motives and prosthetic identity.

A few blogs back (“Semper Fido“) we were remarking on our peculiar vulnerability among the animals. We’re brainy but with no armor, feeble claws, prolonged helpless childhood—and we know we die. In response we find ingenious ways to magnify our capability and feel bigger than enemies and death. Among our creaturely motives are appetites for more life—more food, sex, more discoveries, more self-expansion. The marginal creature wants to be bigger and more meaningful. Feeling like a bigshot—feeling more important—promises to protect morale.  As in slang, “Keep your spirits up, big fella.”

How do we cope with these limits?

Among animals, we’re virtuoso tool-makers, continually expanding our selves through prosthetic engagement with the world.  We develop relationships which magnify our adaptive powers and symbolically make up for our creaturely limits. Your fist won’t bag a gazelle for supper, but a stick, a stone, a flint, or a bullet could feed you. The executive brain may imagine that you are your mind, and the tool is a handy external convenience. But in fact tool-use is a creaturely motive, built into us as it is in some of our primate cousins. In this sense, whatever else you are, you are your tools and tool-using motives as well.  And once you start thinking in this direction, you see that almost everything in our lives has the character of a tool, from art to theology and dandruff shampoo. You can also see that an intense identification with tools risks reducing the self to an apparatus for use. The potency of the tool can become the potency of the self, as in the fanatical attitudes of some gun owners toward their weapons.

I like the term “prosthetic” to describe our relationship to tools.  As partly symbolic creatures, we routinely imagine ourselves surpassing our actual biological limits. In this sense a tool such as the wheel is compensating for a biological lack the way an artificial limb does. It’s making up for something missing. To put it another way, our lifelong childlike flexibility as animals means that we’re  always potential as well as actual creatures. Which is another way of characterizing us as problem-solving animals.  It’s how we’re built.

Is it any wonder people identify with their iPhones? The gizmo magnifies you, and it embodies you.  It substantiates you. Let’s keep in mind that the self is not a thing.  It’s an event, and an evanescent biochemical and symbolic event at that. It’s a halo of possibilities. It can’t be weighed or X-rayed or ribbon-wrapped. As social animals, we live by continually substantiating one another. Every “Hello” corroborates that you and the other dude exist. When you press the flesh in a handshake or a hug, you’re making more real the envelope of the self. When the corroboration is really strong, fortified by endorphins and the symbolic vitamins of intimacy, you actually feel as if, in the wisdom of slang, you’re “getting real.”  Meaning, more real, more alive, bigger, pregnant with possibility. There’s more you.

Back to the iPhone. By expanding your voice over unimaginable distances, potentially everywhere, the machine puts you in the world. It may record you and your relationships with others as sound or a photo. With its ever-increasing new apps, the device even mimics our own extraordinary adaptability as animals, and our capacity for multiplicity and overlays of experience.

Critics can object that the phone is “just” electrons and facsimiles of “real life.” You’re not really nose to nose with your sweetie a thousand miles away, so don’t get carried away, pal. But the truth is, real life is also a facsimile. Again, the self is not an object but an event continually recreated in an imaginative zone of symbolic fizz and overlays of tacitness.

In the most basic sense, we live by enabling fictions. We readily invoke a “me,” but we have to keep simplifying our “life stories,” making them artificially consistent so that the self and the overwhelming world will be manageable. We continually finesse our ambivalence so we can get up in the morning and reach for a tool such as hot coffee that enables us to get started. If you’re on this expedition up da Nile, you know in the back of your mind that you have to simplify yourself and the world, but you accept this falsification because our sense of “as if” keeps us from feeling turned into a mechanism.  Like intuition, “as ifness” gives life space three dimensions and color.

With its programs and purposeful apps, the iPhone suddenly appears as a fabulously ambivalent enabling fiction. It expands you, it makes you real. It also simplifies you, making you usefully artificial as denial does. It proves you’re alive even as it potentially dissolves your voice and identity into the aether.

Modernism is a period of radical prosthetic development in human identity.  Only within the past century or so have we become creatures whose bare feet rarely if ever touch the ground; who can see inside our bodies; artificially propagate ourselves in a petri dish; walk on the moon.  In this framework our prosthetic dimension calls into question the kind of animal we are.  What is the ground of our experience?  Where does self stop and tool begin? If a house or clothes function as a prosthetic shell, where does self stop and environment begin? And since other people can extend our wills as tools do, in a host of relationships from slavery to parenting, we sometimes need to ask, Where does self leave off and other begin?2  

One of these days we can carry this investigation further by exploring how we use others as tools to form our personalities, just as we use fire and microscopes.  We think through others.  Since we’re here on da Nile, we could say that we swim in other people.

Why bother with the idea of prosthetic identity at all? Why not go with Winnicott and object relations theory, say? Or another vocabulary altogether? For me, prosthesis emphasizes identity-formation as a creative act, inherently social and systemic in its mutuality. Prosthetic relationships can provide a way to think about symbiotic qualities in family and cultural systems as well as in our psychosomatic endowment. Insofar as they invite us to think in terms of interdependent behavioral systems rather than individual conflicts, they open toward evolutionary and ethological perspectives.  They foreground concerns which are apt to be deemphasized in criticism and psychology based on intrapsychic or interpersonal conflict.

There’s a further twist worth mentioning, especially in an election season, at a stressful historical moment. Prosthetic behavior can open up perspectives beyond the melodramatic heroic rescue and victim-enemy tropes we’re given to. Listen closely enough, and you’re likely to hear chauvinism or self-concern in most accounts of our struggles. Prosthetic relationships remind us that we live in systems. Prosthetic behavior can call attention to our character as experimental, problem-solving animals continually adapting to a world which, like the swollen, organic soup of the river ahead, is bigger than we are, and always bringing more life.

1. http://www.ernestbecker.org/index.php?option=com_content&view=article&id=518:steve-jobs-death&catid=7:news-archives&Itemid=33

2. This is from my Post-Traumatic Culture: Injury and Interpretation in the 90s (1999), p.175

 This essay is cross-posted from Psychology Today.

Three Scenes from a School Day

September 14, 2012

“Svaardvaard” Bill Bornschein

For this post I’d like to share three distinct Becker-related observations from a day at school. They do not come together to form an overarching theme; rather, they demonstrate the breadth of application that a Beckeresque perspective provides. Alternatively, they may reflect my own focus, since I recently told a friend that I divide my life into pre-Becker and post-Becker. Whichever is the case, here goes:

#1) The first scene is from the classroom and reflects “intentional Becker.”  We have been studying the nature and function of myth in culture, Joseph Campbell-type material. Looking at both religious and secular mythology, we strive to understand the conditions under which myths thrive or falter. We learn that myths are strongest when they are unconsciously assumed and weakest when they are self-consciously held at arm’s length. As Campbell and others have pointed out, times of rapid change make people aware of the myth as myth. In the words of Walter Truett Anderson, “We are not so much possessed by belief as possessors of belief.” We inhabit our traditional myths in newly self-conscious ways in our postmodern age. I express this new experience by standing in the classroom doorway, one foot in, representing living within the myth, and the other foot outside the door, representing transcending the myth. The dilemma of the self-conscious society is the dilemma of the self-conscious individual writ large: namely, how to live with integrity in the face of reality. Retreats into nihilism and tribalism are options that are clearly present. Standing in the doorway reflects another option, living with traditional myth in a new way. Consider the traditional Biblical creation account, which is geocentric and anthropocentric. Now consider the current state of knowledge about our place in the universe as reflected in the following  flash animation. http://htwins.net/scale2/?bordercolor=white   What are we supposed to do with this? How do we appropriate tradition in light of such a new perspective? As Becker provides no pat answers, neither do I. The students wrestle with new questions of meaning and tradition and the hope is that the process itself will render them more compassionate and tolerant as they understand their own myths in a less absolute way.

#2)  Even as traditional myths undergo stress in the face of rapid change, so too there are newly minted myths that largely escape observation and critique and are quite robust. Chief among these new myths are the myth of progress and the myth of the technological fix. These two myths come together in a new piece of technology that many students own, the iPad.  As I strolled through the cafeteria at lunch I noticed a table of eight or nine students sitting with their iPads, each playing a different sports game, football, auto racing, basketball and the like. The students were talking to each other even as their eyes remained riveted on their screens. Two things occurred to me. One was that I was reminded of Becker’s description of philistinism and wondered what he would think of this new technology’s power to distract and trivialize. The second was that the students offered ample evidence of how malleable we truly are. We are training our flexible minds to do new things in new ways, for better or for worse.  Some observers like Nicholas Carr, author of The Shallows, warn that the new technology is producing a jumpy human mind that now struggles with sustained focused concentration. Whether one sees the technological revolution as a blessing or a curse, the fact remains that we are remarkably adaptive. Becker warned against a New Age-style apotheosis of man, and yet it is our capacity for change that is  our wellspring of hope. We may be locked in to our mortal condition, but we do still have imagination and the capacity to change our direction, if not our fate.

#3) The third Beckerian scene is the most serendipitous. While in the teacher’s lounge I had a conversation with a young English teacher who was upset with a particular student over the creation myth he had composed. It seems the young man, in his story, had envisioned the world as created from the feces of a sacred animal. As it turned out, this was the younger brother of a student who had tried something similar a few years earlier, only in that version the feces came directly from God. “But I didn’t make it God’s poop!” was the younger brother’s defense. The teacher saw disrespect. I saw Becker. It seemed to me that such a creation narrative was psychologically grounded in the human condition in a way that many creation myths are not. Whether or not disrespect was intended, the usage seemed pretty accurate. A quick survey of creation myths revealed what I expected, that creation from feces is a common theme from the Americas to Europe, Africa and Australia. Intuitively, the students were onto something, no shit.