Posts Tagged ‘Denial of Death’


Flu Shot Resistance – A Symptom of Death Denial?

February 14, 2013
TDF Guest Scott Murray

Cornell psychologist Thomas Gilovich has made a career out of studying the cognitive processes that sustain dubious beliefs. There is something about flu season that makes me wish I could combine his work with Becker’s and enlist their aid in a cultural intervention. Resistance to influenza vaccination remains robust despite the real risks of the flu – a subject rife with extensions into Gilovich’s work. If the flu is so dangerous, what dubious beliefs inspire resistance to vaccination? A glance at Beckerian thought might raise the question of why the fear of death doesn’t drive more people to get vaccinated. But a deeper understanding of what death denial actually entails helps explain why many resist the shot.

We’ve had rather mild flu seasons the last year or so (http://news.yahoo.com/why-years-flu-season-bad-173457272.html), but as statistics readily predict, some seasons bring more potent influenza strains and many cases of serious infection. The cost in dollars – and lives – can be steep. So why do some resist influenza vaccination?

Certainly the media, with its drive to fill air time and sell stories that attract attention, is part of the problem. Flu hype sends millions scrambling for a vaccination each year – but the media is known for its hyperbole, slanted use of facts, and even fear-mongering. It’s no wonder, then, that many cautious, critical individuals would see reports of this year’s raging epidemic and dismiss the need for a flu shot. But as Gilovich insists, we tend to be most critical of information that does not agree with the beliefs we already possess. As he puts it: “when the initial evidence supports our preferences, we are generally satisfied and terminate our search; when the initial evidence is hostile … we often dig deeper.” (82) In other words, those who suspect media hype and doubt the need for a flu shot may already believe that a flu shot is unnecessary, and may simply be looking for reasons to support this belief.

Is a distrust of the media, with its tendency to overstate or distort facts, sufficient to explain dismissal of flu season severity? Becker might disagree. Cognitive biases are in one respect no different from elbows and thumbs: they’ve evolved to serve a purpose, and where Gilovich leaves off, Becker nicely steps in. Gilovich is a laboratory psychologist, a statistician; he resists the temptation to speculate too grandly on why the human brain comes hard-wired to accept information that supports existing beliefs while remaining doubtful of information that does not. Becker’s focus on death anxiety provides a compelling furtherance of this line of questioning.

Disease and death are deeply connected concepts. Disease is a limiting of life; it brings the carefully repressed reality of our mortality painfully to the foreground of our thoughts. It’s no surprise, then, to discover a deep desire to believe that the flu shot is essentially needless. The CDC, in addressing the most prevalent myths that prevent influenza vaccination, has to deal with the ‘argument’ that influenza is only serious for those who are very young, elderly, pregnant, or otherwise already ill – that your average healthy person can deal with influenza using natural means, namely the immune system (http://www.cnn.com/2013/01/11/health/flu-shot-questions/index.html).

The challenge with this argument is that it isn’t untrue; in fact, the majority of people who come down with influenza nowadays – certainly in the more affluent Western world – survive relatively unscathed. The problem is that it isn’t actually an argument against vaccination at all. Just because you can survive the flu does not mean you shouldn’t avoid it, if not for yourself then for the basic public health service of not acting as a transmitter of the virus to someone at greater risk of hospitalization or even death. So the question persists: if vaccination serves yourself and the community, why resist it?

There are other studies being conducted on what is termed ‘naturalness bias’ (like this one at Rutgers: http://mdm.sagepub.com/content/28/4/532.short). The finding, essentially, is that some people make bad decisions because of a cognitive bias toward means that feel more ‘natural’ – in other words, if you tell them a kind of tea counteracts influenza, they will drink it, but if you suggest that they get vaccinated, they will find all kinds of reasons to dispute you. Tea is from nature; vaccines are a mysterious, dubious concoction brought to you by the same science that supported cocaine as medicine and DDT as pesticide.

So all kinds of reasons exist to doubt the flu vaccine. What’s in the flu shot, anyway? Many believe that the flu shot can actually get you sick, which is itself a medical misunderstanding involving a combination of known statistical nemeses. For starters, a large number of influenza strains and influenza-like viruses exist, and the vaccine can only protect you against so many. Some argue that this makes the vaccine worthless, when in fact it means that the vaccine is only as good as the severity of the strains it actually protects against. Those formulating the vaccine work hard to ensure it protects against the most severe strains predicted to be active in any given season. Nevertheless, given a large enough sample, it is statistically guaranteed that some who are vaccinated will become sick soon afterward, either because the vaccination didn’t come in time or because those unlucky few caught something the vaccine could not defend against. It is not surprising that these few (a minority, particularly if you establish a clear window of time after vaccination in which sickness could plausibly be blamed on the shot) feel cheated and suspect that they’ve been had — not only by flu hype, but by flu shot hype.
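To see how inevitable these coincidences are, here is a minimal back-of-envelope sketch in Python. The vaccination count, background illness rate, and attribution window are invented for illustration; none of these figures come from the CDC.

```python
# Back-of-envelope sketch (all numbers are illustrative assumptions,
# not CDC data): even a modest background rate of flu-like illness
# guarantees that many people will fall sick shortly after
# vaccination by coincidence alone.

vaccinated = 140_000_000     # assumed flu shots given in a season
weekly_illness_rate = 0.002  # assumed chance of flu-like illness per person per week
window_weeks = 2             # window in which illness gets "blamed" on the shot

coincidental = vaccinated * weekly_illness_rate * window_weeks
print(f"Expected coincidental illnesses within {window_weeks} weeks "
      f"of vaccination: {coincidental:,.0f}")
# -> 560,000 people sick soon after the shot even if the vaccine
#    caused no illness at all: ample fuel for anecdote.
```

Under these toy numbers, roughly half a million people would get sick within two weeks of a shot that did nothing at all – exactly the raw material word of mouth needs.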

The other factor in play is anecdotal evidence and word-of-mouth narrative: the simple tale of how everyone knows someone (who knows someone?) who has ‘gotten sick from a flu shot.’ The CDC is very candid about why this can seem to happen, for anyone who cares to read up on it: http://www.cdc.gov/flu/about/qa/misconceptions.htm.

In light of the cognitive factors that influence these behaviors (influenza vaccination denial and the anti-hype hype), is there a deeper motivational determinant to be identified? Becker’s theories outline what Gilovich’s work suggests: something at work deeper than cognitive tics, deeper than the prevalence of poor science in pop culture. Isn’t it safer for the psyche to believe that disease is less of a threat? That the body has what it needs to defend itself against death? Doesn’t flu severity denial – and the cognitive factors that help sustain it – work in the service of protecting the embodied self against the reality of its own fragility?

Sources:

Gilovich, Thomas. How We Know What Isn’t So: The Fallibility of Human Reason in Everyday Life. New York: Free Press, 1991.


We Give Birth Astride a Grave

November 25, 2012

TDF Guest Kim Pereira

I’ve been fortunate never to have experienced the crippling effects of chronic depression, although I’ve had my share of despair where the world seemed to spin out of control and even sleep, if it could be had, offered no respite. Like so many others, I have known late nights sprawled on a couch, clicking the hours away, remote in hand, finding nothing worth watching, relying on endless checking of meaningless Facebook updates to authenticate the feeling that I’m not alone in this spiraling vortex. Are we that obsessed with distracting ourselves? And if so, from what, I wonder? Being alone? Not alone by ourselves, though that too, but alone with our thoughts. Do we fill our days with sidetracking paraphernalia to avoid the agony of introspection which would inevitably lead to the central question of our lives—our mortality? The consuming question is this: are our lives being driven by the specter of death? Is that behind our attempts to fill every moment with something, anything, just to avoid confronting the only truth of which we are certain? Are we trying to divert ourselves from ourselves? What fears lurk beneath the urgency to keep busy, rushing through manufactured tasks and mundane events, going to church, jamming trivia into our days as though they were somehow meaningful, bestowing significance on them by virtue of our attention, deluding ourselves by clinging to an impuissant work ethic, a hand-me-down from past generations that prized it above all else? We save very little time for vacations, have the fewest public holidays of any country, and when we do take a break we’re lucky if we find time to enjoy a sunset!

When I first read Samuel Beckett’s anguished cry (the title of this essay) from his seminal work, Waiting for Godot, the full implications of it didn’t really strike me, although it vaguely echoed Thoreau’s “lives of quiet desperation.” Since then my fascination with the so-called Theatre of the Absurd has offered insights into the world as I traveled across the globe, making more sense of what I saw than I cared to admit. Looking through literature I found those sentiments reflected in other works—the sacrifices of “scapegoated” heroic figures in Shakespearean and Greek tragedies seek transcendence from the desolation of the human condition; comedic structures attempt to reorder societies by changing the rhythms of the old world to fit the melodies of youth, exchanging a past for a new future that will in time become replete with the very characteristics that made the past so unpalatable! The tortuous cycle continues.

These depictions offer only glimpses of hope as they strain against overwhelming realities. King Lear ends in a world as hopeless and barren as can be imagined, prompting Beckett to use the king and his fool metaphorically in Endgame, that entropic image of a world we have destroyed. For all his eternal optimism, Dickens’ brilliance lies in his excoriating portrayals of industrial London where little orphans were swallowed by rapacious thugs lurking in alleys and the trammels of England’s Courts of Chancery ensnared families for decades! Any journey through his pages cannot ignore the glowering pessimism of the nineteenth century and the crushing weight of factories and machines leading to disenchantment and inertia—Chekhov’s characters are unable to rise from their indolence and go to Moscow, preferring to yearn for what will never come, for longing is their raison d’être, replacing vibrancy with torpor, banishing the cherry orchard to the lumber yard as commercialism and utility sweep away beauty and grace, false notions anyway, having been acquired on the backs of serfs and the underclass!

The machines have changed, but the effects persist—digital technology, once hailed with the same enthusiasm as the industrial machinery of the nineteenth century, is viewed with growing suspicion as intrusive, leading to questions of privacy and loss of identity, reducing us to disembodied voices on answering machines or Twitter feeds. The ability to stay in contact with a swath of people robs us of the desire to do so; if it’s always available it loses its urgency, without which we drift into isolation, cocooned by the minutiae of every day, paralyzed by the burden of trying to authenticate our existence.

We don’t purchase products anymore—they buy us! Walking through shopping aisles we are attacked by displays that scream for attention; row upon shiny row leer as we walk past, as though we were walking a jail cell corridor showered by abuse from inmates on either side. Under relentless pressure we succumb, reach out and grab one; maybe we even read the specifications on the container to delude ourselves into believing we made a wise choice. But the selection is quite random, for the plethora of available products deadens our ability to choose. With our identities eviscerated by years of slavery to the gods of marketing strategies, we don’t know who we are, much less what we really want. We are now on sale, possessed by our possessions—clothes, household products, food, holiday resorts, TV channels, movies, books, and everything else; we have to take what we are given. They choose us!

There’s a moment in Ionesco’s The Bald Soprano where the Smiths are discussing Bobby Watson’s death, reported in the newspaper; as the conversation proceeds we realize every single one of Bobby Watson’s relatives is named Bobby Watson. Of course, the amusing irony is that a couple named Smith is making this discovery. Keeping up with the Joneses (or Smiths) has turned us into Joneses, with the same houses, same clothes, and the same status updates. Look through the myriad photographs on Facebook and after a while the pictures of kids, families, and vacations devolve into endless cycles of sameness. We are all Bobby Watsons, interchangeable and alike, indistinguishable from one another, with nothing meaningful to say. Our political and social parties and FB profiles are vain attempts to distinguish us, but very little in them is unique.

Towards the end of The Bald Soprano, the dialogue descends into gibberish, culminating in a show of aggression. Failure to communicate leaves violence as the only option, for we desperately need something to remind us that we’re alive, that we still live in a social world; but all we have are anger, violence, and the instinct to survive—witness global skirmishes and full-fledged wars, ethnic cleansing, government-sponsored torture, terrorism, street violence, political discourse and TV discussions that are shouting matches, mass murders in villages, movie theatres, and even temples! The jungle paths to destruction have always been available, obscured sometimes by the underbrush of forced civility, but easily accessible when survival is at stake.

This is the Age of Saturn, a sullen, scowling time where malevolence permeates the republic and millions of bloggers can pen their unfiltered thoughts; where politicians lie with impunity and truth is lost among thousands of commercials; where we cling to worthless promises because we’re desperate to believe someone cares about us; a time of distrust, skepticism, and fear! Music now finds its greatest audience only through competitions, and the hushed loveliness of verse has descended into poetry slams. A hundred years ago Expressionists rebelled against the dehumanizing effects of an industrial age, distorting reality to give vent to passions and feelings, releasing their creative streams unencumbered by constraints of logic or order; they themselves were victims of an emotional angst that settled upon Europe before exploding in a massive conflagration across the continent. It almost seems like we have come full circle ten decades later. In the new reality of this century, language and relationships have been compressed and distorted—140 characters (as random as the work of Dadaists) are enough to say what we feel, and the social networks of cyberspace have supplanted front porches, parish halls, and even playgrounds. We can now count the number of “friends” we have right there in the left column, and on the right we know exactly what we “like.” And still we are alone. Waiting…

Earlier I alluded to the fear of death as the driving force behind much of what we do, suggesting that our activity is largely a set of distractions designed to keep us from contemplating the void. Perhaps the most successful distraction is simply making it through each day. When one considers the absolute haphazardness of life (people dying accidentally or being stricken with fatal diseases—who among us doesn’t know someone we love in this situation?), it’s a small miracle we are alive at any moment! Despite the winding down of the world in Endgame, its characters are left with the possibility of tomorrow; Godot never comes, but Everyman still waits on the empty road; we persist. The original French title of the play is En attendant Godot—WHILE waiting for Godot! Like Beckett’s tramps we find ways to amuse ourselves while waiting for the end—we play games, make art, have sex, get drunk, persevere in our jobs, convince ourselves that death is not tomorrow, and do a million things that slap our faces to wake us to the fact that we’re alive.

Dr. Kim Pereira is Professor of Theater and Director of the Honors Program at Illinois State University.


Longevity

October 27, 2012

“The Single Hound” Bruce Floyd

This article neglects one of the basic tenets of Becker: increasing the longevity of life will do nothing to ameliorate the terror of the human predicament; in fact, it will exacerbate it, take anxiety to new heights, paralyze the will, make an early death even more “absurd” than it is now. This clamor, all this hue and cry, for the extension of life confirms Becker’s insights. The length of life means nothing to the self-conscious creature. All it knows is that it will die. What difference does it make whether it’s seventy years or one hundred and fifty?

The problems that plague humanity, the existential ones, are not to be solved by assuring people that they will live longer. No, what will console the mortal is some way of accepting, of coming to terms with, life and its limitations. We must give in to, even embrace, our fate. It’s hard to do. Of course it is, and that’s why we swim in an ocean of illusions.

The coming age of longevity will not change everything; it will just make the time-immemorial paradox more acute and baffling.


Peak Complexity, Peak Oil, Peak Terror

October 17, 2012

“Svaardvaard” Bill Bornschein

I was fortunate to hear University of Washington professor Phillip Hansten’s talk at the recent EBF Fall Conference. His topic was what he refers to as “premature factulation,” which he defines as “the process of coming to conclusions without adequate study or contemplation; usually applied to complex concepts or situations.” The diagram below represents the basic dynamic.

[Diagram: Premature Factulation]

Hansten’s book Premature Factulation provides many examples of this process as well as prescriptions for dealing more successfully with complex issues. Peppered with quotes from great thinkers across the ages, this book has the synthetic feel of Becker’s work and I recommend it.

As I looked at the Hansten diagram, M. King Hubbert’s famous oil depletion curve appeared in my mind’s eye. Hubbert’s Peak, as it is called, holds that the rate at which oil can be extracted follows a bell curve, and calculations indicate that such a peak has already been reached. Indeed, global production of oil reached its apex in 2005. The economic downturn that followed shortly thereafter reflects the failure of growth projections premised on cheap energy. Economic contraction is the new reality, regardless of which political party is victorious or what consumers want. The laws of physics seem to stand quite independent of human desire.
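For readers who want the shape of the claim rather than the rhetoric, here is a minimal sketch of Hubbert’s logistic model in Python. The recoverable-oil total and curve steepness are assumed purely for illustration; only the 2005 peak year follows the essay’s claim.

```python
import math

# Minimal sketch of Hubbert's logistic model. Cumulative extraction
# Q(t) follows a logistic curve, so the extraction *rate* dQ/dt traces
# a bell-shaped curve with a single peak. Q_MAX and K are illustrative
# assumptions; T_PEAK follows the essay's claim of a 2005 apex.

Q_MAX = 2000.0   # assumed ultimately recoverable oil, billions of barrels
K = 0.05         # assumed steepness of the logistic curve
T_PEAK = 2005    # year of peak production

def production_rate(year: float) -> float:
    """Annual extraction rate dQ/dt implied by the logistic curve."""
    x = math.exp(-K * (year - T_PEAK))
    return K * Q_MAX * x / (1 + x) ** 2

for year in (1965, 1985, 2005, 2025, 2045):
    print(year, round(production_rate(year), 1))
# Rates rise toward 2005 and decline symmetrically afterward; in this
# toy model the peak rate is K * Q_MAX / 4 = 25.0 billion barrels/year.
```

Whatever the true parameters, the point the essay leans on is the curve’s symmetry: the downslope mirrors the upslope, so a civilization built on the left half of the curve must recalibrate on the right.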

As I compared the two models, the first thing that struck me was the role of complexity in the first half of the curve. The dramatic growth curve that has accompanied the age of oil really took off in the post-WWII era, with the Green Revolution quadrupling global population and the advent of modern consumer culture more than quadrupling our, for lack of a better term, stuff. Indeed, our Apollonian ascent, which has provided the basic template for progress and heroism, has been fueled by vast amounts of cheap, easy-to-get oil. As we pass through this gate of history, we bear witness to the consequences of collapsing hyper-complexity, from our undecipherable financial instruments to our interconnected food and energy delivery systems, to our far-flung military superstructure and beyond. The stop-and-start nature of our economic recovery is evidence of the change, as a recalibration ensues and we continually adjust to the new normal, downscaling and localizing. Elsewhere I have suggested this emergent localism as an opportunity for a more practical heroism of scale involving more face-to-face interactions. At the same time, it is potentially an opportunity for panic and terror on a horrific scale.

When considering the potential for panic and horror, I often direct my students toward reflecting on our biggest symbols, like the cross or the flag, as a way of understanding symbolic immortality and its consequences. Reflecting again, in light of Hansten’s presentation, the very complexity of our symbolic immortality system comes to the fore. It’s not just the archetypal symbols; it’s all the little stuff we take for granted: the designer label, the exclusive membership, the branded product loyalty. I’m guessing that this is where the early apprehension of panic will register. Ernest Becker’s ideas would be most useful at this point in the process, giving the better angels of our nature, if not ground to stand on, at least ground to hover over. Here, at the tipping point, Becker’s work could paradoxically be a life saver. The question is how to get the word out. I have advocated getting Becker into the hands of artists, the re-mythologizers who could carry these ideas to a mass audience.

While I still maintain this view, Hansten’s model provided new insight.

Since encountering Becker I’ve been frustrated, as I’m sure some of you have as well, by the difficulty of getting folks to pick up on his work. The ideas are so important, the argument so well crafted, the Mortality Salience Hypothesis so demonstrable: how can people not get it? The voice of Becker whispers, “Yes, Bill, it’s called denial.” Okay, I get that. Still … this is where Hansten’s model is helpful. Upon reaching peak complexity in the problem-solving process, and having done the requisite reflection, an ideational breakthrough occurs, which Hansten calls “enlightened simplification.” As it is applied in the real world, it becomes “authentic simplicity.” What might this mean for our problem of peak oil and peak terror? It suggests there may be a point at which a significant part of our culture of denial can give way to more authentic realism. One reason denial is so readily available to moderns is the very complexity of our world; we can readily be the Kierkegaardian philistine, distracted by the trivial. As the nature of our predicament becomes more apparent, we will have less time for trivia and will necessarily be more focused on the basics. Much like the addict who must hit bottom to escape his own denial, so too our culture must sober up to move forward. Becker’s ideas, I believe, will find a more receptive audience as the crisis deepens. The power and simplicity of his core insights will match the “authentic simplicity” our earthier society will require.


Your iPhone, Creaturely Motives, and Prosthetic Identity

September 18, 2012

“k1f” Kirby Farrell

When tech makes you feel superhuman.

Recently psychiatrist John Wynn posted a nifty essay about people’s passionate identification with the late Steve Jobs and the remarkable iPhone.  Here’s an excerpt:

Saying, in essence, “I revere Steve Jobs, therefore I will buy the phone he designed,” can be translated as, “my life feels fuller, more meaningful and secure, because of my affiliation with this powerful figure.” Carrying and using the device we are reminded throughout the day of our seamless participation in a world of brilliant innovation and beauty. Adding apps, chatting with Siri, video-chatting with friends and loved ones all deepens our sense of participation with Mr. Jobs, his beautiful designs, and the infinite future of technological advance and aesthetic refinement.

 The iPhone is a totem, an emblematic object of spiritual significance that conveys power and safety to the bearer. We’ve come a long way since amulets and rabbit feet warded off bad luck; now we have infinite contact with an infinite world of information, creativity and connection. New owners fondle their iPhones, show them to whoever will look, and ponder adding any of over 500,000 apps — to equipment that already just received over 200 enhancements. Perhaps the “i” in iPhone stands for “infinite,” as in the infinite pursuit of technology as an end in itself. . . . [The] passion surrounding the inventor’s death shows us that the phone is invested with much more power: it comforts and reassures us by warding off our own fears of death, and our awareness of our mortality.1

This argument explains the magic of the iPhone as partly an effect of transference—hero-worship. From helpless infancy on, we’re disposed to identify with powerful figures who can protect us and fulfill our needs. In a way, Steve Jobs has joined the “immortal” Albert Einstein as a larger-than-life and ambiguously superhuman hero. His gizmo, the iPhone, has a similar kind of special potency that makes it a “totem” or fetish, ambiguously supernatural. As the psychiatrist reminds us, “Consciously nobody is saying to himself, ‘I bought this thing so I can live forever.’ But nevertheless, the machine can arouse feelings of special powers and confidence that makes you feel exceptional.”

And exceptional is how you want to feel when you’re one of billions of bipeds under stress and wide open to the infirmities and terrors of flesh you’re heir to.

Suppose we expand on this account.  Suppose we use the iPhone to think about technology in relation to creaturely motives and prosthetic identity.

A few blogs back (“Semper Fido“) we were remarking on our peculiar vulnerability among the animals. We’re brainy but with no armor, feeble claws, prolonged helpless childhood—and we know we die. In response we find ingenious ways to magnify our capability and feel bigger than enemies and death. Among our creaturely motives are appetites for more life—more food, sex, more discoveries, more self-expansion. The marginal creature wants to be bigger and more meaningful. Feeling like a bigshot—feeling more important—promises to protect morale.  As in slang, “Keep your spirits up, big fella.”

How do we cope with these limits?

Among animals, we’re virtuoso tool-makers, continually expanding our selves through prosthetic engagement with the world.  We develop relationships which magnify our adaptive powers and symbolically make up for our creaturely limits. Your fist won’t bag a gazelle for supper, but a stick, a stone, a flint, or a bullet could feed you. The executive brain may imagine that you are your mind, and the tool is a handy external convenience. But in fact tool-use is a creaturely motive, built into us as it is in some of our primate cousins. In this sense, whatever else you are, you are your tools and tool-using motives as well.  And once you start thinking in this direction, you see that almost everything in our lives has the character of a tool, from art to theology and dandruff shampoo. You can also see that an intense identification with tools risks reducing the self to an apparatus for use. The potency of the tool can become the potency of the self, as in the fanatical attitudes of some gun owners toward their weapons.

I like the term “prosthetic” to describe our relationship to tools.  As partly symbolic creatures, we routinely imagine ourselves surpassing our actual biological limits. In this sense a tool such as the wheel is compensating for a biological lack the way an artificial limb does. It’s making up for something missing. To put it another way, our lifelong childlike flexibility as animals means that we’re  always potential as well as actual creatures. Which is another way of characterizing us as problem-solving animals.  It’s how we’re built.

Is it any wonder people identify with their iPhones? The gizmo magnifies you, and it embodies you.  It substantiates you. Let’s keep in mind that the self is not a thing.  It’s an event, and an evanescent biochemical and symbolic event at that. It’s a halo of possibilities. It can’t be weighed or X-rayed or ribbon-wrapped. As social animals, we live by continually substantiating one another. Every “Hello” corroborates that you and the other dude exist. When you press the flesh in a handshake or a hug, you’re making more real the envelope of the self. When the corroboration is really strong, fortified by endorphins and the symbolic vitamins of intimacy, you actually feel as if, in the wisdom of slang, you’re “getting real.”  Meaning, more real, more alive, bigger, pregnant with possibility. There’s more you.

Back to the iPhone. By expanding your voice over unimaginable distances, potentially everywhere, the machine puts you in the world. It may record you and your relationships with others as sound or a photo. With its ever-increasing new apps, the device even mimics our own extraordinary adaptability as animals, and our capacity for multiplicity and overlays of experience.

Critics can object that the phone is “just” electrons and facsimiles of “real life.” You’re not really nose to nose with your sweetie a thousand miles away, so don’t get carried away, pal. But the truth is, real life is also a facsimile. Again, the self is not an object but an event continually recreated in an imaginative zone of symbolic fizz and overlays of tacitness.

In the most basic sense, we live by enabling fictions. We readily invoke a “me,” but we have to keep simplifying our “life stories,” making them artificially consistent so that the self and the overwhelming world will be manageable. We continually finesse our ambivalence so we can get up in the morning and reach for a tool such as hot coffee that enables us to get started. If you’re on this expedition up da Nile, you know in the back of your mind that you have to simplify yourself and the world, but you accept this falsification because our sense of “as if” keeps us from feeling turned into a mechanism.  Like intuition, “as ifness” gives life space three dimensions and color.

With its programs and purposeful apps, the iPhone suddenly appears as a fabulously ambivalent enabling fiction. It expands you, it makes you real. It also simplifies you, making you usefully artificial as denial does. It proves you’re alive even as it potentially dissolves your voice and identity into the aether.

Modernism is a period of radical prosthetic development in human identity.  Only within the past century or so have we become creatures whose bare feet rarely if ever touch the ground; who can see inside our bodies; artificially propagate ourselves in a petri dish; walk on the moon.  In this framework our prosthetic dimension calls into question the kind of animal we are.  What is the ground of our experience?  Where does self stop and tool begin? If a house or clothes function as a prosthetic shell, where does self stop and environment begin? And since other people can extend our wills as tools do, in a host of relationships from slavery to parenting, we sometimes need to ask, Where does self leave off and other begin?2  

One of these days we can carry this investigation further by exploring how we use others as tools to form our personalities, just as we use fire and microscopes.  We think through others.  Since we’re here on da Nile, we could say that we swim in other people.

Why bother with the idea of prosthetic identity at all? Why not go with Winnicott and object relations theory, say? Or another vocabulary altogether? For me, prosthesis emphasizes identity-formation as a creative act, inherently social and systemic in its mutuality. Prosthetic relationships can provide a way to think about symbiotic qualities in family and cultural systems as well as in our psychosomatic endowment. Insofar as they invite us to think in terms of interdependent behavioral systems rather than individual conflicts, they open toward evolutionary and ethological perspectives.  They foreground concerns which are apt to be deemphasized in criticism and psychology based on intrapsychic or interpersonal conflict.

There’s a further twist worth mentioning, especially in an election season, at a stressful historical moment. Prosthetic behavior can open up perspectives beyond the melodramatic heroic rescue and victim-enemy tropes we’re given to. Listen closely enough, and you’re likely to hear chauvinism or self-concern in most accounts of our struggles. Prosthetic relationships remind us that we live in systems. Prosthetic behavior can call attention to our character as experimental, problem-solving animals continually adapting to a world which, like the swollen, organic soup of the river ahead, is bigger than we are, and always bringing more life.

1. http://www.ernestbecker.org/index.php?option=com_content&view=article&id=518:steve-jobs-death&catid=7:news-archives&Itemid=33

2. This is from my Post-Traumatic Culture: Injury and Interpretation in the 90s (1999), p.175

 This essay is cross-posted from Psychology Today.


Three Scenes from a School Day

September 14, 2012

“Svaardvaard” Bill Bornschein

For this post I’d like to share three distinct Becker-related observations from a day at school. They do not come together to form an overarching theme; rather, they demonstrate the breadth of application that a Beckeresque perspective provides. Alternatively, they may reflect my own focus, since I recently told a friend that I divide my life into pre-Becker and post-Becker. Whichever is the case, here goes:

#1) The first scene is from the classroom and reflects “intentional Becker.” We have been studying the nature and function of myth in culture, Joseph Campbell-type material. Looking at both religious and secular mythology, we strive to understand the conditions under which myths thrive or falter. We learn that myths are strongest when they are unconsciously assumed and weakest when they are self-consciously held at arm’s length. As Campbell and others have pointed out, times of rapid change make people aware of the myth as myth. In the words of Walter Truett Anderson, “We are not so much possessed by belief as possessors of belief.” We inhabit our traditional myths in newly self-conscious ways in our postmodern age. I express this new experience by standing in the classroom doorway, one foot in, representing living within the myth, and the other foot outside the door, representing transcending the myth. The dilemma of the self-conscious society is the dilemma of the self-conscious individual writ large: namely, how to live with integrity in the face of reality. Retreats into nihilism and tribalism are options that are clearly present. Standing in the doorway reflects another option: living with traditional myth in a new way. Consider the traditional Biblical creation account, which is geocentric and anthropocentric. Now consider the current state of knowledge about our place in the universe as reflected in the following Flash animation: http://htwins.net/scale2/?bordercolor=white What are we supposed to do with this? How do we appropriate tradition in light of such a new perspective? As Becker provides no pat answers, neither do I. The students wrestle with new questions of meaning and tradition, and the hope is that the process itself will render them more compassionate and tolerant as they come to understand their own myths in a less absolute way.

#2) Even as traditional myths undergo stress in the face of rapid change, so too there are newly minted myths that largely escape observation and critique and remain quite robust. Chief among these new myths are the myth of progress and the myth of the technological fix. These two myths come together in a new piece of technology that many students own, the iPad. As I strolled through the cafeteria at lunch I noticed a table of eight or nine students sitting with their iPads, each playing a different sports game: football, auto racing, basketball, and the like. The students were talking to each other even as their eyes remained riveted on their screens. Two things occurred to me. One was that I was reminded of Becker’s description of philistinism and wondered what he would think of this new technology’s power to distract and trivialize. The second was that the students offered ample evidence of how malleable we truly are. We are training our flexible minds to do new things in new ways, for better or for worse. Some observers, like Nicholas Carr, author of The Shallows, warn that the new technology is producing a jumpy human mind that now struggles with sustained, focused concentration. Whether one sees the technological revolution as a blessing or a curse, the fact remains that we are remarkably adaptive. Becker warned against a New Age-style apotheosis of man, and yet it is our capacity for change that is our wellspring of hope. We may be locked in to our mortal condition, but we do still have imagination and the capacity to change our direction, if not our fate.

#3) The third Beckerian scene is the most serendipitous. While in the teachers’ lounge I had a conversation with a young English teacher who was upset with a particular student over the creation myth he had composed. It seems the young man, in his story, had envisioned the world as created from the feces of a sacred animal. As it turned out, this was the younger brother of a student who had tried something similar a few years earlier, only that version had the feces come from God directly. “But I didn’t make it God’s poop!” was the younger brother’s defense. The teacher saw disrespect. I saw Becker. It seemed to me that such a creation narrative was psychologically grounded in the human condition in a way that many creation myths are not. Whether or not disrespect was intended, the usage seemed pretty accurate. A quick survey of creation myths revealed what I expected: creation from feces is a common theme from the Americas to Europe, Africa, and Australia. Intuitively, the students were onto something, no shit.


“Post-Birth” Abortion?

September 7, 2012

“Normal Dan” Dan Liechty

There has been a lot of media hype this summer over an article published last February in The Journal of Medical Ethics. http://jme.bmj.com/content/early/2012/03/01/medethics-2011-100411.full Supposedly, the article makes an argument for “post-birth abortion,” that is, allowing legal infanticide for an unspecified period of time after a child is born, giving the parents a bit of extra time to decide if they really want this child or not. The argument is that this only takes the justifications for pre-birth abortion one step further, and that all the important moral justifications for pre-birth abortion apply equally to post-birth abortion, since no clear moral line can be drawn between a fetus about to be born and that same fetus some minutes or hours or days after birth.

I first heard about this article on an ethics discussion list last winter. Most people thought it was a hoax. But the authors and the journal present it as a serious article, and over the last few months I have been “confronted” with it numerous times by people claiming it is the “logical extension” of pro-choice thinking. Feeling perhaps a bit goaded, and enjoying a bit more free time this summer than I usually have, I finally read the article.

The authors, Alberto Giubilini and Francesca Minerva, summarize what they see as the central moral arguments supporting abortion, noting that the thread running through them all is that they place the convenience of the adults above the good of the fetus. They then engage in an “if…then…” thought experiment: “if” arguments for the legitimacy of abortion based solely on what is convenient for the adults are morally valid, “then” there is no moral difference between pre-birth abortion and after-birth infanticide.

They are not (I don’t think) advocating infanticide, but playing the Devil’s Advocate concerning arguments for the legitimacy of pre-birth abortion set forward by others. (That they play the role so well is the source of the confusion about their intentions, and the reason one can easily read them as actually advocating legal infanticide.)

Now, if it were true that abortion rights supporters draw solely or even mainly on what is convenient for adults as the basis for their moral reasoning, this article would have a lot of force behind it. In fact, it probably would already have been written by someone else years ago. But that premise is only very peripheral to the moral reasoning of those who support policies of choice in specific circumstances (it is, in fact, the perspective on abortion advanced by those who want to proscribe it entirely).

The central pillar of pro-choice moral argument is the woman’s right to bodily integrity, plain and simple. This is why distinguishing between the stages of fetal pre- and post-viability is of crucial importance in reasoning toward moral support for policies allowing for abortion in specific circumstances. Pre-viability, we are looking at a circumstance in which the assertion of the woman’s right to bodily integrity (autonomy) must be protected above the fetal right to existence, because the existence of the fetus is completely dependent on the will of the woman to continue the pregnancy. Post-viability, that is no longer the case; once viability has been reached, the right of the fetus to existence must be weighed much more equally against the woman’s right to bodily integrity.

Even once viability has been reached, in my view, the decision to continue the pregnancy is still strongly that of the woman herself, but the right of the fetus to existence increases with the length of the pregnancy. Regardless, there is a crucial moral difference between pre-viability and post-viability. For the most part, we have set viability at about the end of the second trimester of pregnancy (approximately six months). Advances in medical technology are constantly pushing that backward into the fifth month, but there are certainly limits as to how far this can or (morally speaking) even should be pushed.

The authors of this article totally ignore and dismiss viability as a moral watershed in pregnancy termination, which is certainly why 99% of the medical ethicists who read the article initially thought it was a hoax of some kind. It turns out it wasn’t a hoax, but it certainly was laughably poor scholarship: rather like making an argument about how high humans can jump while neglecting to take gravity into account. While the “if…then” thought experiment has a place in academic discourse, no serious ethicist will have learned anything from this article about actual policy.

One thing I hope we have learned from this episode is that, with the advent of the internet, articles of this type, touching “culture war” hot buttons, are bound to be hyped and exploited far beyond the intentions of the authors or the journal editors. Much more caution about publishing such articles is in order.