Sunday, November 27, 2011

Now in November

In Pilgrim at Tinker Creek (1974), Annie Dillard writes, "This year I want to stick a net into time and say 'now,' as men plant flags on the ice and snow and say, 'here'" (75).

If only it were that easy.

Dillard spends much of her time describing and analyzing the natural world in prose that is beautiful, fluid, poetic. Thoreau was nature's philosopher; Dillard is her prose-poet.

Pilgrim at Tinker Creek is thus as much about seeing as it is about writing and feeling.
So I blurred my eyes and gazed towards the brim of my hat and saw a new world. I saw the pale white circles roll up, roll up, like the world's turning, mute and perfect, and I saw the linear flashes, gleaming silver, like stars being born at random down a rolling scroll of time. Something broke and something opened. I filled up like a new wineskin. I breathed an air like light; I saw a light like water. I was the lip of a fountain the creek filled forever; I was ether, the leaf in the zephyr; I was flesh-flake, feather, bone. (34)
I went to the beach yesterday, because I always find the best shells during low tide in winter. And there's almost never anyone there.

I like shells with color and whorls, pinks and purples and blues. I like the precision of the little ones.

People think you can't find any interesting ones on the beaches in RI, but I think maybe the people who can't find them are the ones who aren't interesting, because they aren't seeing them.

I brought one home last night that is one of my favorite types, and when I looked at it this morning and dug the sand out, there was another tiny shell just like it hidden inside.

It's the same with the light and the sound: you never know what the light at the beach will be like, or how the water will sound when you stop and listen to it.

Sometimes you go to the beach in November and expect it to be gray and sad, the end of a season, but it's more beautiful in winter's light than it has been all summer long.

The beach in the summer is a feeling and a smell, but the beach in November is a sight and a sound.

Dillard tells of how the Greek writer Nikos Kazantzakis once said,
My father gave me a canary and a revolving globe ... I used to open the cage and let the canary go free. It developed the habit of sitting at the very top of the globe and singing for hours. For years, as I wandered insatiably over the earth, greeting and taking leave of everything, I felt that the top of my head was the globe and a canary sat perched on the top of my mind, singing.
Dillard concludes,
Cruelty is a mystery, and the waste of pain. But if we describe a world to compass these things, a world that is a long, brute game, then we bump against another mystery: the inrush of power and light, the canary that sings on the skull. (9)
What would the world sound like if we let our mind-canaries sing?

What would the world look like if we see ourselves as flesh-flake, feather, bone, born at random down a rolling scroll of time?

Saturday, November 26, 2011

Smallpox, Then and Now

As one of the requirements for my eighteenth-century literature course, students have to give presentations on various historical and cultural topics. One such topic is Edward Jenner and smallpox.

If you don't know the story of smallpox and its eradication, it's both pretty amazing and downright horrifying. In The Demon in the Freezer (2002), Richard Preston describes the symptoms:
The pustules began to touch one another, and finally they merged into confluent sheets that covered his body, like a cobblestone street. The skin was torn away from its underlayers across much of his body, and the pustules on his face combined into a bubbled mass filled with fluid, until the layers of skin of his face essentially detached from its underlayers and became a bag surrounding the tissues of his head. His tongue, gums and hard palate were studded with pustules, yet his mouth was dry, and he could barely swallow. The virus had stripped the skin off his body, both inside and out, and the pain would have seemed almost beyond the capacity of human nature to endure. (35)
Eventually, the virus affects the messenger molecules of the victim's immune system by triggering unknown proteins that cause what's known as a "cytokine storm." Instead of attacking the invader, the person's immune system backfires on itself and cannot combat the massive viral infection.

This is what smallpox looks like, in its most common form.

I apologize if you find the picture upsetting, obviously, but you should know that this isn't actually the worst form of the disease; it's the "ordinary" form of variola.

It has a mortality rate of approximately 30% and, as you may imagine, it often leaves disfiguring scars on the victim's face. It can also cause blindness.

The more severe form, hemorrhagic smallpox, was often known as the "black pox." The victim's skin doesn't blister; instead, bleeding occurs under the skin, eventually causing the person to appear blackened.

One of the early signs of hemorrhagic smallpox is bleeding in the whites of the eyes: the victims' eyes often turn a deep red color and, if they survive long enough, they too turn black.

They bleed from the nose, mouth, rectum, urinary tract and vagina.

The mortality rate for hemorrhagic smallpox is nearly 100%. In all cases of smallpox, victims remain conscious and cognizant of what is happening to them throughout the duration of the illness, in "a peculiar state of apprehension and mental alertness that were said to be unlike the manifestations of any other disease." (Smallpox and Its Eradication)

There is no cure for smallpox, in any form. The only hope of prevention and eradication lies in vaccination.

In 1796, Edward Jenner developed the vaccine for smallpox, using methods that make Michael Jackson's personal physician look like a candy-striper. He noticed that women who worked on dairy farms often contracted what is known as "cowpox" (similar to smallpox, but it infects cows, obviously).

Jenner noticed that, when the women contracted cowpox, it was not as severe and they typically didn't die (unlike the cows). He also noticed that, having survived cowpox, the women usually didn't subsequently contract smallpox.

So one fine day, like any good eighteenth-century doctor, he scraped the pus from the pustules of his patient, Sarah Nelmes, who had contracted cowpox from her cow, Blossom, and he injected it into the arms of his gardener's eight-year-old son, James Phipps.

What are servants and their children for, after all, if not to conduct highly dangerous and unethical experiments on them? (And if you think it doesn't happen in our more civilized era, look into the practices of Jonas Salk in the 1940s.)

A few months later, Jenner injected James Phipps with a controlled amount of smallpox, a practice known as "variolation." In 1796, this was the only proven method of inducing immunity and, as you can imagine, it was highly risky.

But James Phipps, having been injected with cowpox, never contracted smallpox, even when he was subsequently injected with it (repeatedly).

And thus, vaccination was born.

Since simply mentioning the word "vaccination" will often trigger cries of "autism," I should point out that people who allege a connection between vaccines and autism are misunderstanding several things about the way the human body and its immune system work.

More importantly, they are overlooking the fact that the connection between the measles, mumps, and rubella (MMR) vaccine, inflammatory bowel disease, and autism was merely suggested by the British physician Andrew Wakefield and thirteen colleagues, in a study published in the Lancet in 1998.

Even Wakefield acknowledged that the connection was never actually, scientifically proven.

Moreover, subsequent studies were never able to replicate Wakefield's claims. In science, replication is the acid test, and Wakefield's research failed it.

In 2004, British journalist Brian Deer discovered that Wakefield and his colleagues had received undisclosed funds prior to the publication of the 1998 study. Ten of the thirteen scientists involved in the study subsequently acknowledged the conflict of interest and retracted their claims.

Their research had in fact been funded by a lawyer representing the parents of the children with autism who were themselves the subjects of the study: the parents were contemplating a lawsuit against the makers of the MMR vaccine, but they required evidence of a possible connection between the vaccine and autism, since none existed.

It was also revealed that Wakefield had previously applied for a patent for a single-jab measles vaccine that would also treat inflammatory bowel disease and autism.

Not only was this proposed vaccine in competition with the MMR combination vaccine, but the patent application, filed in June of 1997, alleges a connection between measles, inflammatory bowel disease, and autism; an earlier application, filed as early as March 1996, alleges the same connection between the three.

Ultimately, it was also discovered that, in pursuing his study, Wakefield had performed unnecessary and invasive medical procedures on the children without proper authorization by an IRB (Institutional Review Board).

As a result of his actions, Wakefield is no longer licensed to practice medicine in the UK, and he has never held a license to practice in the US. But yes, he has a celebrity following in the US, including Jenny McCarthy.

The claims that vaccines containing thimerosal (a preservative used in cosmetics beginning in the 1930s) are linked to the rise in autism have been equally unsupported.

On the heels of public concern sparked in 1999, thimerosal was removed from vaccines in 2001. Nevertheless, the incidence of autism has not declined since 2001.

Wakefield claims it's all a huge conspiracy by Big Pharma (no surprise there). But although I'm not the "oh-just-stick-out-your-arm-and-trust-the-government" type, I can't say I'll ever put much faith in anything that guy has to say, given the findings.

And smallpox has been back in the news this week. According to a November 13th article in The Los Angeles Times, the Obama administration recently awarded a $443 million no-bid contract for a smallpox drug that is, by many accounts, untestable and potentially ineffective.

As reported in the LA Times this week, Senator Claire McCaskill (D-MO) is questioning the awarding of the contract to Siga Technologies, Inc.

As the article indicates, the controlling shareholder of Siga Technologies is Ronald O. Perelman, a major donor to the Democratic Party.

ST-246, as the drug is known, is not actually a vaccine for smallpox, per se. It is an antiviral medication designed to be administered along with the smallpox vaccine, for individuals who have compromised immune systems or who are at risk of developing "adverse reactions" as a result of receiving the smallpox vaccine. For individuals who may already have been exposed to smallpox, it may inhibit the growth of the virus, if it is administered within four days of exposure, and it may serve as a form of treatment for individuals who have become symptomatic.

If you're wondering why we can't simply cure smallpox, consider this fact: the AIDS virus has approximately 10 genes.

Smallpox has approximately 200.

Whatever they come up with, it had better damn well work or we'll all be wanting more than our money back.

Thursday, November 24, 2011

A Different Kind of Thankfulness

We wake, if we ever wake at all, to mystery, rumors of death, beauty, violence. ... "Seem like we're just set down here," a woman said to me recently, "and don't nobody know why."
--Annie Dillard, Pilgrim at Tinker Creek
I've been thinking a lot today about how last Thanksgiving--and last Christmas--were the last holidays I spent with my little friend and godson Ezra. 

I think we all knew it at the time, but it just isn't something you can allow yourself to think about, that the person you're sitting with will die before another year is over.

Actually, it isn't something you can think about, because it just seems so inconceivable, when it's someone you love, and when it's a child, the incomprehensible is compounded.

I remember when I went to visit last Thanksgiving, Ezra wanted to show me his rock collection and his coin collection. We sat and looked over them all, and he talked about where he had gotten each one and why he liked it.

One day, he hadn't been feeling very well all day, so he was lying down most of the time. Everyone else went to a Cub Scout meeting, but I stayed at the house so Ezra didn't have to go.

All of a sudden, he popped out of his bedroom wrapped in a blanket and said, "Missy, I'm going to make you my famous cheese tacos. They're the best. Sam and Circe always want me to make them. You're gonna love them."

When he didn't feel well, he'd say, "Missy, what's the funniest thing your kitty has ever done? Tell me stories about your kitty." And I could always make him laugh.

I read The Chronicles of Narnia to him over and over: that had been my present to him one year, the complete set of all of the books.

When he was sick, he'd say in a quiet little voice, "Missy, can you read Narnia?" I read myself hoarse more than once last year.

The day before he died, I went to the hospital to see him, and I tried to read to him from Narnia again, the way I always had, but I just couldn't do it.

That was, quite simply, the worst day of my life. It just didn't seem fair... well, really, it wasn't fair, actually, that a little boy would have to suffer like that and that he would know he was dying and all of us, so much older, would grow still older without him.

I think about thankfulness very differently now than I did five years ago. I'm thankful for all of the little moments that I had with all of the people I care about, and that I didn't ever squander them.

I'm thankful that I've loved the people I've loved, and that I've always let them know it, and that if it seemed like a petty problem or quarrel would come between us, I always tried to work things out so that it wouldn't.

Even when things didn't work out, I'm thankful that I was strong enough to be able to do what I did and try what I tried. I'm thankful that when it was possible, I've erred on the side of kindness, and when it wasn't possible, that I never resorted to cruelty to save my own ego.

I'm thankful that, at 43, I have no regrets, and that I can be who I am and how I am and find a way to be comfortable in the space that I'm in, even when that (mental, physical, emotional) space isn't what I would have ever wanted or chosen for myself.

When I face a problem in my life now, I ask my dad and my mom and Ezra what they think I should do, and no sooner are the words out of my mouth than I know exactly what I need to do. When people insult me or are unkind, it doesn't really matter anymore, not the way it did a year ago. Those things simply don't hurt anymore, because I've experienced far more complicated forms of pain.

I'm thankful I know what's important in my life and in the way that I live it.

There's a passage from The Chronicles of Narnia that I read to Ezra more than once. It's from The Horse and His Boy. After escaping to what he thinks will be safety, the boy Shasta learns that he needs to ride on and warn the others that an army is coming.

I thought about these sentences a lot this year.
“Shasta’s heart fainted at these words for he felt he had no strength left. And he writhed inside at what seemed the cruelty and unfairness of the demand. He had not yet learned that if you do one good deed your reward usually is to be set to do another and harder and better one.”

Wednesday, November 23, 2011

Extinction Rider Roll Call (Dicks Amendment)

Ordinary Instants & Everyday Accidents

My conference paper proposal was accepted, so I'll be presenting at Brown in the spring. (Yay!)

I thought I'd blog about my (still very rough) ideas a bit, since my proposal basically builds on and merges ideas from previous posts.

In The Year of Magical Thinking (2006), Joan Didion begins her memoir of a difficult and devastating year of her life by recording the cryptic comments that she typed in the days after her husband's sudden death from a heart attack:
Life changes fast. Life changes in the instant. You sit down to dinner and life as you know it ends. The question of self-pity. (3, italics in original)
In composing this sentence, Didion acknowledges that she decided to leave out the phrase "the ordinary instant" to describe this life-changing moment, opting instead for "the instant," plain and simple.

There would be no forgetting it, she realizes, since
[i]t was in fact the ordinary nature of everything preceding the event that prevented me from truly believing it had happened, absorbing it, incorporating it, getting past it... confronted with sudden disaster we all focus on how unremarkable the circumstances were in which the unthinkable occurred. (4)
Similarly, in his memoir, Moving Violations (1995), as he reflects on the car accident that, at age 19, left him paralyzed from the chest down, journalist John Hockenberry notes that
[u]ntil the car accident that day in 1976 I understood the world only as an evolving landscape of clockwork challenges and gradual change... The upheavals of radical change and quantum unpredictability were taught to me as aberrations, deviations from the essential orderliness of the system, failures. (24)
"Sudden disaster," "the unthinkable" and "radical change" are supposed to be "aberrations," events out of the ordinary. In effect, the events of their lives leave both Didion and Hockenberry to wonder, how could the accidental be so unexpected? How could something so life-changing not herald its own arrival?

Why weren't they warned, somehow?

For his part, Hockenberry will assert that ultimately, "[i]t is a gift to learn the fabric of unpredictability" (24). In the years following his accident, Hockenberry comes to adopt what he identifies as a "quantum view of disability":
The quantum theory of disability allows you to dare to think that you can have lived two lives, two bodies occupying two places at once. Suddenly, in an instant, radical change: I was different, yet I was still the same person. I knew that was possible then. It would take a lifetime to be sure (25)
For Hockenberry, this quantum perspective assumes that "[t]he capacity to wonder is the gift itself," and that it is out of this gift of wonder that meaning can be both imagined and lived (25).

In fact, Hockenberry will assert, "[i]t is all we really had, even during all of those moments of human history when we thought we knew everything" (25).

What I want to do is examine Hockenberry's notion of the quantum view of disability as a way of making sense of the relationship between the accidental ("the upheavals of radical change," "sudden disaster" and unpredictability) and human identity.

What is the relationship between the "ordinary instants" that radically and unpredictably change our lives and our sense of ourselves as meaningfully embodied individuals?

In particular, I want to tease out this question by contrasting Hockenberry's and Didion's perspectives on the ordinariness of the accidental with its representation in a very different text of narrative trauma, Varlam Shalamov's Kolyma Tales.

Written in the 1950s, Shalamov's short stories and sketches detail the more than 15 years that he spent in various Soviet labor camps in and around Kolyma and Magadan. Not exclusively autobiographical, the Kolyma tales blend fiction and non-fiction in order to bear witness to one man's experience of gruesome historical facts.

In so doing, Shalamov's narratives question the role that the accidental plays in the making of meaning--both human and narrative.

In many of the Kolyma stories, otherwise neutral accounts of seemingly trivial details of prison life (many of which are themselves inherently shocking) are typically related with an offhanded indifference and nonchalance, only to be suddenly and brutally interrupted by an incident of senseless violence--or unforeseen luck.

And then the indifference, the nonchalance and the neutrality resume, as if nothing has happened. Or else the story simply ends.

In the world of the camps, there is no predicting how or whether one will survive from one day (or hour) to the next. Plans for survival can be constructed and, in an instant--an "ordinary" instant of camp life--they can be just as abruptly destroyed.

Sometimes, this is a good thing. In Kolyma, the accidental can lead to life-threatening injury and death, or it can guarantee another day's survival.

I'm interested in what happens to "the gift of wonder" and the "quantum theory of disability" that Hockenberry posits as a component of human identity.

Does it survive in the "ordinary instants" of Shalamov's Kolyma? Does it mean something very different in the unpredictability that his sketches detail?

What are the implications of "the ordinary" and "the accidental" for narrative representation? How do we tell the story of what such accidents "mean" in the construction of human identity?

Monday, November 21, 2011


بنی آدم اعضای یک پیکرند
که در آفرينش ز یک گوهرند
چو عضوى به درد آورد روزگار
دگر عضو ها را نماند قرار
تو کز محنت دیگران بی غمی
نشاید که نامت نهند آدمی

This weekend was a perfect one: I accomplished more than I could have imagined doing, including all kinds of wonderful autumn activities like apple-picking (very wonderful), stacking firewood (somewhat less wonderful), planting close to 100 tulip bulbs (wonderful, but deferred until spring) and cooking, cooking, cooking (very, very, very wonderful).

All this, and still time for love and laughter with people I care about. What more can you ask of life, really?

So needless to say, I planned to blog in a state of relative bliss last night, but wouldn't you know it, I knocked my wine glass clear across the living room floor (luckily, I don't have carpet), so that required a major cleanup.

It actually spilled into my knitting basket prior to hitting the floor. Not good.

But I was still determined to blog, so I logged in and began an extended battle with HTML that I eventually won (of course), but by that point, I had a headache (of course) and no longer felt like blogging or even remembered what I had originally wanted to blog about.

Imagine my surprise when my HTML-induced headache wasn't gone this morning. I spent the day in a bit of a muddy head-fog, but I think I'm better now.

Still, I feel I should alert my gentle readers that, if my prose seems less crisp and clean than it usually does, it is, and that's why.

Anyway, when I wasn't being Miss Autumn-in-New-England, I was reading the Persian poet Sa'di, in preparation for the course on Central Eurasian literature that I'll be teaching in the spring.

Sa'di was a 13th century poet and writer who is best known for The Gulistan (The Rose Garden), written in 1258.

Sa'di traveled extensively for approximately thirty years of his life, a time that included many widespread changes in the Middle East and Central Eurasia, including the Mongols' sacking of the city of Baghdad in 1258 (the same year in which he wrote The Gulistan).

The Gulistan is a collection of short tales and pieces of advice--it's kind of like Machiavelli's The Prince, but with a soul. Although much of the advice involves rulers and leadership, there are also sections devoted to general life lessons, guidelines for interpersonal relationships, and exhortations of spirituality.

For example, Maxim #54 in the section "Rules for Conduct in Life" notes,
The Imam Murshid Muhammad Ghazali, upon whom be the mercy of God, having been asked in what manner he had attained such a degree of knowledge, replied, "By not being ashamed to ask about things I did not know."
The conclusion Sa'di draws: "Ask what you know not; for the trouble of asking/ Will indicate to you the way to the dignity of knowledge."

My favorite piece of advice: "Either make no friends with elephant-keepers/ Or build a house suitable for elephants."

I mean, really. If you think about it, that pretty much says it all, doesn't it?

The quotation at the start of my entry tonight is Sa'di's most famous poem, "Bani Adam." It is inscribed on the entrance to the United Nations and, roughly translated, it reads,

The children of Adam are each others' limbs,
Created from one essence.
When the calamity of time affects one limb,
The others cannot rest.
If you have no sympathy for others' troubles,
You do not deserve to be called human.

Monday, November 14, 2011


I first read Twyla Tharp's The Creative Habit (2003) several years ago, and it's a book I like to return to every few years, to help me think about what creativity is, what it means, and how to sustain it.

Tharp firmly believes that inspiration--and creativity--are "habits," a conception that runs counter to the idea that artists are always transcendental geniuses whose ideas arise (spontaneously, miraculously) out of nowhere, fully formed and ready to astound us all.

On the contrary, Tharp argues. Artists work. They do so daily and consistently, and often without inspiration. But always they do so with a sense of purpose and always with a sense that, when they are working, they are doing precisely what they should be doing--in effect, what they feel they were "meant" to do.

I have always liked Tharp's book, because so many people conceive of art as something breezy and inspirational--picked up, put down, automatically executed with brilliance and ultimately reducible to its final products. Her argument is a healthy reminder that art is equally about process, and processes are often, if not inevitably, messy and riddled with false starts and mistakes.

And the results of those messy processes are not always even "good," let alone brilliant. It's not always an inspired undertaking. You have to produce all kinds of potentially bad creations if you want to increase your chances of creating something potentially passable.

I think of the creative process as a merger of the ephemeral (that which is fleeting) with the tangible (that which is time-bound and concrete). Tharp offers the contrast between zoe and bios--between indiscriminate life and life in its lived details. Different artists are drawn to different perspectives and the great artists are the rare few who can capture both.

But I wonder whether artistic creation, when viewed from the perspective of the artist her- or himself, is always a constant adjustment of focus, a tweaking of the details and the indiscriminate--zooming in on bios only to then pan back on zoe--in an ongoing effort to communicate what would otherwise exist only in the artist's soul and psyche.

Artists work constantly and ceaselessly to give us the gift that they envision.

I think it's easy to declare oneself an artist, a gift-giver. It's much more difficult to live as one. I'm not referring simply to the lack of an income or the need to take an unrelated job and carve out time for one's work, to "suffer" for one's creation.

I'm thinking of the discipline required in coming up with an idea when you simply don't have one--or any, really. Or the discipline required to acknowledge that the project you've worked on so lovingly and laboriously is...well, crap.

And conversely, the willingness to realize that there is no perfection, and that the repeated decision to declare everything one produces "crap" and refuse to offer it to the world's gaze is not a sign of artistic discipline; it's the symptom of a lack of resolve and a failure of courage.

The French intellectual and polymath Henri Poincaré used his innate curiosity about--and eventual expertise in--a wide range of fields and interests to generate a vast body of scientific work without frittering away his time on dead-ends or distractions.

In his journals, Poincaré would describe arriving at an impasse in a particular project or mathematical problem. Instead of doing what most of us would do, however, and continuing to beat his head against the wall and demand a solution, Poincaré would move on to another project, often something completely unrelated or occasionally a project that had reached a similar impasse at an earlier point in time.

Poincaré describes how, having set the problematic project aside, he would suddenly arrive at--perhaps not a "solution," per se, but an idea that would advance his thinking. Often, he claims, this would happen while getting into a carriage or engaging in some otherwise mundane activity unrelated to his intellectual endeavors.

He possessed the ability to live with his stumbling blocks and stopping points, instead of becoming consumed by them or prematurely branding them as "failures." In effect, Poincaré's analysis of his own thinking process suggests that, while he was focused on something else, his brain was slowly but steadily continuing to work on the problem, somewhere just below the threshold of conscious activity.

The shift in perspective would eventually offer the insight he needed to rethink the dilemma. This ability to shift focus, frame and attitude is the hallmark of a productive and agile mind. I remember reading about how the teen tennis phenom Martina Hingis used to love to play sports other than tennis and how, contrary to the advice of most tennis pros, she refused to spend vast amounts of time simply playing tennis.

Instead, she felt that her time spent playing soccer taught her new things about her footwork on the tennis court. Likewise, her love of horseback riding taught her different ways of thinking about the landscape in front of her and adjusting to changes in perspective and momentum.

It was a kind of self-imposed regimen of neuroplasticity. I think this is the essence of any kind of mastery--whether physical or intellectual: an ongoing and innate willingness to render one's perspective simultaneously supple and attentive.

To see the bios in zoe, and then ... to look again, both elsewhere and otherwise.

Wednesday, November 9, 2011

"Troubling Confessions"

I'm teaching Dostoevsky this semester (and next semester as well), and as anyone familiar with his novels knows, confessions play a huge role in shaping characters and their plots--particularly when the characters in question are murderers, pedophiles and other unsavory types.

One of the most interesting books that I've read about confession as a legal and spiritual phenomenon is Peter Brooks' Troubling Confessions: Speaking Guilt in Law and Literature (2001).  Brooks melds an analysis of Supreme Court cases and their social and political implications with his own specialty--literary analysis--to offer a thoughtful and nuanced discussion of what "confession" is, how we understand it, what we expect from those who confess, and why we consider it "good for the soul" and essential for the moral and social rehabilitation of guilt.

In particular, Brooks argues that "our social and cultural attitudes toward confession suffer from uncertainties and ambivalences" and that as a result, "confession is a difficult and slippery notion to deal with" (3). 

In short, he argues, "We want confessions, yet we are suspicious of them" (3).

And, as he points out, we should be.  Perhaps one of Brooks' most interesting points of discussion revolves around people who confess to things that, as it turns out, they didn't actually do.  While most of us think, "Who on earth would be so stupid?", the fact is, most of us are entirely unaware of how easy it is to pressure another person into accepting an admission of guilt.

This pressure is exacerbated in contemporary American culture where, as Brooks points out, we "appear ... to live in a generalized demand for transparency that entails a kind of tyranny of the requirement to confess" (4).

We want to hear people "say the words."  Until they do, no amount of proof will satisfy us emotionally--only the confession can seal the psychological deal.

Except that legal history is rife with cases of false confessions or coerced confessions or "insincere" confessions.  So what does that say about this phenomenon upon which we rely so heavily?

Brooks cites several interesting examples.  Consider, for instance, the case of the Boorn brothers, accused of murdering their neighbor in Manchester, VT, in 1819.  Although they asserted their innocence throughout their trial, they ultimately confessed to the crime while awaiting execution (8).

Turns out, the neighbor wasn't even dead.  He'd simply moved to Schenectady.

Brooks also discusses the Supreme Court case Brewer v. Williams.  At issue was the extent to which police can use questioning and/or psychological "coercion" to obtain a confession.

In this case, the police are escorting a suspect named Williams to Des Moines, Iowa.  Williams is an escaped mental patient who is believed to have murdered a nine-year-old girl.  The girl's body has not been found.

The detective on the case has agreed to the stipulation of Williams' lawyer in Des Moines: Williams is not to be questioned on the drive from Davenport to Des Moines. 

What the detective does, however, challenges the definition of police "questioning": instead of interrogating Williams, he offers him food for thought, in the form of the "Christian Burial Speech."  Addressing Williams as a man of faith, he "proceeds to evoke the weather conditions, the forecast of several inches of snow, the likelihood that the young girl's body will be buried and unlocatable" (26).

He urges Williams to consider the fact that he alone can ensure that the girl's parents can give her "a Christian burial."  Williams ultimately leads police to various pieces of evidence, and finally to the girl's body itself.

Is this a legally obtained confession?  Williams committed the murder, so that is not at issue: the question is whether his confession is invalid because the detective used improper means to obtain it.  Did he conduct what was, in effect, a kind of "interrogation," even though he never asked a single question?

In the case of Brewer v. Williams, the Supreme Court decides, in a 5-4 split decision, that the confession is invalid (27).  The Court's decision asserts that the detective "deliberately and designedly set out to elicit information from Williams just as surely as--and perhaps more effectively than--if he had formally interrogated him" (27).

The effectiveness of the detective's line of (non-)questioning stems from the fact that, as Brooks acknowledges, confession plays an equally ambiguous role in its relationship to the person doing the confessing.  At times, the compulsion to confess one's guilt can become overwhelming--so overwhelming, the case of the Boorn brothers suggests, that it can override (or overwrite) the confessed act itself.

I didn't do "it," perhaps, but I did something, I must have done something, so...

This is the slippery slope of confession.  It is particularly interesting and complicated in the story of the visual artist Alan Bridge, better known, perhaps, as "Mr. Apology."  As Brooks outlines, in 1980, Bridge posted flyers around Manhattan inviting people to call a phone line and apologize, anonymously, for their wrongdoings.

Bridge's goal was an exhibit at the New Museum in 1981.  The result of the apology hotline, however, was more than simply a conceptual art project.  Bridge amassed over 1000 hours of confessions on his answering machine and continued to maintain the phone line even after the project was completed because people seemed to "need" it.

In the end, this question of who "needs" to confess--not to mention the questions of how, to what and why, exactly--has implications for all of us guilty innocents out there.

Sunday, November 6, 2011

Friedman's Capitalism and Freedom

A good week and a productive one: I finished the two paper proposals and sent them on their way, which meant that I finally had a chance to read Milton Friedman's Capitalism and Freedom--something I've been meaning to do since July, actually.

I'm not an economist, obviously, so my observations are just that: my observations.  I think Friedman's ideas are interesting, actually, but I also have some issues with them, so I'm going to be true to my moniker and write my impressions as a thinker who is still thinking about Friedman's philosophies and arguments.

I was not happy with the first chapter of Capitalism and Freedom.  Actually, I came dangerously close to having a conniption, but luckily, things settled down in the second and following chapters. 

I was left wondering whether a lot of people simply read the first chapter of Capitalism and Freedom and stop there, because I'm pretty sure I've heard some of that chapter-one malarkey out there, and I don't think it's really representative of Friedman's overall ideas and argument. 

I think this speaks to both the appeal and the danger of Friedman's style.  On the one hand, his ideas are very accessible and he expresses them clearly and forcefully.  I like that.  He clearly agrees with Einstein's adage, "If you can't explain it simply, you don't understand it well enough."

On the other hand, there's a danger in Friedman's style and approach.  It seems to me that, at times, he simplifies complex issues, not to make them understandable, but in order to create a very specific rhetorical effect and promote a very specific political agenda. 

He has clear biases (as we all do), but I think that his style makes it easy to miss the inconsistencies in his more propagandistic (if that's a real word?) statements.

An example.  He claims that "intellectuals" tend to be biased against economic freedom.  He states, "They tend to express contempt for what they regard as material aspects of life, and to regard their own pursuit of allegedly higher values as on a different plane of significance and as deserving of special attention" (8).

Come again?  I'll admit, this kind of statement is always going to tick me off, because I think that historically, Americans have often been anti-intellectual (and unfortunately, a lot of intellectuals have earned this negative reputation, obviously) to an extent that isn't present in a lot of other nations and cultures. 

I don't think tapping into that negative preconception serves any real purpose, though, and I object to it being used by someone who was a PROFESSOR at The University of Chicago... for 30 years.  After he studied at Rutgers and Columbia.  And he won the Nobel Prize in Economics too, for heaven's sake.

Friedman is an intellectual (and an academic), in my book.  Big time.  So he has no cause to be invoking derogatory labels about "intellectuals" and acting like he's not part of the academic system that clearly rewarded him.

In general, this was my primary complaint with the first chapter of Capitalism and Freedom: it made sweeping proclamations that paid no attention to historical nuance and at times, I felt it played fast-and-loose with facts in service of a larger political rhetoric. 

At times, I felt that, in order to make his political arguments, Friedman banked (sorry, can't resist the pun) on the fact that most people won't know he's broadly overstating or oversimplifying historical realities.

For example, I know of no Classics scholar out there who would blithely suggest that Ancient Greece was a capitalist society and that this explains the extensive political freedoms that Greek citizens enjoyed.  Such an assertion overlooks the fact that Ancient Greece comprised a variety of very different city-states (think "Athens" vs. "Sparta"), each organized around a very different political framework. 

It also overlooks the fact that the question of whether these different Ancient Greek civilizations participated in a market economy (as we understand the concept) has been extensively debated for years--by Classical historians and economists alike.

So this is my main gripe with Friedman's overall approach.  I don't think it's acceptable to oversimplify a complex historical situation simply to make the rhetorical argument that you want to make for a general audience. 

It seems to me that this is using your intellectual knowledge to take advantage of your audience.  They won't know what they don't know, but you do, and you're using their potential lack of information for your own benefit.

All that said, I find the rest of Capitalism and Freedom interesting.  Friedman seems a bit idealistic about market economies in my opinion: he repeatedly identifies the market as a system of "voluntary cooperation" between the parties to an exchange. 

This would be nice, but I'm just not sure it's ever been the reality.  Friedman does address one of the main objections to his argument--namely, the formation of monopolies--but I think he tends to downplay the effects of monopolies on consumers. 

As I've said, I'm no economist, but my sense is that the goal of capitalism is to "corner the market"--to generate demand and engage in business practices that ultimately ensure that you are the primary (if not the sole) supplier of goods and services so that you can make a honking big profit at the end of the day.

As I read Friedman's philosophy, I kept trying to think of a single time when I've heard a competitive capitalist say, "Gosh, I'm just so glad I have so many darn competitors in my line of business.  It means that there is just that much more chance for all kinds of voluntary cooperation among buyers and sellers.  That's really what it's all about, after all."

Friedman also seems to believe that ultimately, a capitalist market can and will stabilize itself, if left to its own devices.  Although he's often caught in the philosophical dragnet of people who want to abolish the Fed and return to the gold standard, in fact, Friedman advocates neither of these courses of action in Capitalism and Freedom.

He's not a fan of the Federal Reserve, but his concerns center primarily on the ways in which, in the first half of the 20th century, the Federal Reserve made what were, in his opinion, bad decisions or failed to act when it should have in order to avert financial crises.  It either did nothing or, when it did do something, it did the wrong thing. 

He doesn't suggest abolishing the Fed, though, just that it needs to be managed more effectively and that we should perhaps reconceptualize its role in the economy (a role that, in his opinion, should be minimal).

His chapter on fiscal policy in particular is quite interesting.  He objects to the use of the federal budget as a kind of "balance wheel" designed to offset a decline in private expenditures, and insists that, "[f]ar from being a balance wheel offsetting other forces making for fluctuations, the federal budget has if anything been itself a major source of disturbance and instability" (77).

So I find that, although I don't always like his tactics, many of Friedman's claims offer substantial food for thought.  For instance, he points out that, "In fiscal policy as in monetary policy, all political considerations aside, we simply do not know enough to be able to use deliberate changes in taxation or expenditures as a sensitive stabilizing mechanism" (78).

Seems to me like somebody ought to put that on a big old posterboard and wave it in front of everyone currently in power in Washington.