I've once again been MIA for over a week and, once again, the days have passed in the blink of an eye.
Which is somehow fitting, because I've been reading Malcolm Gladwell's Blink: The Power of Thinking Without Thinking (2007). Although Gladwell has come under fire for his work--in particular, for the fact that his information and narrative style are often organized around anecdotes rather than concrete data--I did enjoy Blink and I think it offers some interesting points about the way human thinking often works (or doesn't work).
Blink became enormously popular because, for many, Gladwell seemed to be giving us all kinds of permission to stop over-thinking everything and simply "go with our gut" to make decisions that intuitively "feel right."
And, yes, on some level, Blink explores the potential benefits of doing just this.
Except that Blink is also about the very real dangers of doing just this. Written seven years ago, Gladwell's book is particularly timely right now, because it also reflects on what happens and how things can go terribly, terribly wrong when people make snap decisions based on unexamined impressions.
As the Bronx police officers did in the shooting death of the unarmed Amadou Diallo in February 1999, an incident that Blink examines in some depth.
Gladwell probes the information offered by ongoing studies in stereotyping, one of the most insidious forms of "blink" thinking. When we stereotype, we allow the unexamined, illogical gut reactions that have been insidiously (or not-so-insidiously) instilled in us by our surrounding environments and social contexts to determine our behavior. And we do this even though these gut reactions in fact contradict what we consciously want to believe (or even believe that we believe) about others.
We act on impulses or ideas we didn't realize we had and that we might never consciously choose to act on, if we knew we had them.
Gladwell points out that, contrary to popular belief, extreme stress only enhances quick thinking and decision-making abilities if a person's heart rate falls within a very specific range--between 114 and 145 beats per minute.
In this range, individuals seemed prone to experience the sensation that "everything was moving in slow motion," and they were able to process events and react with keen acuity. This is the stuff on which action-adventure films capitalize (think of the many scenes in "The Matrix," "Crouching Tiger, Hidden Dragon," "Kill Bill," etc.).
Beyond 145 beats per minute, however, all bets are off: complex motor skills begin to be impaired and vision becomes restricted. Aggression mounts to the point of becoming disproportionate to the situation. In these cases, the reaction is not keen or intuitive--it's simply an insane reaction to an insane amount of stress.
At a heart rate of 175 beats per minute, stressed-out individuals will often void their bowels. In such situations, the body is under such duress that it physiologically overrides the more reasoned "you-need-to-wait-until-you-can-find-a-bathroom" impulse ingrained in us through potty-training.
I don't think you'll get Uma Thurman or Angelina Jolie to do their own stunts in that kind of action-adventure scene.
If a person's heart rate is greater than 175, s/he is little more than a stressed-out loose cannon. His/her perceptions of reality are no longer reliable. All cognitive processing has stopped and the primitive emotional centers of the brain have taken over.
As Dave Grossman, former army lieutenant colonel and author of On Killing (2009) has argued, the person has become the human equivalent of an angry dog--they can't be reasoned with.
Under these conditions, a person may in fact do things that s/he wouldn't otherwise do and s/he may also be unable to do things that, to us, seem quite simple--this is why, in emergencies, many people dial 411 instead of 911.
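The thresholds Gladwell describes (drawing on Grossman's research) amount to a simple lookup. As an illustrative sketch only--the function name and category labels are my own, not Gladwell's or Grossman's, and the boundaries are the approximate figures cited above, not a clinical model:

```python
def stress_response(heart_rate_bpm: int) -> str:
    """Map a stress-elevated heart rate to the rough cognitive state
    described in Blink. Thresholds (114, 145, 175 bpm) come from the
    book; the categories are hypothetical labels for illustration."""
    if heart_rate_bpm < 114:
        return "baseline: ordinary processing, no stress enhancement"
    elif heart_rate_bpm <= 145:
        return "optimal zone: heightened acuity, 'slow motion' perception"
    elif heart_rate_bpm <= 175:
        return "degraded: complex motor skills impaired, vision restricted"
    else:
        return "breakdown: cognitive processing stops, primitive emotional centers take over"

# Example: an officer in the optimal zone vs. one past the breaking point
print(stress_response(130))
print(stress_response(180))
```

The point the sketch makes concrete is how narrow the "optimal" window is: a swing of a few dozen beats per minute separates keen intuition from a loose cannon.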
This may also explain why, in cases of police brutality, incidents of racist behavior and stereotyping are both painfully obvious and preternaturally extreme. Stereotypes are rigid systems of thought that are ingrained in us in ways that operate below the level of consciousness. (Ironically, our brains seem to subconsciously store stereotypes in the same basic category as the impulse to spontaneously void our bowels.)
When it starts hitting the fan, stereotypes all-too-easily become the (subconscious) default settings that guide actions and perceptions. This becomes particularly likely if events steadily escalate to such an extreme (think: "life or death") that the people involved are no longer processing reality correctly.
If you're thinking, "Well, luckily, this can't happen to me because I don't have any ingrained stereotypes or implicit, intuitive thoughts of this nature," you need to check out Project Implicit.
Project Implicit is a study designed to measure our implicit (think: split-second) social attitudes--those elements of our reactions that we can't control and that we may not even realize we have. It measures the way in which our intuitive responses have been shaped--without our realizing it--by the social contexts in which we live and by which we've been systematically conditioned.
I can almost promise that you won't like what you find out about yourself if you take one of Project Implicit's 5-minute tests. (They actually have a disclaimer warning people that they may not be happy with their test results: you have to explicitly agree that you are willing to receive results that may conflict with your deeply-held beliefs about who you are and how you think, feel, and react to people of different races, sexualities, genders, and abilities.)
As Gladwell points out, studies have shown that the way to counteract incidents of police brutality is not simply through community outreach programs or sensitivity training, but through the better enforcement of proper police training techniques, on the ground and in real-time.
The only way to avoid the bad, split-second decisions that result from the distorted and inaccurate perceptions brought on by stress-reactions is to try to slow down the unfolding of the event itself.
Police procedures are designed to do exactly this. But if officers are less than diligent in adhering to those procedures when approaching what might be an unfolding crime scene--before a suspect is even encountered--they may miss opportunities that would give them the additional time and space needed to react appropriately and rationally.
Police officers have to respond quickly: their ability to do their job relies on immediate, accurate, and appropriate responses to highly stressful situations. No one questions the inherent difficulty of doing this.
At the same time, however, police officers must walk a fine line. If they respond too quickly and/or succumb to the inherent stresses of the situation itself, they will simply not be able to do their job appropriately.
It's easy to sit on the sidelines and judge or speculate about what we ourselves "would do" when faced with such situations, but the fact of the matter is, in situations of severe stress, when individuals believe that their very lives are in jeopardy, they aren't themselves.
In such moments, people make snap decisions that lead them to do things that they--and others around them--have a hard time believing that they are capable of.
In the wake of the recent grand jury decisions not to indict, my guess--and this is only a guess--is that, when called upon to testify about incidents of police brutality, the police officers involved may offer testimony that is very emotionally compelling and oddly... persuasive... to those who are sitting in the jury box, listening to it. It's highly probable that the officers involved really do believe that they saw what they saw, and that they really did think that they had to do what they did--because they weren't "thinking" in the way that you and I think of "thinking."
I suspect that, on a very basic level, asking grand juries to decide police brutality cases is a recipe for legal disaster. Calm, quiet people in an unstressed situation are being asked to weigh evidence and sit in judgment on the rights and wrongs of highly emotional circumstances that involve stereotyping, high-stress reactivity and scenes of significant social disparity--elements that in turn culminate in violence and death.
I think that, rather than try to solve the problems that are clearly occurring when grand juries are asked to decide police brutality cases, a better way to protect everyone involved--both the police and the public--might be to use what we already know to be true to better train police and inform the public.
For example, psychologists and police officials already know that high-speed chases--or chases of any sort, actually--get out of control very quickly. The heart rate of everyone involved becomes extremely elevated, perceptions become skewed, and the ability to accurately process who is doing what and when (to say nothing of why) is severely compromised.
The key seems to be to do everything possible to avoid forcing--or allowing--split-second decisions to be made under such circumstances. Because, more often than not, those decisions will be seriously flawed and based on misperceptions.
A lifetime of consequences can result from a bad decision made in the blink of an eye. Individual actions and reactions can--and do--have large-scale social ramifications, as we have all witnessed in the events of recent months.
Another option is to better train people in ways that can help inoculate them against the debilitating effects of stress. If you systematically accustom a person to physiological stress, you can lower their resulting heart rate and potentially keep them within that narrow window of opportunity in which split-second decisions aren't inherently misguided and disastrous. (The key word being "potentially.")
Ultimately, I think we are best served by remembering one of the key points that Gladwell makes: split-second decisions are precisely that. Split-second.
Standing on the outside and reviewing events that have already occurred, we tend to think of them as unfolding across a much longer expanse of time. It always "feels" as if all of the parties involved had enough time to make better decisions, and we tend to assume that they naturally had the presence of mind--that they were still enough of "themselves," in short--to do what was obviously sensible and appropriate.
We know what they "should" have done, we think of this as unfolding in an indeterminate expanse of time, and we can't imagine why they did otherwise.
And sometimes, yes, events do seem to unfold in just this way. For instance, Gladwell cites the story of a police officer who confronted a 14-year-old gang member in flight who, in fact, did have a gun and who was reaching for it at the moment when the officer confronted him.
The police officer claims that he didn't shoot because, throughout the incident, he remained instinctively aware of two things: the suspect's age ("He was fourteen, looked like he was nine"), and the fact that, as the arresting officer, he had time--as he put it, "something in his mind" told him that he "didn't have to shoot yet."
The police officer instinctively "felt" that he had time to give the boy (and note: he never lost sight of the fact that the person confronting him was, in fact, a child) the benefit of the doubt, and that was all it took to change the outcome.
As Gladwell points out, the attempted assassination of Ronald Reagan in 1981 was an incident that lasted, in its entirety, a total of 1.8 seconds. The attempted assassination of the president of South Korea (an event that resulted in the deaths of the president's wife and an 8-year-old boy) lasted all of 3.5 seconds.
The incident involving the police officer described above was similarly brief, and yet it resulted in a very different outcome.
Gladwell's Blink also looks at the potential benefits of intuitive decision making, of course, and I find his ideas and arguments on this front equally interesting and compelling.
Anyone who works in academia knows that, at times, the influx of research and ideas and acronyms and "information" and statistics and who knows what-all can get positively overwhelming.
Research any paper or idea long enough and, by the time you're through, you won't know what it is that you want to argue. There's always more reading to be done, but at some point, more reading isn't going to clarify anyone's thinking.
You're simply going to be buried in information that will ultimately impede rather than enhance your decision-making abilities. In those instances, it's a case of becoming fundamentally unable to see the forest for the trees.
Gladwell describes a particularly compelling instance of this when he chronicles the experiences of General Paul Van Riper in the Pentagon's Millennium Challenge war games in 2002. Van Riper led the "Red Team"--the putative enemy of the United States' Blue Team.
The Blue Team had all of the advantages of technology and research on their side. They devised strategy upon strategy, they created acronyms, they "knew" all about their enemy and his vulnerabilities. As Gladwell observes, "With the Millennium Challenge ... the Blue Team was given greater intellectual resources than perhaps any army in history."
Van Riper's Red Team, on the other hand, was to be led by a "virulently anti-American" "rogue military commander" who "had broken away from his government somewhere in the Persian Gulf." He was "harboring and sponsoring four different terrorist organizations" and he exhibited "a considerable power base from strong religious and ethnic loyalties" that "threaten[ed] to engulf the entire region in war."
Needless to say, the Blue Team felt it had the clear advantage. Until the Red Team sank 16 of their ships in a surprise attack.
Van Riper's explanation was simple: rather than running elaborate scenarios or conducting extensive research, he simply anticipated that the Blue Team would "adopt a strategy of preemption."
So he struck first. And in the blink of an eye, it was over.
This is the essence of Blink. Thought is good. Information is good. Training is necessary. And insight is indispensable.