
Blink: The Power of Thinking Without Thinking


In his book Blink: The Power of Thinking Without Thinking (2005), Malcolm Gladwell uses neurobiology and psychology to explore the human decision-making process.  Using a diverse series of case studies, Gladwell shows that the difference between good and bad decision-making has little to do with the amount of knowledge and information available and much more to do with what we do with a surprisingly small amount of detail.  One might call it a book about intuition, but Gladwell does not think “intuition” is the right word; in fact, it never appears in the book.  He says, “[i]ntuition strikes me as a concept we use to describe emotional reactions, gut feelings - thoughts and impressions that don’t seem entirely rational.  But I think that what goes on in those first two seconds is perfectly rational.”  Those first two seconds are explored fully over the course of Blink.  In a process Gladwell calls “thin-slicing,” humans almost instantaneously process a minimal amount of information and make “snap decisions” that can at times be remarkably sound and at others remarkably destructive.  Gladwell supports the complementary ideas that a little bit of information goes a long way in decision-making, while copious amounts of information can overwhelm and block our innate ability to decide.  He also writes extensively about the implicit associations society hardwires into us and how they affect our decisions; these form the basis of Blink’s social commentary, centered mostly on implicit bias around race and, to a lesser degree, gender.  The challenge Gladwell poses is to reliably find the right balance between conscious deliberation and instinctive judgment and to use that balance in our daily lives.

The book begins with an example of thin-slicing.  The J. Paul Getty Museum in California acquired a sculpture that was quickly “proved” authentic through a battery of tests devised by everyone from art historians to geologists.  However, a small handful of outside experts had a bad feeling about the sculpture as soon as they saw it, though they could not articulate how they knew something was wrong.  Sure enough, further investigation revealed that the statue was in fact a very convincing fake.  In a few short seconds, using only the most basic information, those few people saw through all the careful work of many experts.  Gladwell returns often to this example, both as a case of a little information going a long way and as a case of too much information hindering good decisions.

I found one example of thin-slicing particularly interesting.  In one set of studies, married couples from many different backgrounds were videotaped while discussing a minor point of contention in their relationships.  By the end of the study, the researchers could watch an hour of tape and predict, with 95% accuracy, whether the relationship would end in divorce.  When shown only fifteen minutes of tape, their success rate was still above 90%, and even with only three minutes of tape they could predict the outcome of the marriage with 75-80% accuracy.  It did not matter how long the couple had been together or how strong they thought their marriage was when the tapes were made; the outcomes were still predictable, based almost solely on the information carried by facial expression and tone of voice.

Gladwell spends a lot of time cautioning against acquiring too much information for decision-making.  He tells the story of Paul Van Riper, a retired Marine Corps general and Vietnam veteran who was asked to play the rogue military commander in Millennium Challenge, a military training exercise.  The Blue Team, standing in for the US military, was given access to very high-tech equipment and all the information and reconnaissance it could ever imagine having.  The Red Team, led by Van Riper, was given similar technology.  It was assumed that the Blue Team, with the help of its information network, would defeat the Red Team quickly and easily.  However, Van Riper chose to use little of that technology and instead ordered his team to make spontaneous, seemingly irrational decisions.  The Blue Team assumed the Red Team would act the way all of the Blue Team’s equipment predicted it would; locked into such a narrow system of operation, the Blue Team was beaten soundly.  Only after Van Riper was told to essentially follow a script could the Blue Team win, despite - or perhaps because of - all of its information and technology.

Gladwell also spends a lot of time on implicit associations and how they shape the way we think.  One study in particular was very disturbing to me.  Subjects were presented with words or pictures on a screen, and their task was to assign each word or picture to its proper category.  Their response times were measured and compared across two rounds.  In the first round, the categories were “European-American OR Good” and “African-American OR Bad.”  In the second round, the pairings were reversed: “African-American OR Good” and “European-American OR Bad.”  No matter who was taking the test, response times were slower in the second round than in the first.  This has little to do with the conscious opinions or beliefs of the test-taker and everything to do with the implicit biases society imparts, which can interrupt our rational thought processes.  Similarly, Gladwell recounts the story of a female musician auditioning for a professional orchestra.  For many years, orchestras were almost exclusively male, on the understanding that men were inherently more suited to being musicians than women.  Once orchestras began using blind auditions - in which the person auditioning is never seen, and the judges are given only a number and hear only the playing - the number of women in orchestras increased fivefold.  Gladwell cites this as an example of implicit bias interrupting rational decision-making.  In the Afterword, he argues that such implicit bias is rampant in the judicial system and that for justice to ever be completely fair, it must be literally blind.  He believes that people on trial should never be seen by their jury, and that all of their statements and responses to questions should have any indicators of gender, class, race, etc. removed before being presented to the jury, in order to eliminate bias.

I was struck very quickly by how accessible this book is.  I came to it with little to no background in neurobiology or psychology, yet I found each point to be well-crafted, well-presented, and very understandable.  As a scientist, I found myself wanting to know more about how our brains work the way they do, but such knowledge is not necessary to understand the book.  On the other hand, I never felt insulted by oversimplification, which I have often run into when reading books about unfamiliar topics.  I suppose it is Gladwell’s background as a journalist that allows him to achieve this effect, and I was very impressed by it.  Many of the examples in the book are interactive, using exercises similar to those in the case studies so that the reader can take part in them; I think this is a major reason the book is so accessible.  I would highly recommend this book to anyone who is interested in the decision-making process, or especially to anyone who wants a new perspective on how and why they think about things.

Comments


Not really

Van Riper's account of Millennium Challenge 2002 is largely nonsense: if you look into the BLUEFORCE commander's side of the story, you quickly discover that the amazing "victory" Van Riper pulled off was actually the result of a simulation error (the simulation moved the BLUE ship markers to where the physical ships were, right next to the coast, allowing for an ambush because the ships teleported into the middle of Van Riper's simulated forces). Another "trick" he pulled was to deploy chemical weapons in areas where paratroops were going to land, racking up a shedload of casualties because the exercise only had access to C-17s to drop paratroops for 18 hours, something that would never be an issue in real life. At a very basic level, Van Riper did not understand the difference between exploiting an operational flaw and exploiting a flaw in the simulation.

Plus most of his "innovative tactics" were outright cheating, such as claiming the US fleet wouldn't examine fishing boats, respawning his own forces multiple times, having his bicycle couriers deliver messages instantaneously, putting 5,100-pound Termit anti-ship missiles on 25-foot fibreglass speedboats (which were being disregarded for purposes of the simulation even though they wouldn't be in real life), etc. He was chucked off because he was threatening to ruin a multimillion dollar exercise.