“Some of these stories are closer to my own life than others are, but not one of them is as close as people seem to think.” Alice Munro, from the intro to The Moons of Jupiter

"Talent hits a target no one else can hit; genius hits a target no one else can see." Arthur Schopenhauer

“Why does everything you know, and everything you’ve learned, confirm you in what you believed before? Whereas in my case, what I grew up with, and what I thought I believed, is chipped away a little and a little, a fragment then a piece and then a piece more. With every month that passes, the corners are knocked off the certainties of this world: and the next world too. Show me where it says, in the Bible, ‘Purgatory.’ Show me where it says ‘relics, monks, nuns.’ Show me where it says ‘Pope.’” –Thomas Cromwell imagines asking Thomas More—Wolf Hall by Hilary Mantel

My favorite posts to get started: The Self-Righteousness Instinct, Sabbath Says, Encounters, Inc., and What Makes "Wolf Hall" so Great?.

Tuesday, June 26, 2012

Stories, Social Proof, & Our Two Selves


            You’ll quickly come up with a justification for denying it, but your response to a story is influenced far more by other people’s responses to it than by your moment-to-moment experience of reading or watching it. The impression that we either enjoy an experience or we don’t, that our enjoyment or disappointment emerges directly from the scenes, sensations, and emotions of the production itself, results from our cognitive blindness to several simultaneously active processes that go into our final verdict. We’re only ever aware of the output of the various algorithms, never the individual functions.

            None of us, for instance, directly experiences the operation of what psychologist and marketing expert Robert Cialdini calls social proof, but its effects on us are embarrassingly easy to measure. Even the way we experience pain depends largely on how we perceive others to be experiencing it. Subjects receiving mild shocks not only report them to be more painful when they witness others responding to them more dramatically, but they also show physiological signs of being in greater distress.

            Cialdini opens the chapter on social proof in his classic book Influence: Science and Practice by pointing to the bizarre practice of setting television comedies to laugh tracks. Most people you talk to will say canned laughter is annoying—and they’ll emphatically deny the mechanically fake chuckles and guffaws have any impact on how funny the jokes seem to them. The writers behind those jokes, for their part, probably aren’t happy about the implicit suggestion that their audiences need to be prompted to laugh at the proper times. So why do laugh tracks accompany so many shows? “What can it be about canned laughter that is so attractive to television executives?” Cialdini asks.


Why are these shrewd and tested people championing a practice that their potential watchers find disagreeable and their most creative talents find personally insulting? The answer is both simple and intriguing: They know what the research says. (98)
As with all the other “weapons of influence” Cialdini writes about in the book, social proof seems as obvious to people as it is dismissible. “I understand how it’s supposed to work,” we all proclaim, “but you’d have to be pretty stupid to fall for it.” And yet it still works—and it works on pretty much every last one of us. Cialdini goes on to discuss the finding that even suicide rates increase after a highly publicized story of someone killing themselves. The simple, inescapable reality is that when we see someone else doing something, we become much more likely to do it ourselves, whether it be writhing in genuine pain, laughing in genuine hilarity, or finding life genuinely intolerable.

            Another factor that complicates our responses to stories is that, unlike momentary shocks or the telling of jokes, they usually last long enough to place substantial demands on working memory. Movies last a couple hours. Novels can take weeks. What this means is that when we try to relate to someone else what we thought of a movie or a book, we’re relying on a remembered abstraction as opposed to a real-time recording of how much we enjoyed the experience. In his book Thinking, Fast and Slow, Daniel Kahneman suggests that our memories of experiences can diverge so much from our feelings at any given instant while actually having those experiences that we effectively have two selves: the experiencing self and the remembering self. To illustrate, he offers the example of a man who complains that a scratch at the end of a disc of his favorite symphony ruined the listening experience for him. “But the experience was not actually ruined, only the memory of it,” Kahneman points out. “The experiencing self had had an experience that was almost entirely good, and the bad end could not undo it, because it had already happened” (381). But the distinction usually only becomes apparent when the two selves disagree—and such disagreements usually require some type of objective recording to discover. Kahneman explains,
Confusing experience with the memory of it is a compelling cognitive illusion—and it is the substitution that makes us believe a past experience can be ruined. The experiencing self does not have a voice. The remembering self is sometimes wrong, but it is the one that keeps score and governs what we learn from living, and it is the one that makes decisions. What we learn from the past is to maximize the qualities of our future memories, not necessarily of our future experiences. This is the tyranny of the remembering self. (381)
Kahneman suggests the priority we can’t help but give to the remembering self explains why tourists spend so much time taking pictures. The real objective of a vacation is not to have a pleasurable or fun experience; it’s to return home with good vacation stories.

            Kahneman reports the results of a landmark study he designed with Don Redelmeier that compared moment-to-moment pain recordings of men undergoing colonoscopies to global pain assessments given by the patients after the procedure. The outcome demonstrated that the remembering self was remarkably unaffected by the duration of the procedure or the total sum of pain experienced, as gauged by adding up the scores given moment-to-moment during the procedure. Men who actually experienced more pain nevertheless rated the procedure as less painful when the discomfort tapered off gradually as opposed to dropping off precipitously after reaching a peak. The remembering self is reliably guilty of what Kahneman calls “duration neglect,” and it assesses experiences based on a “peak-end rule,” whereby the “global retrospective rating” will be “well predicted by the average level of pain reported at the worst moment of the experience and at its end” (380). Duration neglect and the peak-end rule probably account for the greater risk of addiction for users of certain drugs like heroin or crystal meth, which result in rapid, intense highs and precipitous drop-offs, as opposed to drugs like marijuana whose effects are longer-lasting but less intense.
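            To make the peak-end arithmetic concrete, here is a minimal sketch in Python with invented pain ratings (the numbers are mine, not data from the Redelmeier and Kahneman study); it simply compares the experiencing self’s running total with the remembering self’s peak-end average:

```python
# Hypothetical minute-by-minute pain ratings (0 to 10) for two colonoscopy
# patients. The numbers are invented for illustration.

def total_pain(ratings):
    """Sum of moment-to-moment pain: what the experiencing self endures."""
    return sum(ratings)

def peak_end_score(ratings):
    """Average of the worst moment and the final moment: roughly how the
    remembering self rates the procedure under the peak-end rule."""
    return (max(ratings) + ratings[-1]) / 2

patient_a = [2, 4, 7, 8, 8]           # shorter procedure, ends at its peak
patient_b = [2, 4, 7, 8, 8, 5, 3, 1]  # same peak, but the pain tapers off

for name, ratings in (("A", patient_a), ("B", patient_b)):
    print(f"Patient {name}: total pain = {total_pain(ratings)}, "
          f"remembered pain (peak-end) = {peak_end_score(ratings)}")

# Patient B endures more total pain (38 vs. 29) yet, by the peak-end rule,
# remembers the procedure as less painful (4.5 vs. 8.0): duration neglect.
```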

            We’ve already seen that pain in real time can be influenced by how other people are responding to it, and we can probably extrapolate and assume that the principle applies to pleasurable experiences as well. How does the divergence between experience and memory factor into our response to stories as expressed by our decisions about further reading or viewing, or in things like reviews or personal recommendations? For one thing, we can see that most good stories are structured in a way that serves not so much as a Jamesian “direct impression of life,” i.e., as reports from the experiencing self, but much more as the tamed abstractions Stevenson described in his “Humble Remonstrance” to James. As Kahneman explains,
A story is about significant events and memorable moments, not about time passing. Duration neglect is normal in a story, and the ending often defines its character. The same core features appear in the rules of narratives and in the memories of colonoscopies, vacations, and films. This is how the remembering self works: it composes stories and keeps them for future reference. (387)
            Now imagine that you’re watching a movie in a crowded theater. Are you influenced by the responses of your fellow audience members? Are you more likely to laugh if everyone else is laughing, wince if everyone else is wincing, cheer if everyone else is cheering? These are the effects on your experiencing self. What happens, though, in the hours and days and weeks after the movie is over—or after you’re done reading the book? Does your response to the story start to become intertwined with and indistinguishable from the cognitive schema you had in place before ever watching or reading it? Are your impressions influenced by the opinions of critics or friends whose opinions you respect? Do you give a celebrated classic the benefit of the doubt, assuming it has some merit even if you enjoyed it much less than some less celebrated work? Do you read into it virtues whose real source may be external to the story itself? Do you miss virtues that actually are present in less celebrated stories?

             Taken to its extreme, this focus on social proof leads to what’s known as social constructivism. In the realm of stories, this would be the idea that there are no objective criteria at all with which to assess merit; it’s all based on popular opinion or the dictates of authorities. Much of the dissatisfaction with the so-called canon is based on this type of thinking. If we collectively decide some work of literature is really good and worth celebrating, the reasoning goes, then it magically becomes really good and worth celebrating. There’s an undeniable kernel of truth to this—and there’s really no reason to object to the idea that one of the things that makes a work of art attention-worthy is that a lot of people are attending to it. Art serves a social function after all; part of the fun comes from sharing the experience and having conversations about it. But I personally can’t credit the absolutist version of social constructivism. I don’t think you’re anything but a literary tourist until you can make a convincing case for why a few classics don’t deserve the distinction—even though I acknowledge that any such case will probably be based largely on the ideas of other people.

            The research on the experiencing versus the remembering self also suggests a couple criteria we can apply to our assessments of stories so that they’re more meaningful to people who haven’t been initiated into the society and culture of highbrow literature. Too often, the classics are dismissed as works only English majors can appreciate. And too often, they're written in a way that justifies that dismissal. One criterion should be based on how well the book satisfies the experiencing self: I propose that a story should be considered good insofar as it induces a state of absorption. You forget yourself and become completely immersed in the plot. Mihaly Csikszentmihalyi calls this state flow, and has found that the more time people spend in it the happier and more fulfilled they tend to be. But the total time a reader or viewer spends in a state of flow will likely be neglected if the plot never reaches a peak of intensity, or if it ends on a note of tedium. So the second criterion should be how memorable the story is. Assessments based on either of these criteria are of course inevitably vulnerable to social proof and idiosyncratic factors of the individual audience member (whether I find Swann’s Way tedious or absorbing depends on how much sleep and caffeine I’ve had). And yet knowing which effects make for a good aesthetic experience, in real time and in our memories, can help us avoid the trap of merely academic considerations. And knowing that our opinions will always be somewhat contaminated by outside influences shouldn’t keep us from trying to be objective any more than knowing that surgical theaters can never be perfectly sanitized should keep doctors from insisting they be as well scrubbed and scoured as possible.

Monday, June 18, 2012

The Mental Illness Zodiac: Why the DSM 5 Won't Be Anything But More Pseudoscience


            Thinking you can diagnose psychiatric disorders using checklists of symptoms means taking for granted a naïve model of the human mind and human behavior. How discouraging for those in emotional distress, or for those doubting their own sanity, that the guides they turn to for help, and trust to know what’s best for them, embrace this model. The DSM has taken it for granted since its inception, and the latest version, the DSM 5, due out next year, despite all the impediments to practical usage it does away with, despite all the streamlining, and despite all the efforts to adhere to common sense, only perpetuates the mistake. That the diagnostic categories are necessarily ambiguous and can’t be tied to any objective criteria like biological markers has been much discussed, as have the corruptions of the mental health industry, including pharmaceutical companies’ reluctance to publish failed trials for their blockbuster drugs, and clinical researchers who make their livings treating the same disorders they lobby to have included in the list of official diagnoses. Indeed, there’s good evidence that prognoses for mental disorders have actually gotten worse over the past century. What’s not being discussed, however, is the propensity in humans to take on roles, to play parts, even tragic ones, even horrific ones, without being able to recognize they’re doing so.

            In his lighthearted, mildly satirical but severely important book on self-improvement 59 Seconds: Change Your Life in Under a Minute, psychologist Richard Wiseman describes an experiment he conducted for the British TV show The People Watchers. A group of students spending an evening in a bar with their friends was given a series of tests, along with access to an open bar. The tests included memorizing a list of numbers, walking along a line on the floor, and catching a ruler dropped by experimenters as quickly as possible. Memory, balance, and reaction time—all areas in which our performance predictably diminishes as we drink. The outcomes of the tests were well in keeping with expectations as they were repeated over the course of the evening. All the students did progressively worse the more they drank. And the effects of the alcohol were consistent throughout the entire group of students. It turns out, however, that only half of them were drinking alcohol.

At the start of the study, Wiseman had given half the participants a blue badge and the other half a red badge. The bartenders poured regular drinks for everyone with red badges, but for those with blue ones they made drinks which looked, smelled, and tasted like their alcoholic counterparts but were actually non-alcoholic. Now, were the students with the blue badges faking their drunkenness? They may have been hamming it up for the cameras, but that would be true of the ones who were actually drinking too. What they were doing instead was taking on the role—you might even say taking on the symptoms—of being drunk. As Wiseman explains,

Our participants believed that they were drunk, and so they thought and acted in a way that was consistent with their beliefs. Exactly the same type of effect has emerged in medical experiments when people exposed to fake poison ivy developed genuine rashes, those given caffeine-free coffee became more alert, and patients who underwent a fake knee operation reported reduced pain from their “healed” tendons. (204)

After being told they hadn’t actually consumed any alcohol, the students in the blue group “laughed, instantly sobered up, and left the bar in an orderly and amused fashion.” But not all the natural role-playing humans engage in is this innocuous and short-lived.

            In placebo studies like the one Wiseman conducted, participants are deceived. You could argue that actually drinking a convincing replica of alcohol or taking a realistic-looking pill is the important factor behind the effects. People who seek treatment for psychiatric disorders aren’t tricked in this way; so what would cause them to take on the role associated with, say, depression, or bipolar? But plenty of research shows that pills or potions aren’t necessary. We take on different roles in different settings and circumstances all the time. We act much differently at football games and rock concerts than we do at work or school. These shifts are deliberate, though, and we’re aware of them, at least to some degree, when they occur. But many cues are more subtle. It turns out that just being made aware of the symptoms of a disease can make you suspect that you have it. What’s called Medical Student Syndrome afflicts those studying both medical and psychiatric diagnoses. For the most part, you either have a biological disease or you don’t, so the belief that you have one is contingent on the heightened awareness that comes from studying the symptoms. But is there a significant difference between believing you’re depressed and having depression? The answer, according to checklist diagnosis, is no.

            In America, we all know the symptoms of depression because we’re bombarded with commercials, like the one that uses squiggly circle faces to explain that it’s caused by a deficit of the neurotransmitter serotonin—a theory that had already been ruled out by the time that commercial began to air. More insidious though are the portrayals of psychiatric disorders in movies, TV series, or talk shows—more insidious because they embed the role-playing instructions in compelling stories. These shows profess to be trying to raise awareness so more people will get help to end their suffering. They profess to be trying to remove the stigma so people can talk about their problems openly. They profess to be trying to help people cope. But, from a perspective of human behavior that acknowledges the centrality of role-playing to our nature, all these shows are actually doing is shilling for the mental health industry, and they are probably helping to cause much of the suffering they claim to be trying to assuage.

            Multiple Personality Disorder, or Dissociative Identity Disorder as it’s now called, was an exceedingly rare diagnosis until the late 1970s and early 1980s when its incidence spiked drastically. Before the spike, there were only ever around a hundred cases. Between 1985 and 1995, there were around 40,000 new cases. What happened? There was a book and a miniseries called Sybil starring Sally Field that aired in 1976. Much of the real-life story on which Sybil was based has been cast into doubt through further investigation (or has been shown to be completely fabricated). But if you’re one to give credence to the validity of the DID diagnosis (and you shouldn’t), then we can look at another strange behavioral phenomenon whose incidence spiked after a certain movie hit the box office in the 1970s. Prior to the release of The Exorcist, the Catholic church had pretty much consigned the eponymous ritual to the dustbin of history. Lately, though, they’ve had to dust it off. The Skeptic’s Dictionary says of a TV series on the Sci-Fi channel devoted to the exorcism ritual, or rather the performance of it,

The exorcists' only prop is a Bible, which is held in one hand while they talk down the devil in very dramatic episodes worthy of Jerry Springer or Jenny Jones. The “possessed” could have been mentally ill, actors, mentally ill actors, drug addicts, mentally ill drug addicts, or they may have been possessed, as the exorcists claimed. All the participants shown being exorcized seem to have seen the movie “The Exorcist” or one of the sequels. They all fell into the role of husky-voiced Satan speaking from the depths, who was featured in the film. The similarities in speech and behavior among the “possessed” has led some psychologists such as Nicholas Spanos to conclude that both “exorcist” and “possessed” are engaged in learned role-playing.

If people can somehow inadvertently fall into the role of having multiple personalities or being possessed by demons, it’s not hard to imagine them hearing about, say, bipolar, briefly worrying that they may have some of the symptoms, and then subsequently taking on the role, even the identity of someone battling bipolar disorder.

            Psychologist Dan McAdams theorizes that everyone creates his or her own “personal myth,” which serves to give life meaning and trajectory. The character we play in our own myth is what we recognize as our identity, what we think of when we try to answer the question “Who am I?” in all its profundity. But, as McAdams explains in The Stories We Live By: Personal Myths and the Making of the Self,

Stories are less about facts and more about meanings. In the subjective and embellished telling of the past, the past is constructed—history is made. History is judged to be true or false not solely with respect to its adherence to empirical fact. Rather, it is judged with respect to such narrative criteria as “believability” and “coherence.” There is a narrative truth in life that seems quite removed from logic, science, and empirical demonstration. It is the truth of a “good story.” (28-9)

The problem when it comes to diagnosing psychiatric disorders is that the checklist approach tries to use objective, scientific criteria, when the only answers it will ever get will be in terms of narrative criteria. But why, if people are prone to taking on roles, wouldn’t they take on pleasant ones, like those of kings or princesses?

            Since our identities are made up of the stories we tell about ourselves—even to ourselves—it’s important that those stories be compelling. And if nothing ever goes wrong in the stories we tell, well, they’d be pretty boring. As Jonathan Gottschall writes in The Storytelling Animal: How Stories Make Us Human,

This need to see ourselves as the striving heroes of our own epics warps our sense of self. After all, it’s not easy to be a plausible protagonist. Fiction protagonists tend to be young, attractive, smart, and brave—all the things that most of us aren’t. Fiction protagonists usually live interesting lives that are marked by intense conflict and drama. We don’t. Average Americans work retail or cubicle jobs and spend their nights watching protagonists do interesting things on television. (171)

Listen to the ways talk show hosts like Oprah talk about mental disorders, and count how many times in an episode she congratulates the afflicted guests for their bravery in keeping up the struggle. Sometimes, the word hero is even bandied about. Troublingly, the people who cast themselves as heroes spreading awareness, countering stigmas, and helping people cope even like to do really counterproductive things like publishing lists of celebrities who supposedly suffer from the disorder in question. Think you might have bipolar? Kay Redfield Jamison thinks you’re in good company. In her book Touched with Fire, she suggests everyone from rocker Kurt Cobain to fascist Mel Gibson is in that same boatful of heroes.

            The reason medical researchers insist a drug must not only be shown to make people feel better but must also be shown to work better than a placebo is that even a sham treatment will make people report feeling better between 60 and 90% of the time, depending on several well-documented factors. What psychiatrists fail to acknowledge is that the placebo dynamic can be turned on its head—you can give people illnesses, especially mental illnesses, merely by suggesting they have the symptoms—or even by increasing their awareness of and attention to those symptoms past a certain threshold. If you tell someone a fact about themselves, they’ll usually believe it, especially if you claim a test or an official diagnostic manual allowed you to determine the fact. This is how frauds convince people they’re psychics. An experiment you can do yourself involves giving horoscopes to a group of people and asking how true they ring. After most of them endorse their reading, reveal that you changed the labels and they all in fact read the wrong sign’s description.

            Psychiatric diagnoses, to be considered at all valid, would need to be double-blind, just like drug trials: the patient shouldn’t know the diagnosis being considered; the rater shouldn’t know the diagnosis being considered; only a final scorer, who has no contact with the patient, should determine the diagnosis. The categories themselves are, however, equally problematic. In order to be properly established as valid, they need to have predictive power. Trials would have to be conducted in which subjects assigned to the prospective categories using double-blind protocols were monitored for long periods of time to see if their behavior adheres to what’s expected of the disorder. For instance, bipolar is supposedly marked by cyclical mood swings. Where are the mood diary studies? (The last time I looked for them was six months ago, so if you know of any, please send a link.) Smart phones offer all kinds of possibilities for monitoring and recording behaviors. Why aren’t they being used to do actual science on mental disorders?
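            For what it’s worth, the analysis side of such a study would be simple. Here is a minimal sketch, assuming a smartphone diary app has logged a year of daily self-rated mood scores; the data below are synthesized with a built-in 60-day cycle purely for illustration, and a plain autocorrelation checks whether any cyclical pattern actually shows up:

```python
import numpy as np

# A hypothetical year of daily self-rated mood scores (1 to 10), as a
# smartphone diary app might collect. We synthesize a noisy ~60-day cycle
# just to have something to analyze; a real study would use patients' logs.
rng = np.random.default_rng(0)
days = np.arange(365)
mood = 5 + 2.5 * np.sin(2 * np.pi * days / 60) + rng.normal(0, 1, 365)

# Autocorrelation of the mood series: if moods truly cycle, the series
# should correlate with itself at a lag near the cycle length.
centered = mood - mood.mean()
acf = np.correlate(centered, centered, mode="full")[len(mood) - 1:]
acf /= acf[0]  # normalize so lag 0 equals 1

# Look for the strongest peak at lags from two weeks to four months.
lags = np.arange(14, 120)
best = lags[np.argmax(acf[lags])]
print(f"Strongest periodicity near lag {best} days "
      f"(autocorrelation {acf[best]:.2f})")
```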

            To research the role-playing dimension of mental illness, one (completely unethical) approach would be to design from scratch a really bizarre disorder, publicize its symptoms, maybe make a movie starring Mel Gibson, and monitor incidence rates. Let’s call it Puppy Pregnancy Disorder. We all know dog saliva is chock-full of gametes, right? So, let’s say the disorder is caused when a canine, in a state of sexual arousal of course, bites the victim, thus impregnating her—or even him. Let’s say it affects men too. Wouldn’t that be funny? The symptoms would be abdominal pain, and something just totally out there, like, say, small pieces of puppy feces showing up in your urine. Now, this might be too outlandish, don’t you think? There’s no way we could get anyone to believe this. Unfortunately, I didn’t really make this up. And there are real people in India who believe they have Puppy Pregnancy Disorder.

Tuesday, June 12, 2012

Why Shakespeare Nauseated Darwin: A Review of Keith Oatley's "Such Stuff as Dreams"

Review of Such Stuff as Dreams: The Psychology of Fiction by Keith Oatley
            Late in his life, Charles Darwin lost his taste for music and poetry. “My mind seems to have become a kind of machine for grinding general laws out of large collections of facts,” he laments in his autobiography, and for many of us the temptation to place all men and women of science into a category of individuals whose minds resemble machines more than living and emotionally attuned organs of feeling and perceiving is overwhelming. In the 21st century, we even have a convenient psychiatric diagnosis for people of this sort. Don’t we just assume Sheldon in The Big Bang Theory has autism, or at least the milder version of it known as Asperger’s? It’s probably even safe to assume the show’s writers had the diagnostic criteria for the disorder in mind when they first developed his character. Likewise, Dr. Watson in the BBC’s new and obscenely entertaining Sherlock series can’t resist a reference to the quintessential evidence-crunching genius’s own supposed Asperger’s. In Darwin’s case, however, the move away from the arts couldn’t have been due to any congenital deficiency in his finer human sentiments because it occurred only in adulthood. He writes,

I have said that in one respect my mind has changed during the last twenty or thirty years. Up to the age of thirty, or beyond it, poetry of many kinds, such as the works of Milton, Gray, Byron, Wordsworth, Coleridge, and Shelley, gave me great pleasure, and even as a schoolboy I took intense delight in Shakespeare, especially in the historical plays. I have also said that formerly pictures gave me considerable, and music very great delight. But now for many years I cannot endure to read a line of poetry: I have tried lately to read Shakespeare, and found it so intolerably dull that it nauseated me. I have also almost lost my taste for pictures or music. Music generally sets me thinking too energetically on what I have been at work on, instead of giving me pleasure.

We could interpret Darwin here as suggesting that casting his mind too doggedly into his scientific work somehow ruined his capacity to appreciate Shakespeare. But, like all thinkers and writers of great nuance and sophistication, his ideas are easy to mischaracterize through selective quotation (or, if you’re Ben Stein or any of the other unscrupulous writers behind creationist propaganda like the pseudo-documentary Expelled, you can just lie about what he actually wrote). One of the most charming things about Darwin is that his writing is often more exploratory than merely informative. He writes in search of answers he has yet to discover. In a wider context, the quote about his mind becoming a machine, for instance, reads,

This curious and lamentable loss of the higher aesthetic tastes is all the odder, as books on history, biographies, and travels (independently of any scientific facts which they may contain), and essays on all sorts of subjects interest me as much as ever they did. My mind seems to have become a kind of machine for grinding general laws out of large collections of facts, but why this should have caused the atrophy of that part of the brain alone, on which the higher tastes depend, I cannot conceive. A man with a mind more highly organised or better constituted than mine, would not, I suppose, have thus suffered; and if I had to live my life again, I would have made a rule to read some poetry and listen to some music at least once every week; for perhaps the parts of my brain now atrophied would thus have been kept active through use. The loss of these tastes is a loss of happiness, and may possibly be injurious to the intellect, and more probably to the moral character, by enfeebling the emotional part of our nature.

His concern for his lost aestheticism notwithstanding, Darwin’s humanism, his humanity, radiates in his writing with a warmth that belies any claim about thinking like a machine, just as the intelligence that shows through it gainsays his humble deprecations about the organization of his mind.

           In this excerpt, Darwin, perhaps inadvertently, even manages to put forth a theory of the function of art. Somehow, poetry and music not only give us pleasure and make us happy—enjoying them actually constitutes a type of mental exercise that strengthens our intellect, our emotional awareness, and even our moral character. Novelist and cognitive psychologist Keith Oatley explores this idea of human betterment through aesthetic experience in his book Such Stuff as Dreams: The Psychology of Fiction. This subtitle is notably underwhelming given the long history of psychoanalytic theorizing about the meaning and role of literature. However, whereas psychoanalysis has fallen into disrepute among scientists because of its multiple empirical failures and a general methodological hubris common among its practitioners, the work of Oatley and his team at the University of Toronto relies on much more modest, and at the same time much more sophisticated, scientific protocols. One of the tools these researchers use, The Reading the Mind in the Eyes Test, was in fact first developed to research our new category of people with machine-like minds. What the researchers find bolsters Darwin’s impression that art, at least literary art, functions as a kind of exercise for our faculty of understanding and relating to others.

           Reasoning that “fiction is a kind of simulation of selves and their vicissitudes in the social world” (159), Oatley and his colleague Raymond Mar hypothesized that people who spent more time trying to understand fictional characters would be better at recognizing and reasoning about other, real-world people’s states of mind. So they devised a test to assess how much fiction participants in their study read based on how well they could categorize a long list of names according to which ones belonged to authors of fiction, which to authors of nonfiction, and which to non-authors. They then had participants take the Mind-in-the-Eyes Test, which consists of matching close-up pictures of people’s eyes with terms describing their emotional state at the time they were taken. The researchers also had participants take the Interpersonal Perception Test, which has them answer questions about the relationships of people in short video clips featuring social interactions. An example question might be “Which of the two children, or both, or neither, are offspring of the two adults in the clip?” (Imagine Sherlock Holmes taking this test.) As hypothesized, Oatley writes, “We found that the more fiction people read, the better they were at the Mind-in-the-Eyes Test. A similar relationship held, though less strongly, for reading fiction and the Interpersonal Perception Test” (159).
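           For readers unfamiliar with this kind of research, the statistic behind a finding like that is simple. Here is a minimal sketch with invented scores for eight hypothetical participants; Pearson’s r just measures how reliably higher fiction exposure goes with higher eyes-test accuracy:

```python
import numpy as np

# Invented scores for eight hypothetical participants: fiction exposure
# (fiction authors correctly identified on the name checklist) and
# Mind-in-the-Eyes accuracy (items correct out of 36). A correlational
# design like this one reduces to a statistic such as Pearson's r.
fiction_exposure = np.array([3, 5, 8, 12, 15, 18, 22, 25])
eyes_test_score = np.array([20, 22, 21, 25, 27, 26, 30, 29])

r = np.corrcoef(fiction_exposure, eyes_test_score)[0, 1]
print(f"Pearson r = {r:.2f}")  # positive: more fiction, better mindreading

# A positive r alone can't say which way causation runs, which is the
# shortcoming the follow-up experiment described below was designed to fix.
```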

            One major shortcoming of this study is that it fails to establish causality; people who are naturally better at reading emotions and making sound inferences about social interactions may gravitate to fiction for some reason. So Mar set up an experiment in which he had participants read either a nonfiction article from an issue of the New Yorker or a work of short fiction chosen to be the same length and require the same level of reading skills. When the two groups then took a test of social reasoning, the ones who had read the short story outperformed the control group. Both groups also took a test of analytic reasoning as a further control; on this variable there was no difference in performance between the groups. The outcome of this experiment, Oatley stresses, shouldn’t be interpreted as evidence that reading one story will increase your social skills in any meaningful and lasting way. But reading habits established over long periods likely explain the more significant differences between individuals found in the earlier study. As Oatley explains,

Readers of fiction tend to become more expert at making models of others and themselves, and at navigating the social world, and readers of non-fiction are likely to become more expert at genetics, or cookery, or environmental studies, or whatever they spend their time reading. Raymond Mar’s experimental study on reading pieces from the New Yorker is probably best explained by priming. Reading a fictional piece puts people into a frame of mind of thinking about the social world, and this is probably why they did better at the test of social reasoning. (160)

Connecting these findings to real-world outcomes, Oatley and his team also found that “reading fiction was not associated with loneliness,” as the stereotype suggests, “but was associated with what psychologists call high social support, being in a circle of people whom participants saw a lot, and who were available to them practically and emotionally” (160).

            These studies by the University of Toronto team have received wide publicity, but the people who should be the most interested in them have little or no idea how to go about making sense of them. Most people simply either read fiction or they don’t. If you happen to be of the tribe who studies fiction, then you were probably educated in a way that engendered mixed feelings—profound confusion really—about science and how it works. In his review of The Storytelling Animal, a book in which Jonathan Gottschall incorporates the Toronto team’s findings into the theory that narrative serves the adaptive function of making human social groups more cooperative and cohesive, Adam Gopnik sneers,

Surely if there were any truth in the notion that reading fiction greatly increased our capacity for empathy then college English departments, which have by far the densest concentration of fiction readers in human history, would be legendary for their absence of back-stabbing, competitive ill-will, factional rage, and egocentric self-promoters; they’d be the one place where disputes are most often quickly and amiably resolved by mutual empathetic engagement. It is rare to see a thesis actually falsified as it is being articulated.

Oatley himself is well aware of the strange case of university English departments. He cites a report by Willie van Peer on a small study he did comparing students in the natural sciences to students in the humanities. Oatley explains,

There was considerable scatter, but on average the science students had higher emotional intelligence than the humanities students, the opposite of what was expected; van Peer indicts teaching in the humanities for often turning people away from human understanding towards technical analyses of details. (160)

Oatley suggests in a footnote that an earlier study corroborates van Peer’s indictment. It found that high school students who showed more emotional involvement with short stories—the type of connection that would engender greater empathy—did proportionally worse on standard academic assessments of English proficiency. The clear implication of these findings is that the way literature is taught in universities and high schools is long overdue for an in-depth critical analysis.

            The idea that literature has the power to make us better people is not new; indeed, it was the very idea on which the humanities were originally founded. We have to wonder what people like Gopnik believe the point of celebrating literature is if not to foster greater understanding and empathy. If you either enjoy it or you don’t, and it has no beneficial effects on individuals or on society in general, why bother encouraging anyone to read? Why bother writing essays about it in the New Yorker? Tellingly, many scholars in the humanities began doubting the power of art to inspire greater humanity around the same time they began questioning the value and promise of scientific progress. Oatley writes,

Part of the devastation of World War II was the failure of German citizens, one of the world’s most highly educated populations, to prevent their nation’s slide into Nazism. George Steiner has famously asserted: “We know that a man can read Goethe or Rilke in the evening, that he can play Bach and Schubert, and go to his day’s work at Auschwitz in the morning.” (164)

Postwar literary theory and criticism have, perversely, tended toward the view that literature and language in general serve as a vessel for passing on all the evils inherent in our western, patriarchal, racist, imperialist culture. The purpose of literary analysis then becomes to sift out these elements and resist them. Unfortunately, such accusatory theories leave unanswered the question of why, if literature inculcates oppressive ideologies, we should bother reading it at all. As van Peer muses in the report Oatley cites, “The Inhumanity of the Humanities,”

Consider the ills flowing from postmodern approaches, the “posthuman”: this usually involves the hegemony of “race/class/gender” in which literary texts are treated with suspicion. Here is a major source of that loss of emotional connection between student and literature. How can one expect a certain humanity to grow in students if they are continuously instructed to distrust authors and texts? (8)

           Oatley and van Peer point out, moreover, that the evidence for concentration camp workers having any degree of literary or aesthetic sophistication is nonexistent. According to the best available evidence, most of the greatest atrocities were committed by soldiers who never graduated high school. The suggestion that some type of cozy relationship existed between Nazism and an enthusiasm for Goethe runs afoul of recorded history. As Oatley points out,

Apart from propensity to violence, nationalism, and anti-Semitism, Nazism was marked by hostility to humanitarian values in education. From 1933 onwards, the Nazis replaced the idea of self-betterment through education and reading by practices designed to induce as many as possible into willing conformity, and to coerce the unwilling remainder by justified fear. (165)

Oatley also cites the work of historian Lynn Hunt, whose book Inventing Human Rights traces the original social movement for the recognition of universal human rights to the mid-1700s, when what we recognize today as novels were first being written. Other scholars like Steven Pinker have pointed out too that, while it’s hard not to dwell on tragedies like the Holocaust, even atrocities of that magnitude are resoundingly overmatched by the much larger post-Enlightenment trend toward peace, freedom, and the wider recognition of human rights. It’s sad that one of the lasting legacies of all the great catastrophes of the 20th Century is a tradition in humanities scholarship that has the people who are supposed to be the custodians of our literary heritage hell-bent on teaching us all the ways that literature makes us evil.

            Because Oatley is a central figure in what we can only hope is a movement to end the current reign of self-righteous insanity in literary studies, it pains me not to be able to recommend Such Stuff as Dreams to anyone but dedicated specialists. Oatley writes in the preface that he has “imagined the book as having some of the qualities of fiction. That is to say I have designed it to have a narrative flow” (x), and it may simply be that this suggestion set my expectations too high. But the book is poorly edited, the prose is bland and often rolls over itself into graceless tangles, and a couple of the chapters seem like little more than haphazardly collated reports of studies and theories, none exactly off-topic, none completely without interest, but all lacking any central progression or theme. The book often reads more like an annotated bibliography than a story. Oatley’s scholarly range is impressive, however, bearing not just on cognitive science and literature through the centuries but extending as well to the work of important literary theorists. The book is never unreadable, never opaque, but it’s not exactly a work of art in its own right.

            Insofar as Such Stuff as Dreams is organized around a central idea, it is that fiction ought to be thought of not as “a direct impression of life,” as Henry James suggests in his famous essay “The Art of Fiction,” and as many contemporary critics—notably James Wood—seem to think of it. Rather, Oatley agrees with Robert Louis Stevenson’s response to James’s essay, “A Humble Remonstrance,” in which he writes that

Life is monstrous, infinite, illogical, abrupt and poignant; a work of art in comparison is neat, finite, self-contained, rational, flowing, and emasculate. Life imposes by brute energy, like inarticulate thunder; art catches the ear, among the far louder noises of experience, like an air artificially made by a discreet musician. (qtd on pg 8)

Oatley theorizes that stories are simulations, much like dreams, that go beyond mere reflections of life to highlight through defamiliarization particular aspects of life, to cast them in a new light so as to deepen our understanding and experience of them. He writes,

Every true artistic expression, I think, is not just about the surface of things. It always has some aspect of the abstract. The issue is whether, by a change of perspective or by making the familiar strange, by means of an artistically depicted world, we can see our everyday world in a deeper way. (15)

Critics of high-brow literature like Wood appreciate defamiliarization at the level of description; Oatley is suggesting here though that the story as a whole functions as a “metaphor-in-the-large” (17), a way of not just making us experience as strange some object or isolated feeling, but of reconceptualizing entire relationships, careers, encounters, biographies—what we recognize in fiction as plots. This is an important insight, and it topples verisimilitude from its ascendant position atop the hierarchy of literary values while rendering complaints about clichéd plots potentially moot. Didn’t Shakespeare recycle plots after all?

            The theory of fiction as a type of simulation to improve social skills and possibly to facilitate group cooperation is emerging as the frontrunner in attempts to explain narrative interest in the context of human evolution. It is to date, however, impossible to rule out the possibility that our interest in stories is not directly adaptive but instead emerges as a byproduct of other traits that confer more immediate biological advantages. The finding that readers track actions in stories with the same brain regions that activate when they witness similar actions in reality, or when they engage in them themselves, is important support for the simulation theory. But the function of mirror neurons isn’t well enough understood yet for us to determine from this study how much engagement with fictional stories depends on the reader's identifying with the protagonist. Oatley’s theory is more consonant with direct and straightforward identification. He writes,

A very basic emotional process engages the reader with plans and fortunes of a protagonist. This is what often drives the plot and, perhaps, keeps us turning the pages, or keeps us in our seat at the movies or at the theater. It can be enjoyable. In art we experience the emotion, but with it the possibility of something else, too. The way we see the world can change, and we ourselves can change. Art is not simply taking a ride on preoccupations and prejudices, using a schema that runs as usual. Art enables us to experience some emotions in contexts that we would not ordinarily encounter, and to think of ourselves in ways that usually we do not. (118)

Much of this change, Oatley suggests, comes from realizing that we too are capable of behaving in ways that we might not like. “I am capable of this too: selfishness, lack of sympathy” (193), is what he believes we think in response to witnessing good characters behave badly.

            Oatley’s theory has a lot to recommend it, but William Flesch’s theory of narrative interest, which suggests we don’t identify with fictional characters directly but rather track them and anxiously hope for them to get whatever we feel they deserve, seems much more plausible in the context of our response to protagonists behaving in surprisingly selfish or antisocial ways. When I see Ed Norton as Tyler Durden beating Angel Face half to death in Fight Club, for instance, I don’t think, hey, that’s me smashing that poor guy’s face with my fists. Instead, I think, what the hell are you doing? I had you pegged as a good guy. I know you’re trying not to be as much of a pushover as you used to be but this is getting scary. I’m anxious that Angel Face doesn’t get too damaged—partly because I imagine that would be devastating to Tyler. And I’m anxious lest this incident be a harbinger of worse behavior to come.

            The issue of identification is just one of several interesting questions that lend themselves to further research. Oatley and Mar’s studies are not enormous in terms of sample size, and their subjects were mostly young college students. What types of fiction work best to foster empathy? What types of reading strategies might we encourage students to apply to reading literature—apart from trying to remove obstacles to emotional connections with characters? But, aside from the Big-Bad-Western Empire myth that currently has humanities scholars grooming successive generations of deluded ideologues to be little more than culture vultures presiding over the creation and celebration of Loser Lit, the other main challenge to transporting literary theory onto firmer empirical grounds is the assumption that the arts in general and literature in particular demand a wholly different type of thinking to create and appreciate than the type that goes into the intricate mechanics and intensely disciplined practices of science.

As Oatley and the Toronto team have shown, people who enjoy fiction tend to have the opposite of autism. And people who do science are, well, Sheldon. Interestingly, though, the writers of The Big Bang Theory, for whatever reason, included some contraindications for a diagnosis of autism or Asperger’s in Sheldon’s character. Like the other scientists in the show, he’s obsessed with comic books, which require at least some understanding of facial expression and body language to follow. As Simon Baron-Cohen, the autism researcher who designed the Mind-in-the-Eyes test, explains, “Autism is an empathy disorder: those with autism have major difficulties in 'mindreading' or putting themselves into someone else’s shoes, imagining the world through someone else’s feelings” (137). Baron-Cohen has coined the term “mindblindness” to describe the central feature of the disorder, and many have posited that the underlying cause is abnormal development of the brain regions devoted to perspective taking and understanding others, what cognitive psychologists refer to as our Theory of Mind.

            To follow comic book plotlines, Sheldon would have to make ample use of his own Theory of Mind. He’s also given to absorption in various science fiction shows on TV. If he were only interested in futuristic gadgets, as an autistic would be, he could just as easily get more scientifically plausible versions of them in any number of nonfiction venues. By Baron-Cohen’s definition, Sherlock Holmes can’t possibly have Asperger’s either because his ability to get into other people’s heads is vastly superior to pretty much everyone else’s. As he explains in “The Musgrave Ritual,” “You know my methods in such cases, Watson: I put myself in the man’s place, and having first gauged his intelligence, I try to imagine how I should myself have proceeded under the same circumstances.”

            What about Darwin, though, that demigod of science who openly professed to being nauseated by Shakespeare? Isn’t he a prime candidate for entry into the surprisingly unpopulated ranks of heartless, data-crunching scientists whose thinking lends itself so conveniently to cooptation by oppressors and committers of wartime atrocities? It turns out that though Darwin held many of the same racist views as nearly all educated men of his time, his ability to empathize across racial and class divides was extraordinary. Darwin was not himself a Social Darwinist; that theory was devised by Herbert Spencer to justify inequality (and it has currency still today among political conservatives). And Darwin was also a passionate abolitionist, as is clear in the following excerpts from The Voyage of the Beagle:

On the 19th of August we finally left the shores of Brazil. I thank God, I shall never again visit a slave-country. To this day, if I hear a distant scream, it recalls with painful vividness my feelings, when passing a house near Pernambuco, I heard the most pitiable moans, and could not but suspect that some poor slave was being tortured, yet knew that I was as powerless as a child even to remonstrate.

Darwin is responding to cruelty in a way no one around him at the time would have. And note how deeply it pains him, how profound and keenly felt his sympathy is.

I was present when a kind-hearted man was on the point of separating forever the men, women, and little children of a large number of families who had long lived together. I will not even allude to the many heart-sickening atrocities which I authentically heard of;—nor would I have mentioned the above revolting details, had I not met with several people, so blinded by the constitutional gaiety of the negro as to speak of slavery as a tolerable evil.

            The question arises, not whether Darwin had sacrificed his humanity to science, but why he had so much more humanity than many other intellectuals of his day.

It is often attempted to palliate slavery by comparing the state of slaves with our poorer countrymen: if the misery of our poor be caused not by the laws of nature, but by our institutions, great is our sin; but how this bears on slavery, I cannot see; as well might the use of the thumb-screw be defended in one land, by showing that men in another land suffered from some dreadful disease.

And finally we come to the matter of Darwin’s Theory of Mind, which was quite clearly in no way deficient.

Those who look tenderly at the slave owner, and with a cold heart at the slave, never seem to put themselves into the position of the latter;—what a cheerless prospect, with not even a hope of change! picture to yourself the chance, ever hanging over you, of your wife and your little children—those objects which nature urges even the slave to call his own—being torn from you and sold like beasts to the first bidder! And these deeds are done and palliated by men who profess to love their neighbours as themselves, who believe in God, and pray that His Will be done on earth! It makes one's blood boil, yet heart tremble, to think that we Englishmen and our American descendants, with their boastful cry of liberty, have been and are so guilty; but it is a consolation to reflect, that we at least have made a greater sacrifice than ever made by any nation, to expiate our sin. (530-31)

            I suspect that Darwin’s distaste for Shakespeare was born of oversensitivity. He doesn't say music failed to move him; he didn’t like it because it made him think “too energetically.” And as aesthetically pleasing as Shakespeare is, existentially speaking, his plays tend to be pretty harsh, even the comedies. When Prospero says, "We are such stuff / as dreams are made on" in Act 4 of The Tempest, he's actually talking not about characters in stories, but about how ephemeral and insignificant real human lives are. But why, beyond some likely nudge from his inherited temperament, was Darwin so sensitive? Why was he so empathetic even to those so vastly different from him? After admitting he’d lost his taste for Shakespeare, paintings, and music, he goes on to say,

On the other hand, novels which are works of the imagination, though not of a very high order, have been for years a wonderful relief and pleasure to me, and I often bless all novelists. A surprising number have been read aloud to me, and I like all if moderately good, and if they do not end unhappily—against which a law ought to be passed. A novel, according to my taste, does not come into the first class unless it contains some person whom one can thoroughly love, and if a pretty woman all the better.
[Check out the Toronto group's blog at onfiction.ca]


Sunday, June 3, 2012

Bedtime Ghost Story for Adults

            I had just moved into the place on Berry Street with my girlfriend and her two cats. A very old lady lived in the apartment behind us. She came out to the dumpster while I was breaking down boxes and throwing them in. “Can you take those out?” she asked in her creaky voice. I explained I had nowhere else to put them if I did. “But it gets filled up and I can’t get anything in there,” she complained. I said she could come knock on our door if she ever had to throw something in the dumpster and it was too full. I’d help her.

            A couple nights later, just as we were about to go to bed, my girlfriend asked me to tell her a story. When we first started dating, I would improvise elaborate stories at her request—to impress her and because it was fun. I hadn’t done it in a while.

******

            “There was a couple who just moved into a new apartment,” I began as we climbed into bed.

            “Uh-huh,” she said, already amused.

            “And this apartment was at the front part of a really old house, and there was a really old lady who lived in the apartment behind theirs. Well, they got all their stuff moved in and they thought their place was really awesome and everything was going great. And the old lady liked the couple a lot… She liked them because she liked their cat.”

            “Oh, they have a cat, huh? You didn’t say anything about a cat.”

            “I just did.”

            “What color is this cat?”

            “Orange.”

            “Oh, okay.”

            “What happened was that one day the cat went missing and it turned out the cat had wandered to the old lady’s porch and she let it in her apartment. And she really liked it. But the girl was like, ‘Where’s my cat?’ and she went looking for it and got all worried. Finally, she knocked on the old lady’s door and asked if she’d seen it.

            “The old lady invited the girl in to give her her cat back and while they were talking the old lady was thinking, wow, I really like this girl and she has a really nice cat and I liked having the cat over here. And the old lady had grown up in New Orleans, so she and her sisters were all into voodoo and hoodoo and spells and stuff. They were witches.”

            “Oh man.”

            “Yeah, so the old lady was a witch. And since she liked the young girl so much she decided to do something for her, so while she was talking to her she had something in her hand. And she held up her hand and blew it in the girl’s face. It was like water and ashes or something. The girl had no idea what it was and she was really weirded out and like, ‘What the hell did she do that for?’ But she figured it was no big deal. The lady was really old and probably a little dotty, she figured. But she still kind of hurried up and got her cat and went home.

            “Well, everything was normal until the boyfriend came home, and then the girl was all crazy and had to have sex with him immediately. They ended up having sex all night. And from then on it was like whenever they saw each other they couldn’t help themselves and they were just having sex all the time.”

            “Oh boy.”

            “Eventually, it was getting out of hand because they were both exhausted all day and they never talked to their friends and they started missing work and stuff. But they were really happy. It was great. So the girl started wondering if maybe the old lady had done something to her when she blew that stuff in her face. And then she thought maybe she should go and ask her, the old lady, if that’s what had happened. And if it was she thought, you know, she should thank her. She thought about all this for a long time, but then she would see the boyfriend and of course after that she would forget everything and eventually she just stopped thinking about it.

            “Then one day their cat went missing, their other cat.”

            “What color is this one?”

            “Black. And, since she found the other cat at the old lady’s before, the girl thought maybe she should go and ask the old lady again. So one day when she was getting home from work she saw the old lady sitting on her porch and she goes up to talk to her. And she’s trying to make small talk and tell the old lady about the cat and ask her if she’s seen it when the old lady turns around and, like, squints and wrinkles her nose and kind of goes like this—looking back—and says, ‘You didn’t even thank me!’ before walking away and going in her door.”

            “Ahh.”

            “Yeah, and the girl’s all freaked out by it too.”

            “Oh!—I’m gonna have to roll over and make sure she’s not out there.”

            “Okay… So the girl’s all freaked out, but she’s still like, ‘Where’s my cat?’ So one time after they just had sex for like the umpteenth time she tells her boyfriend we gotta find the cat. And the boyfriend is like, ‘All right, I’m gonna go talk to this old lady and find out what the hell happened to our cat.’”

            “Oh! What did you do to Mikey?”

            “I didn’t do anything. Just listen… Anyway, he’s determined to find out if the cat’s in this old lady’s apartment. So he goes and knocks on her door and is all polite and everything. But the old lady just says, ‘You didn’t even thank me!’ and slams the door on him. He doesn’t know what else to do at this point so he calls the police, and he tells them that their cat’s missing and the last time, when the other cat was missing, it turned up at the old lady’s house. And he told them the old lady was acting all weird and stuff too.

            “But of course the police can’t really do anything because there’s no way anyone knows the cat’s in the old lady’s house and they tell him to just wait and see if maybe the cat ran away or whatever. And the girl’s all upset and the guy’s getting all pissed off and trying to come up with some kind of scheme to get into the old lady’s house.

“–But they never actually get around to doing anything because they’re having so much sex and, even though they still miss the cat and everything, a lot of the time they almost forget about it. And it just goes on like this for a long time with the couple suspicious of the old lady and wondering where their cat is but not being able to do anything.

“And this goes on until one day—when the old lady just mysteriously dies. When the police get to her apartment, sure enough there’s the couple’s black cat.”

“Ooh, Mikey.”

“So the police come and tell the guy, you know, hey, we found your cat, just like you said. And the guy goes and gets the cat and brings it home. But while he’s in the old lady’s apartment he’s wondering the whole time about the spell she put on him and his girlfriend, and he’s a little worried that maybe since she died the spell might be broken. But he gets the cat and takes it home. And when his girlfriend comes home it’s like she gets all excited to see it, but only for like a minute, and then it’s like before and they can’t help themselves. They have to have sex.

“Well, this goes on and on and things get more and more out of hand until both of them lose their jobs, their friends just drift away because they never talk to them, and eventually they can’t pay their rent so they lose their apartment. So they get their cats and as much of their stuff as they can and they go to this spot they know by the river where some of their hippie friends used to camp. And they just live there like before, with their cats, just having sex all the time.

“One night after they just had sex again, they’re sitting by the campfire and the guy says, ‘You know, we lost our jobs and our friends and our apartment, and we’re living in the woods here by the river, and you’d think we’d be pretty miserable. But I think I have everything I need right here.’ He’s thinking about having sex again even as he’s saying this. And he’s like, ‘Really, I’m happy as hell. I don’t remember ever being this happy.’

“And the girl is like, ‘Yeah, me too. I actually kind of like living out here with you.’

“So they’re about to start having sex again when the black cat turns and looks at them and says, ‘And you didn’t even thank me!’”