
Obedience

Ian Parker

Herbert Winer, who has not tortured anyone for nearly forty years, lives in New Haven, Connecticut, as he did in the early Sixties. He is a likeable, deadpan, Jack Lemmony sort of man of seventy-eight. When we met a little while ago, we took a walk on the Yale campus in New Haven and he led me downstairs into the basement of a neo-Gothic building, Linsly-Chittenden Hall. ‘It was cobwebby and dusty then,’ he said, ‘and it was a real mess, with temporary lights strung up and totally unfinished walls…’ We reached the bottom of the stairs and found ourselves in a disappointingly clean, neat, renovated corridor. There was nothing much to see, except a student pinboard carrying notices about eating disorders, God, and rooms to rent. One sheet on the board had a little fringe of tear-off email addresses: psychology experiment, it read; 50 minutes only. $8. Mr Winer let out a cry. ‘Oh!’ he said. ‘I don’t think that’s kept up with inflation. I should call them up and say, “I’ve been here before—cut it out!”’

Herbert Winer was last in this basement in the summer of 1961, when an observer on the Yale campus might have noticed unusual traffic in and out of Linsly-Chittenden Hall: a stream of New Haven residents, arriving for appointments an hour apart, and leaving, reddened and distraught, their composure lost. The same observer might have heard screams. Here, events were under way that would, in time, acquire a kind of mythical sheen. These screams and troubled looks would claim a place in Holocaust studies, in law and economics journals, in newspaper reports from Rwanda, in late-night debates in pubs and on Internet newsgroups. They would cross into fiction, into pop music and television drama. They would feature on The Simpsons. They would get under our skin. Here, a young Yale professor named Stanley Milgram was conducting what would become the most cited, celebrated—and reviled—experiment in the history of social psychology. Here, under the guise of a study into the influence of pain on learning, Professor Milgram was urging volunteers, Herbert Winer included, to give powerful electric shocks to a total stranger, a decent-seeming Irish-American man, who had done them no wrong. And despite the agonized protests of their amiable victim (in fact an actor, who received no electric shocks at all), Milgram’s volunteers were doing as they were told. They were pushing every switch in front of them, right past the one marked danger—severe shock, at which point the screaming man was falling eerily silent. People were willing to kill their neighbour. Clearly, then, in that summer nearly forty years ago, in Linsly-Chittenden Hall, Stanley Milgram was making a profound scientific discovery; he was unearthing something of lasting, shocking significance. Either that, or the young Herbert Winer was having his first experience of performance art.

 

In the spring of 1961, as Milgram was preparing for the experiment that would make his name and destroy his reputation, he made the journey from Yale to New York, met up with a new girlfriend, Alexandra Menkin, and took a little tour of Greenwich Village art galleries. ‘In one of them,’ he wrote to a friend a few days later, in an unpublished letter held at Yale, ‘we met some painters, and we all decided to applaud people as they passed on the street. It’s an amazingly effective way to draw people into the gallery, for as we stood there in the doorway applauding various passers-by, they were impelled to come over and ask why they were the object of so much enthusiasm. Thereupon we more or less shoved them into the gallery.’

His Greenwich Village girlfriend became his wife, and she now lives in the same apartment in Riverdale, New York, that she shared with her husband (who died in 1984), and their two children. There is a wide view of the Hudson river, and bold modern art bought on a honeymoon trip to South America. ‘He had a lot of energy,’ Alexandra Milgram said. ‘A very quick mind, a very good memory. Go into a museum with him, and you’d think you were just walking through to get to another part of the building, but he was taking in everything in the exhibit, every detail, and afterwards he would say, “Oh, did you notice this, did you notice that?” Where many people would stay and linger, he very seldom lingered.’ Mrs Milgram was once a dancer, and later trained as a social worker, giving help to Holocaust survivors. When we met, she had just become a grandmother for the first time, and she noted approvingly that her grandson had been given Stanley as a middle name rather than a first name. ‘I think it would be a burden to go around with the name Stanley Milgram,’ she said. ‘Too many people know.’

Stanley Milgram was an amateur librettist of musicals, a sketch artist, an amusing and lucid writer, a television producer manqué. He regarded himself as a Renaissance Man. He never closed the door to non-scientific pursuits. Growing up in the Bronx, the son of Jewish immigrants from Eastern Europe, he was active in his high school drama club, but also edited the school science magazine. That set the pattern for life. At university—first at Queens College, New York, then Harvard—he fell away from science to study political science, philosophy, music and art, then turned back again, impatient for ‘objective methods’. He finally had a career in a place somewhere between the objective and the subjective, at the ingenious, playful, street-theatre end of social psychology, the place in the American academy most likely to have a call for false beards and hidden microphones. (He was a great admirer of Candid Camera, a television show that became a network hit in 1960, around the time he was devising the obedience experiments for Linsly-Chittenden Hall.) He did a kind of science, but on at least one occasion, in Paris, his work—he called it ‘experimental invention’—was mistaken for conceptual art. He was perfectly serious, but because in later life he gave close attention to what people do in cities, and what cities do to people, his work sometimes has the flavour of stand-up comedy or Seinfeld storylines. (He did research into queue-barging, and how quickly one creates a crowd by pointing into the sky at nothing at all.) He relished scientific stage management—a role that suited a man keen to have an impact on his fellow Americans but not inclined to schmooze. Milgram could be awkward in company—sometimes caustic—but he wanted to be noticed. ‘My ideal experiment,’ he once wrote, describing work he had done into possible links between television violence and antisocial behaviour, ‘would have been to divide the country in half, remove all violence on television west of the Mississippi and include it east of the Mississippi, enforce laws that no one could move from one part of the country to the other, and then see what happens over a five- or ten-year period. It turned out not to be practical, so I had to work with what I had.’

The idea for Milgram’s most celebrated experimental invention came to him during the academic year 1959-60. He was at Princeton then, working for his mentor, the psychologist Solomon Asch, whose best-known work, a study of conformity and independence, was first described in 1951 in a paper called ‘Effects of Group Pressure upon the Modification and Distortion of Judgments’. In Asch’s experiment, a volunteer had been put in a room with people who appeared to be fellow volunteers, but who were in fact Asch’s confederates. The group was shown a number of lines drawn on pieces of card and asked to say which two were the same length; the volunteer was asked last. By arrangement, the confederates all gave the wrong answer. Frequently, so did the baffled volunteer, giving an answer he could clearly see was wrong.

Milgram once described his revelatory moment to an interviewer from the magazine Psychology Today. ‘I was trying to think of a way to make Asch’s conformity experiment more humanely significant,’ he said. ‘I was dissatisfied that the test of conformity was judgements about lines. I wondered whether groups could pressure a person into performing an act whose human import was more readily apparent, perhaps behaving aggressively toward another person, say by administering increasingly severe shocks to him. But to study the group effect you would also need an experimental control; you’d have to know how the subject performed without any group pressure. At that instant, my thought shifted, zeroing in on this experimental control. Just how far would a person go under the experimenter’s orders? It was an incandescent moment, the fusion of a general idea on obedience with a specific technical procedure. Within a few minutes, dozens of ideas on relevant variables emerged, and the only problem was to get them all down on paper.’

In the autumn of 1960, Milgram left Princeton to become an assistant professor at Yale, and he took the idea with him. In November of that year, according to papers in the university library, Milgram was claiming on expenses for electrical switches. In the same files there are photocopied passages from Lord of the Flies, as well as a note in his handwriting about a railroad accountant he had met called James McDonough: ‘This man would be perfect as a victim. He is so mild and submissive.’ Also handwritten, there is something that looks like the first draft of a Jerry Lewis script: ‘75. Ow. 90. Owch. 105. Ow. 110. Ow Hey! This really hurts. 135 OW—150. Ow. That’s all!! Get me out of here.’

By the following summer, Milgram was ready to start: he had been awarded a grant of nearly $25,000 from the National Science Foundation, and he had secured the services of a crew-cut first-year graduate student in social psychology called Alan Elms. He had recruited two amateur actors, including the mild and submissive Mr McDonough, and scripts had been written for them. With some flair, Milgram had designed fake electrical equipment, and he had arranged to borrow the sociology department’s Interaction Laboratory in Linsly-Chittenden Hall. According to Elms, who now teaches in the psychology department of the University of California, Davis, Milgram also disclosed his intentions to the local police, for fear that someone might tell them about ‘this crazy man who was forcing them to do bad things’.

In July 1961 he placed an advertisement in the New Haven Register. It read:

We will pay five hundred New Haven men to help us complete a scientific study of memory and learning…No special training, education, or experience is needed…We want: factory workers, city employees, laborers, barbers, businessmen, clerks, professional people, telephone workers, construction workers, salespeople, white-collar workers, others…All persons must be between the ages of 20 and 50. High school and college students cannot be used… You will be paid $4.00 (plus 50c carfare) as soon as you arrive at the laboratory.

 

Herbert Winer, then an assistant professor of forestry at Yale, saw the advertisement. ‘Four dollars, fifty cents—it attracted me. That’s all I can say,’ Mr Winer told me on our walk. So he clipped the coupon and was accepted for the experiment, and a few weeks later kept his appointment in the cobwebby basement of Linsly-Chittenden Hall. A young man in a grey laboratory coat, an experimenter, was waiting for him. He checked Mr Winer’s name on a list, and shook his hand. ‘He was a neutral sort of man,’ Winer said. ‘He was someone out of a toothpaste ad. You know: “We’re just here to keep your teeth clean and free from decay.”’ Another man, apparently a fellow volunteer, also appeared. He was a big amiable Irish-American man in glasses, eager to please. ‘He’s smiling. He says, “Hi, good to meet you.” He seemed like an ordinary chap like you might encounter on the street.’

Winer saw a big boxy machine on a desk; it had a horizontal line of thirty switches, ranging from fifteen volts to 450 volts. Above the switches was some printed text, reading slight shock at one end, and danger—severe shock towards the other, and beyond that, three crosses, xxx. ‘It was a voltage generator,’ said Winer. ‘Made by this outfit in Waltham, Massachusetts, which was a well-known area of electronics manufacturing. That label was not chosen by accident. It gave verisimilitude to the whole enterprise. You know, if it had been “Little Rock, Arkansas”, it wouldn’t have had the same weight. But “Waltham, Massachusetts”—that’s just where a psychological-shock generator could have been expected to originate.’

Speaking fast, the experimenter told Winer and the other man that they were participating in a study of ‘the effects of punishment on learning’. They were asked to choose from two slips of paper, to determine who would play the ‘teacher’ and who would play the ‘learner’. Mr Winer picked ‘teacher’. The experimenter then led them into a booth, off the main room, that contained a chair and an electric panel with four buttons. The chair, they were told, was wired to the generator. The ‘learner’—willing, smiling—was strapped into the chair, and electrodes were fixed to his wrists, so that he could just reach the line of buttons with his right hand.

The experimenter explained the rules of the experiment. Seated at the generator, in front of a microphone that carried his voice to the man in the booth, Winer was to read a list of word pairs: ‘strong arm’, ‘black curtain’ and so on. Then he was to go through the list again, this time giving the learner only the first word from each pair, along with four options to complete the pair: he would say ‘strong’, and then ‘back’, ‘arm’, ‘branch’ and ‘push’. The learner would choose what he thought was the right match by pressing one of the buttons, and Winer would see one of four lights come on in the main room. And now, still using an accelerated, featureless tone, the experimenter explained that if the learner gave a wrong answer, Winer would have to punish him with a jolt of electricity. Starting with fifteen volts, he was to increase the voltage for every mistake. ‘The ostensible idea,’ Winer explained, ‘was that maybe if he’s punished he’ll start paying more attention and—damn it—learn better, and stop thinking about next weekend’s ball game.’ The shocks might be painful, it was explained, but would cause ‘no permanent tissue damage’.

Winer was not given a chance to digest what he was hearing. ‘At no time during the experiment was there a pause or a break when anything could be raised,’ he said. The experimenter led Winer out of the booth back into the main room and they shut the door on the learner. Winer sat in front of the shock generator. The experimenter explained the routine for a wrong answer: the teacher was to say ‘wrong’, he was to read aloud the voltage about to be given, he was then to administer the punishment and give the correct word pair. ‘It is very important,’ he said, ‘that you follow this procedure exactly.’

The test began, and it was immediately clear that the learner was ‘not terribly bright. This guy was a very slow learner.’ He made his first error, and Mr Winer gave him fifteen volts.

‘Have you ever had a fifteen-volt shock?’ Winer asked me. ‘You would maybe notice a very mild tingling sensation, but nothing more than that. I’ve worked with electricity, and I know that fifteen, thirty, sixty volts, you’ll notice it, but when you get to over a hundred, it becomes not merely noticeable, but painful.’ The learner gave another wrong answer, and another, and Winer gradually began to move from shocks that he knew would cause tingling to those that he knew would cause pain. What, I asked, was he thinking? ‘I was just thinking: He didn’t get that. So up we go. Hope he does better next time.’

Before long, Winer heard reactions from the learner in the booth, behind the door—first groans, then spoken objections. Winer became uneasy. ‘There’s such a thing as retrospective wishful thinking,’ he told me. ‘But I would say it didn’t take very many groans and complaints before I started complaining.’ But the experimenter dismissed his objections, saying, ‘Please continue,’ and then, ‘The experiment requires that you continue.’ Winer was infuriated and his heart began to race, but for the moment he did as he was told.

The voltages increased, and now the learner was crying out in pain and pounding on the walls. At one point, he screamed: ‘You know, I’ve got a bad heart!’ And yet the experimental supervisor was unmoved. Winer was faced with a baffling contrast: on one hand, a distraught learner; on the other, an experimenter showing absolute calm. ‘There are some details I’ve forgotten,’ Winer told me. ‘But I’ll never forget that quiet voice: “The conditions of the experiment require that you continue.” He was just standing there with his arms folded, so to speak, saying, “I’m sorry, but the conditions of the experiment require that you continue.” But the guy’s getting hurt! He just kept repeating this mantra about the conditions of the experiment. I think I said, “I’ve done a few experiments in my day, but this is not fair,” something to that effect, but his response was unchanged. Finally I just got wound…like a spring that’s wound up to the point where it breaks. My heart rate was way, way up and I was feeling very annoyed and angry. Because I wanted to be obedient. I wanted to be cooperative. I’d forgotten about the four dollars and fifty cents. I knew a little about research and I didn’t want to screw things up, but at the same time, neither could I routinely inflict painful punishment… And finally, I just blew up, and said, “I’m sorry, I can’t continue.”’

The experiment was over. And now came the Candid Camera denouement. The learner walked back into the room, smiling, radiating good health and bonhomie, and Winer was told the truth. There had been no electric shocks. The learner had never been hurt; he had been in on the act. Winer learned the details later: the Irish-American man was, in fact, a railroad accountant, earning a dollar-seventy-five an hour to play his slow-witted role. The experimenter was not a lab technician, but a high-school teacher, earning two dollars an hour to wear a grey coat and a blank expression. The word ‘teacher’ had been written on both slips of paper.

Winer was debriefed, and the emphasis was on soothing words. He was told to have no misgivings about what had just occurred. ‘They said, “We appreciate you coming in, here’s the four dollars and fifty cents, and we hope that the results of the study will be of interest.”’ Full disclosure was withheld, for fear of compromising future volunteers. So Winer had to work it out for himself: the experiment, clearly, had not been about ‘learning and memory’. It had tested the willingness of one ordinary man to inflict pain on another. It had been an experiment, he realized, about obedience.

Winer cannot remember at which voltage he stopped, and does not want to remember. ‘I’ve blotted that out of my memory for perfectly good and understandable reasons. If it was a hundred volts, which I kind of doubt, I’d be very gratified. But I suspect it was higher than that. If it was over 150 volts I would be very, very ashamed, and yet it might be.’ But he did disobey, eventually, and he has taken quiet strength from the fact for forty years. ‘It’s not something I pay much attention to. I’m not making big advertising campaigns about my disobedience, but I think it’s fair to say that had I not disobeyed I would not have mentioned the subject again.’

His immediate instinct was to confront Stanley Milgram. A few days later, Winer stormed into Milgram’s office, and told him—’very plainly’—of his objections. He said he had respect for the ingenuity of the experiment, and he had quickly had a hunch about its political foundations—’Milgram was very Jewish, I was Jewish. We talked about this. There was obviously a motive beyond neutral research’—but he was appalled by Milgram’s use of medically unscreened subjects. ‘I wasn’t shrieking at him, but I was very serious.’ Milgram, in reply, said he was sure his subjects would not suffer heart attacks.

Winer’s anger was partially fed, he now thinks, by embarrassment that he had been so successfully duped. But he is not surprised by his gullibility. ‘Today, people will say, “Surely you weren’t foolish enough to get taken in by that?” And the answer to that is, “Go back to the early Sixties, when the credibility index was far higher.” I was a credulous naive assistant professor in forestry, and I make no apologies for having been taken in. It’s very hard to re-establish the framework of—for want of a better word—innocence which prevailed. This is a psychology experiment! At Yale! My goodness—research! That’s very important. Research! That’s good with a capital G.’

Today, Winer has respect for the ‘dazzling ingenuity’ of the experiments, and he keeps up with the Milgram literature, but he is still troubled by the medical risks that he thinks were taken, and by the knowledge that a kind of innocence was lost, a line of trust between academia and the rest of the world. ‘There was the time,’ Winer told me, ‘when Milgram went into a classroom and interrupted it with the news that President Kennedy had just been shot in Dallas. And one of the students stood up and said, “Cut it out, Milgram!” Any other person, he would have got a still, hushed silence.’

 

Milgram was not fully prepared for his results. In early trial experiments, which he had conducted the previous winter, Yale undergraduates had moved up the shock board with worrying ease, but Milgram thought that the student body must be a skewed sample, and that the clunking prototype ‘shock generator’ he had used was not fully convincing. He felt sure that, with a broader sampling of subjects drawn from the general New Haven area, and with more sophisticated-looking props, the experiment would show greatly reduced compliance. (When, later, he described his procedure to psychiatrists and asked them to guess at the outcome, they imagined that only a psychopathic fringe would give the highest shock on the shock board.)

Milgram was expecting to have to squeeze compliance out of his subjects. He was then going to take his experiment abroad—to Germany, for example—and make cross-cultural comparisons. But he was overwhelmed by the results in New Haven. Milgram saw compliance, and it was ‘terrifying and depressing’. In the standard form of the experiment, where the ‘teacher’ could hear thumping, but no cries of pain, sixty-five per cent of volunteers continued past the switches on the machine that read danger—severe shock. (Beyond this point, the learner made no response at all, as if he had fallen unconscious. Teachers were told to regard no response as a wrong answer, and to continue.) When the experiment was set up so that, like Herbert Winer, the teachers could hear the learners demanding to be set free, sixty-two per cent still obeyed all the way. When the learner was in the same room, forty per cent were still fully compliant.

The assistant in the experiment, Alan Elms, spent much of the summer of 1961 standing behind a one-way mirror with Stanley Milgram, who sometimes brought a film camera. They were, at once, the spectacle’s audience and its stage directors. ‘It was a combination of keeping an eye on things—we were watching to make sure the experimenter and learner performed their roles right—and at the same time being amazed and at times appalled by a subject’s behaviour,’ Elms explained when I spoke to him. When the first few subjects went way up the shock board, Elms wondered if they were anomalies. Day after day, others did the same. Many were apparently anguished by their own actions, but they did as they were told. One agitated subject said, as he continued to press the 450 volt switch again and again, ‘What if he’s dead in there? I don’t get no answer, no noise.’

From their hidden vantage point, Elms and Milgram watched with both dismay and amusement. ‘We weren’t just sitting there and sweating and saying, “Oh my God.” We were occasionally laughing at the unexpected behaviour of the subjects, particularly at some of their remarks. There were certainly comments back and forth between us as some subjects went higher and higher on the scale, and at times we were making informal bets as to whether he’s going to go all the way.’

Over nine months, Milgram varied the conditions: the learner made more noise or less noise; he had a heart condition or he did not; he was invisible, or visible, to the teacher. At times, other actors became involved, and they urged the teacher on, or urged restraint. The experimenter was put in another room, at the end of a telephone. The two actors, playing victim and experimenter, switched roles, so the victim became a younger, leaner, less lovably Irish figure. After a few months, the experiment changed locations: having started in the grand, Yale-soaked surroundings of the Interaction Laboratory in Linsly-Chittenden Hall, it was forced to move to the under-furnished basement of the same building, where Winer had his appointment (Elms says that Milgram resented the upheaval, but grew to see the experimental advantage of shabbier surroundings). Eventually, the experiment moved to unprepossessing offices above a shop in Bridgeport, close to Yale but with no ostensible connection to the university. Some variations were rejected. According to Elms, Milgram also considered ways to use husband and wife teams as learner and teacher, but knew he might never be forgiven for that.

There were interesting variations in the results. Obedience was lessened by putting teacher and learner close together, and it was lessened—to a modest degree—by non-academic surroundings. The religion and gender of the volunteers seemed insignificant, but the more educated they were, the sooner they disobeyed. And in a curious commentary on the Asch research that had originally inspired Milgram, when a volunteer was placed between two fellow ‘teachers’ (in fact, actors) who refused to go on, he found greater strength to disobey.

But there was nothing encouraging in the experiment’s central demonstration. The psychiatrists canvassed by Milgram were entirely wrong: it was not a psychopathic fringe that would push all thirty switches to the end of the board. According to Milgram’s experiments, a majority of Americans were willing to do so. In September 1961, just a few months into the experiment, Milgram wrote to his financial backers at the National Science Foundation: ‘In a naive moment some time ago,’ he told them, ‘I once wondered whether in all of the United States a vicious government could find enough moral imbeciles to meet the personnel requirements of a national system of death camps, of the sort that were maintained in Germany. I am now beginning to think that the full complement could be recruited in New Haven.’

 

Milgram had a world exclusive. He had caught evil on film. He had invented a kind of torture machine. But it was not immediately clear what he should do with his discovery. When he began the study, he had no theory, nor was he planning to test another man’s theory. His idea had sprung from contemplation of Solomon Asch, but the ‘incandescent’ moment at Princeton was a shift away from theory into experimental practice. He had had an idea for an experiment. Now, he was in an odd situation: he had caused something extraordinary to happen, but, technically, his central observation counted for nothing. With no provocation, a New Haven man had hit a fellow citizen with 450 volts. To the general observer, this will come as a surprise, but it is not a social scientific discovery, as Edward E. Jones, the distinguished editor of the Journal of Personality, made clear to Milgram when he declined the invitation to publish Milgram’s first paper. ‘The major problem,’ Jones wrote to Milgram, ‘is that this is really the report of some pilot research on a method for inducing stress or conflict…your data indicate a kind of triumph of social engineering…we are led to no conclusions about obedience, really, but rather are exhorted to be impressed with the power of your situation as an influence context.’ The Journal of Abnormal and Social Psychology also rejected the paper on its first submission, calling it a ‘demonstration’ rather than an experiment.

Milgram had described only one experimental situation. When he resubmitted the paper to the same journal, this time including experimental variables, it was accepted for publication. In the rewrite, Milgram put the emphasis on the way in which differences in situation had caused differences in degrees of obedience: the closer the learner to the teacher, the greater the disobedience, and so on. These details were later lost as the experiment moved out of social psychology into the larger world. But it could hardly have happened otherwise. The thought that people were zapping each other in a Yale laboratory is bound to be more striking than the thought that zapping occurs a little less often when one is looking one’s victim in the eye. The unscientific truth, perhaps, is that the central comparison in Milgram’s study is not between any two experimental variables: it is between what happened in the laboratory, and what we thought would happen. The experimental control in Milgram’s model is our hopelessly flawed intuition.

‘Somehow,’ Milgram told a friend in 1962, ‘I don’t write as fast or as easily as I run experiments. I have done about all the experiments I plan to do on Obedience, am duly impressed with the results, and now find myself acutely constipated.’ Milgram found it hard to knock the experiment into social scientific shape. It would be another decade before he incorporated his findings into a serious theory of the sources of human obedience. When he did so, in the otherwise absorbing and beautifully written book Obedience to Authority (1974), his thoughts about an ‘agentic state’—a psychological zone of abandoned autonomy—were not widely admired or developed by his peers, not least because they were so evidently retrospective. Most readers of Obedience to Authority are more likely to take interest in the nods of acknowledgement made to Arthur Koestler’s The Ghost in the Machine, and to Alex Comfort, the English anarchist poet, novelist, and author of The Joy of Sex. Most readers will take more pleasure—and feel Milgram took more pleasure—in the novelistic and strikingly unscientific descriptions of his experimental subjects. (‘Mrs Dontz,’ he wrote, ‘has an unusually casual, slow-paced way of speaking, and her tone expresses constant humility; it is as if every assertion carries the emotional message: “I’m just a very ordinary person, don’t expect a lot from me.” Physically, she resembles Shirley Booth in the film Come Back, Little Sheba.’)

But while Milgram was struggling to place his findings in a proper scientific context, they seemed to have found a natural home elsewhere. Stanley Milgram—a young social psychology professor at the start of his career—appeared to be in a position to contribute to one of the late twentieth century’s most pressing intellectual activities: making sense of the Holocaust. Milgram always placed the experiments in this context, and the figure of Adolf Eichmann, who was seized in Buenos Aires in the spring of 1960, and whose trial in Jerusalem began a year later, loomed over his proceedings. (In a letter that urged Alan Elms to keep up the supply of experimental volunteers, Milgram noted that this role bore ‘some resemblance to Mr Eichmann’s position’.) The trial, as Peter Novick has recently written in The Holocaust in American Life, marked ‘the first time that what we now call the Holocaust was presented to the American public as an entity in its own right, distinct from Nazi barbarism in general’. When Milgram published his first paper on the obedience studies in 1963, Hannah Arendt’s articles about the trial had just appeared in the New Yorker, and then in her book Eichmann in Jerusalem, giving widespread currency to her perception of ‘the banality of evil’. Milgram put Eichmann’s name in the first paragraph of his first obedience paper, and so claimed a place in a pivotal contemporary debate. His argument was this: his study showed how ordinary people are surprisingly prone to destructive obedience; the crimes of the Holocaust had been committed by people obeying orders; those people, therefore, could now be thought ordinary. The argument had its terrifying element and its consoling element: according to Milgram, Americans had to see themselves as potential murderers; at the same time we could understand Nazis to be no more unusual than any New Haven guy in a check shirt.

It may seem bizarre now: Milgram returned to ordinary Nazis their Nuremberg defence, nicely polished in an American laboratory. But the idea struck a chord, and news quickly spread of Milgram’s well-meaning, all-American torturers. ‘Once the [Holocaust] connection was in place,’ said Arthur G. Miller, a leading Milgram scholar, ‘then the experiments took on a kind of a larger-than-life quality.’ Milgram’s work was reported in the New York Times (65% in test blindly obey order to inflict pain), and the story was quickly picked up by Life, Esquire, ABC television, UPI and the British press. The fame of the experiments spread, and as the Sixties acquired their defining spirit, Holocaust references were joined by thoughts of My Lai; this was a good moment in history to have things to say about taking orders. By the time Milgram had published his book and released a short film of the experiment, his findings had spread into popular culture, and into theological, medical, and legal discussions. Thomas Blass, a social psychologist at the University of Maryland, Baltimore County, who is preparing a Milgram biography, has a large collection of academic references, including a paper in the context of accountancy ethics. (Is it unthinking obedience that causes accountants to act unlawfully on behalf of clients?) Outside the academy, Dannie Abse published an anti-Milgram play, The Dogs of Pavlov, in 1973, and two years later, in America, CBS broadcast a television movie, The Tenth Level, that made awkward melodrama out of the obedience experiments, and starred William Shatner as a spookily obsessed and romantically disengaged version of Professor Milgram. (‘You may know your social psychology, Professor, but you have a lot to learn about the varieties of massage.’) Peter Gabriel sang ‘We Do What We’re Told (Milgram’s 37)’ in 1986. And there would be more than a whiff of Milgram in the 1990 episode of The Simpsons, ‘There’s No Disgrace Like Home’, in which the family members repeatedly electrocute one another until the lights across Springfield flicker and dim. Last year, ‘The Stanley Milgram Experiment’—a comedy sketch duo—made its off-off-Broadway debut in New York. Robbie Chafitz, one of the pair, had been startled and amused by the Milgram film as a teenager, and had always vowed to use the name one way or another. Besides, as he told me, ‘anything with electricity and people is funny’.

But however celebrated the experiments became, there was a question they could never shake off. It was an ethical issue: had Stanley Milgram mistreated his subjects? Milgram must have seen the storm coming, at least from the moment when Herbert Winer marched into his office, talking of heart attacks. In the summer of 1962, other subjects recorded their feelings about the experiment in response to a questionnaire sent out by Milgram along with a report explaining the true purpose of the experiment. Replies were transferred on to index cards and are now held—unpublished and anonymous—at Yale. ‘Since taking part in the experiment,’ reads one card, ‘I have suffered a mild heart attack. The one thing my doctor tells me that I must avoid is any form of tension.’ Another card: ‘Right now I’m in group therapy. Would it be OK if I showed this report to [the] group and the doctors at the clinic?’

Since then, the experiment has been widely attacked from within the profession and from outside. To many, Milgram became a social psychological demon; Alan Elms has met people at parties who have recoiled at the news that he was a Milgram lieutenant. The psychologist Bruno Bettelheim described Milgram’s work as ‘vile’ and ‘in line with the human experiments of the Nazis’. In his defence, Milgram would always highlight the results of post-experimental psychological studies—which had reported ‘no evidence of any traumatic reactions’—and the fact of the debriefings in Linsly-Chittenden Hall, in which care had been taken to give obedient subjects reasons not to feel bad about themselves. They were told to remember, for example, that doctors routinely hurt people in a thoroughly good cause. (Alan Elms wonders if this debriefing was too effective, and whether subjects should have been obliged to confront their actions more fully.)

But Milgram never quite won the ethical argument. And the controversy was immediately damaging to his career. Someone—perhaps a Yale colleague, according to Thomas Blass—quickly brought the experiment to the attention of the American Psychological Association, and Milgram’s application for APA membership was delayed while the case against him was considered. Today, although the APA is happy to include Milgram’s shock generator in a travelling psychology exhibition, it is careful to describe the experiments as ‘controversial’ in its accompanying literature. As the APA points out, modern ethical guidelines (in part inspired by Milgram) would prevent the obedience studies from being repeated today.

The controversy followed him. In 1963 Milgram left Yale for Harvard. He was happy there. This is where his two children were born. And when a tenured job came up, he applied. But he needed the unanimous support of his colleagues, and could not secure it. He was blackballed by enemies of the obedience work. (According to Alexandra Milgram, her husband once devised a board game based on the tenure of university professors.) The late Roger Brown, a prominent Harvard psychologist, told Thomas Blass that there had been those in the department who thought of Milgram as ‘sort of manipulative, or the mad doctor. They felt uneasy about him.’

So in 1967 Stanley Milgram left Harvard to become head of the social psychology programme in the psychology department in the Graduate Center of the City University of New York (CUNY). In one sense, it was a promotion; he was a full professor at thirty-three. ‘But after Yale and Harvard, it was the pits,’ said Milgram’s friend and fellow social psychologist, Philip Zimbardo. ‘Most people I know who didn’t get tenure, it had a permanent effect on their lives. You don’t get to Yale or Harvard unless you’ve been number one from kindergarten on, you’ve been top—so there’s this discontinuity. It’s the first time in your life you’ve failed. You’re Stanley Milgram, and people all over the world are talking about your research, and you’ve failed.’ Milgram was the most cited man in social psychology—Roger Brown, for example, considered his research to be of ‘profound importance and originality’—yet in later life, he was able to tell Zimbardo that he felt under-appreciated.

 

The ethical furore preyed on Milgram’s mind—in the opinion of Arthur G. Miller, it may have contributed to his premature death—but one of its curious side effects was to reinforce the authenticity of his studies in the world outside psychology departments. Among those with a glancing knowledge of Milgram, mistreatment of experimental subjects became the only Milgram controversy. The studies remained intellectually sound, a minor building block of Western thought, a smart conversational gambit at cocktail parties. ‘People identified the problem with Milgram as just a question of ethics,’ says Henderikus Stam, of the University of Calgary in Canada, who trained as a social psychologist, but who lost faith and is now a psychological theoretician and historian. ‘So in a way people never got beyond that. Whereas there’s a deeper epistemological question, which is: what can we actually know when we’ve done an experiment like that, what are we left with? What have we learned about obedience?’

Within the academy, there was another, quieter, line of criticism against Milgram: this was methodological. In a paper in 1968 the social psychologists Martin Orne and Charles Holland raised the issue of incongruity, pointing out that Milgram’s subjects had been given two conflicting pieces of information: a man in apparent danger, and another man—a man in a lab coat—whose lack of evident concern suggested there was no danger. It seemed possible that obedient subjects had believed the more plausible piece of information (no danger), and thus concluded, at some conscious or semi-conscious level, that the experiment was a fake, and—in a ‘pact of ignorance’—been generous enough to role-play for the sake of science. In other words, they were only obeying the demands of amateur dramatics.

Perhaps forgetting that people weep in the theatre, Milgram argued that the subjects’ signs of distress or tension—the twitching and stuttering and racing heartbeats—could be taken as evidence that they had accepted the experiment’s reality. He also drew upon the questionnaire he had sent out in 1962, in which his volunteers—now entirely in the know—had been asked to agree with one of five propositions, running from ‘I fully believed the learner was getting painful shocks’ to ‘I was certain the learner was not getting the shocks’. Milgram was pleased to note that three-quarters of the subjects said they believed the learner was definitely or probably getting the shocks. (He added, reasonably, ‘It would have been an easy out at this point to deny that the hoax had been accepted.’)

Herbert Winer reports that he was fully duped, and Alan Elms told me that, watching through the mirror during the summer of 1961, he saw very little evidence of widespread disbelief. But it is worth pointing out that Milgram could have reported his questionnaire statistics rather differently. He could have said that only fifty-six per cent accepted his first proposition: ‘I fully believed the learner was getting painful shocks’. Forty-four per cent of Milgram’s subjects claimed to be at least partially unpersuaded. (Indeed, on his own questionnaire, Winer said he had some doubts.) These people do not have much of a presence in Milgram’s writings, but you catch a glimpse of them in the Yale Library index cards. One reads: ‘I was quite sure “grunts and screams” were electrically reproduced from a speaker mounted in [the] students’ room.’ (They were.) ‘If [the learner] was making the sounds I should have heard the screams from under the door—which was a poorly fit [sic] thin door. I’m sorry that I didn’t have enough something to get up and open this door. Which was not locked. To see if student was still there.’ On another card: ‘I think that one of the main reasons I continued to the end was that…I just couldn’t believe that Yale would concoct anything that would be [as] dangerous as the shocks were supposed to be.’ Another subject had noticed how the experimenter was watching him rather than the learner. Another hadn’t understood why he was not allowed to volunteer to be the learner. And another wrote, ‘I had difficulty describing the experiment to my wife as I was so overcome with laughter—haven’t had such a good laugh since the first time I saw the 4 Marx Bros—some 25 years ago.’

For an experiment supposed to involve the undeserved torture of an innocent Irish-American man, there was a lot of laughter in Yale’s Interaction Laboratory. Frequently, Milgram’s subjects could barely contain themselves as they moved up the shock board. (‘On one occasion,’ Milgram later wrote, ‘we observed a seizure so violently convulsive that it was necessary to call a halt to the experiment.’) Behind their one-way mirror, Milgram and Elms were at times highly amused. And when students are shown the Milgram film today, there tends to be loud laughter in the room. People laugh, and—despite the alleged revelation of a universal heart of darkness—they go home having lost little faith in their friends and their families.

According to Henderikus Stam, the laughter of the students, and perhaps that of the subjects, is a reasonable response to an absurd situation. It’s a reaction to the notion that serious and complex moral issues, and the subtleties of human behaviour, can reasonably be illuminated through play-acting in a university laboratory. The experiment does nothing but illuminate itself. ‘What it does is it says, “Aren’t we clever?” If you wanted to demonstrate obedience to authority wouldn’t you be better showing a film about the Holocaust, or news clips about Kosovo? Why do you need an experiment, that’s the question? What does the experiment do? The experiment says that if we really want to know about obedience to authority we need an abstract representation of that obedience, removed from all real forms of the abuse of authority. But what we then do is to use that representation to refer back to the real historical examples.’

What happens when we refer back to historical examples? Readers of Hitler’s Willing Executioners, Daniel Jonah Goldhagen’s study of the complicity of ordinary German citizens in the Holocaust, will learn within one paragraph of a German policeman, Captain Wolfgang Hoffmann, a ‘zealous executioner of Jews’, who ‘once stridently disobeyed a superior order that he deemed morally objectionable’. The order was that he and members of his company should sign a declaration agreeing not to steal from Poles. Hoffmann was affronted that anyone would think the declaration necessary, that anyone would imagine his men capable of stealing. ‘I feel injured,’ he wrote to his superiors, ‘in my sense of honour.’ The genocidal killing of thousands of Jews was one thing, but plundering from Poles was another. Here was an order to which he was opposed, and which he felt able to disobey.

Goldhagen is impatient with what he calls ‘the paradigm of external compulsion’, which sets the actions of the Holocaust’s perpetrators in the context of social-psychological or totalitarian state forces. His book aims to show how the crimes of the Holocaust were carried out by people obeying their own consciences, not blindly or fearfully obeying orders. ‘If you think that certain people are evil,’ he told me, ‘and that it’s necessary to do away with them—if you hate them—and then someone orders you to kill them, you’re not carrying out the deed only because of the order. You’re carrying it out because you think it’s right. So in all those instances where people are killing people they hate—their enemies or their perceived enemies—then Milgram is just completely inapplicable.’

Goldhagen wonders if the Milgram take on the Holocaust met a particular need, during the Cold War, for America’s new German allies ‘to be thought well of’. He also wonders if, by robbing people of their agency, ‘of the fact that they’re moral beings’, the experiment tapped into the kind of reductive universalism by which, he says, Americans are easily seduced—the belief that all men are created equal, and in this case equally obedient. Goldhagen has no confidence in the idea that Milgram was measuring obedience at all. The experimental conditions did not properly control for other variables, such as trust, nor did they allow for the way decisions are made in the real world—over time, after consultation. Besides, said Goldhagen, in a tone close to exasperation, ‘people disobey all the time! Look around the world. Do people always pay all their taxes? Do what their bosses tell them? Or quietly accept what any government decides? Even with all kinds of sanctions available, one of the greatest problems that institutions face is to get their members to comply with rules and orders.’ Milgram’s findings, he says, ‘are roundly, repeatedly and glaringly falsified by life’.

In the opinion of Professor Stam, this comes close to defining the problems of social psychology itself. It is a discipline, he says, that makes the peculiar claim that ‘if you want to ask questions about the social world, you have to turn them into abstract technical questions’. The Milgram experiment, he says, ‘has the air of scientificity about it. But it’s not scientific, it’s…scientistic.’

 

And there is Milgram’s problem: he devised an intensely powerful piece of tragicomic laboratory theatre, and then had to smuggle it into the faculty of social science. His most famous work—which had something to say about trust, embarrassment, low-level sadism, willingness to please, exaggerated post-war respect for scientific research, the sleepy, heavy-lidded pleasure of being asked to take part, and, perhaps, too, the desire of a rather awkward young academic to secure attention and respect—had to pass itself off as an event with a single, steady meaning. And that disguise has not always been convincing. It’s odd to hear Arthur G. Miller—one of the world’s leading Milgram scholars—acknowledge that there have been times when he has wondered, just for a moment, if the experiments perhaps mean nothing at all.

But the faculty of social psychology is not ready to let Milgram go. And there may be a new way to rescue the experiments from their ungainly ambiguity. This is the route taken by Professors Lee Ross and Richard E. Nisbett (at Stanford and the University of Michigan respectively), whose recent synthesis of social psychological thinking aims to give the subject new power. According to Professor Ross, the experiments may be ‘performance’, but they still have social psychological news to deliver. If that is true, then we can do something that the late professor was not always able to do himself: we can make a kind of reconciliation between the artist and the scientist in Stanley Milgram.

Ross and Nisbett find a seat for Stanley Milgram at social psychology’s high table. They do this slyly, by taking the idea of obedience—Milgram’s big idea—and putting it quietly to one side. When Ross teaches Milgram at Stanford, he makes a point of giving his students detailed instructions on how to prepare for the classes—instructions that he knows will be thoroughly ignored. He is then able to stand in front of his students and examine their disobedience. ‘I asked you to do something that’s good for you rather than bad for you,’ he tells them. ‘And I’m a legitimate authority rather than an illegitimate one, and I actually have power that the Milgram experimenter doesn’t have. And yet you didn’t obey. So the study can’t just be about obedience.’ What it is primarily about, Ross tells his students—and it may be about other things too—is the extreme power of a situation that has been built without obvious escape routes. (As Herbert Winer said: ‘At no time was there a pause or a break when anything could be raised…’) ‘There was really no exit,’ Ross told me, ‘there was no channel for disobedience. People who were discomforted, who wanted to disobey, didn’t quite know how to do it. They made some timid attempts, and it got them nowhere. In order to disobey they have to step out of the whole situation, and say to the experimenter, “Go to hell! You can’t tell me what to do!” As long as they continue to function within that relationship, they’re asking the experimenter for permission not to give shocks, and as long as the experimenter denies them that permission, they’re stuck. They don’t know how to get out of it.’ Ross suspects that things would have turned out very differently given one change to the situation. It’s a fairly big change: the addition of a prominent red button in the middle of the table, combined with a clearly displayed notice signed by the ‘Human Subjects’ Committee’ explaining that the button could be pressed ‘by any subject in any experiment at any time if he or she absolutely refuses to continue’.

According to Ross and Nisbett (who are saying something that Milgram surely knew, but something he allowed to become obscured), the Obedience Experiments point us towards a great social psychological truth, perhaps the great truth, which is this: people tend to do things because of where they are, not who they are, and we are slow to see it. We look for character traits to explain a person’s actions—he is clever, shy, generous, arrogant—and we stubbornly underestimate the influence of the situation, the way things happened to be at that moment. So, if circumstances had been even only subtly different (if she hadn’t been running late; if he’d been told the film was a comedy), the behaviour might have been radically different. Under certain controlled circumstances, then, people can be induced to behave unkindly: to that extent, Milgram may have something to say about a kind of destructive obedience. But under other circumstances, Professor Ross promised me, the same people would be nice. Given the correct situation, he said, we could be led to do ‘terrifically altruistic and self-sacrificing things that we would never have agreed to before we started’.

So the experiment that has troubled us for nearly forty years (that buzzing and howling), and which caused Milgram to have dark thoughts about America’s vulnerability to fascism, suddenly has a new complexion. Now, it is about the influence of any situation on behaviour, good or bad: ‘You stop on the highway to help someone,’ Professor Ross said, ‘and then the help you try to give doesn’t prove to be enough, so you give the person a ride, and then you end up lending them money or letting them stay in your house. It wasn’t because that was the person in the world you cared about the most, it was just one thing led to another. Step by step.’

That’s the Milgram situation. ‘We can take ordinary people,’ Ross said, ‘and make them show a degree of obedience or conformity—or for that matter altruism or bravery, whatever—to a degree that we would normally assume you would only see in the rare few. And that’s relevant to telling us what we’re capable of making people do, but it also tells us that when we observe the world, we are often going to be making an attribution error, because lots of times, the situational factors have been opaque to us, and therefore we are making erroneous inferences about people. The South African government says, “Can we deal with this fellow Mandela?” and the answer is, “No, he’s a terrorist.” But a social psychologist would say, “Mandela, in one context, given one set of situations, was a terrorist.”’ According to Ross, that’s the key lesson of social psychology; that’s how the discipline can be useful in education, the workplace, and law. ‘Our emphasis,’ he says, ‘should be on creating situations that promote what we want to promote, rather than searching endlessly for the right person. Don’t assume that people who commit atrocities are atrocious people, or people who do heroic things are heroic. Don’t get overly carried away; don’t think, because you observed someone under one set of discrete situational factors, that you know what they’re like, and therefore can predict what they would do in a very different set of circumstances.’

 

It’s hard not to think of Stanley Milgram in another set of circumstances—to imagine the careers he did not have in films or in the theatre, and to wonder how things would have turned out if his work had appeared at another time, or had been read a little differently. It may now be possible to place the Obedience Experiments somewhere near the centre of the social psychological project, but that’s not how it felt in the last years of Milgram’s life. He had failed to secure tenure at Harvard. Disappointed, he moved to New York, assuming he would soon be leaving again, to take up a post at a more glamorous institution. But he was still at CUNY seventeen years later, at the time of his premature death. ‘He had hoped it would be just for five years,’ Alexandra Milgram told me. ‘But things got much more difficult to move on to other places. You were glad to have what you had. And he was happy to do the work that he did. I don’t think he was as happy at the university as he was at, say, Harvard, but he was a very independent person: he had his ideas, he had his research.’

The research pushed Milgram into a kind of internal exile. Confirming his reputation as social psychology’s renegade, he pursued work that, although often brilliantly conceived and elegantly reported, could look eccentric and old-fashioned to colleagues, and that ran the risk of appearing to place method ahead of meaning. ‘It would flash and then burn out,’ says Professor Miller, ‘and then he’d go on to something else.’ He sent his (young, able-bodied) students on to the New York subway to ask people to give up their seats. He co-wrote a paper about Candid Camera‘s virtues as an archive for students of human behaviour. Pre-empting the play Six Degrees of Separation, he studied the ‘small world’ phenomenon, investigating the chains of acquaintance that link two strangers. He took photographs of rail commuters and showed them to those who travelled on the same route, to explore the notion of the ‘familiar stranger’. In an expensive, elaborate, and ultimately inconclusive experiment in 1971, he explored the links between antisocial acts seen on television and similar acts in real life by getting CBS to produce and air two versions of a hit hospital drama, Medical Center. He asked students to try to give away money on the street. He tested how easy it was for people to walk between a pavement photographer and his subject. And when he was recuperating from one of a series of heart attacks, he made an informal study of the social psychology of being a hospital patient. He was only fifty-one when he died.

Once, shortly before the Obedience Experiments had begun, Milgram had written from Yale about his fear of having made the wrong career move. ‘Of course,’ he told a friend, ‘I am glad that the present job sometimes engages my genuine interests, or at least, a part of my interests, but there is another part that remains submerged and somehow, perhaps because it is not expressed, seems most important.’ He described his routine: pulling himself out of bed, dragging himself to the lecture room ‘where I misrepresent myself for two hours as an efficient and persevering man of science… I should not be here, but in Greece shooting films under a Mediterranean sun, hopping about in a small boat from one Aegean isle to the next.’ He added, in a spirit of comic self-laceration, ‘Fool!’

 



Ian Parker is a British writer living in New York and a staff writer for the New Yorker.
