Master's Series on Field Research
A series of interviews with major figures in field research conducted in the early 1980s by Peter Blanck
Transcript of an interview with Dave McClelland

Peter Blanck: This is Dave McClelland, on January 29th, 1982. And we'll just start with some of your ideas about field research that you just mentioned - what is appropriate and so forth?
Dave McClelland: Well, I think the easiest way to describe when field research is appropriate and important to do is to talk about a project of mine which illustrates almost every aspect of why you get into field research, how you go about doing it, and what kinds of problems you run into when you do it. I had been working for a number of years on what started out as strictly a laboratory problem. In this respect, I'm a typical psychologist: I like to work in a small compass, in a laboratory, finding out what the relationships between variables are when I've got everything under control, which is the opposite of field research. In this case, I was working with a motivational disposition which we called the need to achieve, or "n-ach," or need for achievement; and we had done a lot of lab research in which we had discovered that if we measured it in a certain way - by content analysis of thought samples, or written stories that people wrote - we could predict that people who scored high in the need to achieve, or whose stories were loaded with achievement imagery, behaved in certain very characteristic ways. And that's the traditional psychological approach, that is, you take the people who are high in the need and the people who are low in the need, and you compare and contrast them on a dozen different characteristics. In the course of doing this, we discovered that they had certain characteristics that should suit them to be good entrepreneurs, that is, they took moderate risks. Typically, they preferred moderately difficult tasks, not the very easy ones or the very difficult ones. They were interested in feedback on how well they were doing; they liked to take personal responsibility for what they did, and so on. And these seemed to be precisely the characteristics that good entrepreneurs should have. And at this point, we kind of linked it up with Max Weber's theory in The Protestant Ethic and the Spirit of Capitalism, because he had described a new spirit, as he called it, of capitalism that the new Protestants were infusing into business enterprise. And his description sounded very much like people who are high in the need to achieve, or high in n-ach. Well, we had worked out this theory very, very carefully. First of all, we had moved into the field in a small way - we had discovered that heads of small businesses who were high on the need to achieve were, in fact, better businessmen than people low in the need to achieve. You could say that was a kind of checking in the field of a hypothesis that you had developed in the laboratory, namely that people who are high on the need to achieve would make good entrepreneurs. But the project I want to describe went far beyond that. In my book, The Achieving Society, I argued further that if a country as a whole, or a whole group of people, were high on the need to achieve, you would turn up a lot of these better businessmen, and they, collectively, would produce a more rapid rate of economic growth for that region, or that country, or that nation, or what have you.
And having done a lot of historical research, and cross-national research, and so on, that seemed to demonstrate there was this connection, it nevertheless occurred to me that it would be awfully nice if we could demonstrate that it really works this way by trying to create an achieving society, that is, to go into a small community, and teach achievement motivation to some of the business leaders in the community, and then see if subsequently that community developed more rapidly economically than a comparable community where the leaders were not taught achievement motivation. That was the idea. Okay. That's a fairly tall order. That then demands selection of a site, getting access to the site, picking people who are going to do the training, and so on. You see, field research becomes necessary at this point to sort of validate a hypothesis that you developed out of the laboratory, and out of the library. You want to see if things happen in the real world the way they are supposed to happen according to your theory. Well, this was back in the early 1960s, and at that time, there was a lot of interest in helping third world countries, or underdeveloped countries, to develop more rapidly; and always in doing research, you go where the money is. And it looked as if there was some money available to fund projects that had some potential for increasing the rate of economic development in underdeveloped countries overseas, through the Agency for International Development, down in Washington. And we talked to them about it, and it looked as if they were going to give us some money. In the long run, they didn't; that's another story, and that's also typical of applied research. Maybe I should take a minute to explain why we didn't get money from them, because it also illustrates the difficulties of doing field research: the world out there doesn't always understand why you're doing it. We had applied for funds, and they had told us that they were going to give us the funds, and we were planning to carry out some of this achievement motivation training in three different sites, one in India, one in southern Italy - a very impoverished part of Italy - and the third in North Africa, in Tunisia. And in all three places, we had preliminary plans to start training as soon as the grant money came through. Well, I took a leave of absence from Harvard, rented my house, sold my car, and got on a plane to fly out to India. The day before I left, I still didn't have the contract signed, although they had assured me there was no problem, so I called Washington and said, "Look, my flight leaves tomorrow. Shall I go, or shall I not go? I haven't got the money to pay for this trip." And they said, "Oh, go. That's fine. That's going to be signed in no time, and we'll call you in Hawaii and let you know." So I got to Hawaii. They called me and they said, "Sorry, it isn't quite signed yet. It'll be another day. We'll call you in Japan." So they called me in Japan, and they said, "Still not signed; another day. And we'll call you in Hong Kong." They missed me in Hong Kong. By the time I got to India, I got a "Dear John" letter from the Director of A.I.D., David Bell, a former colleague of mine at Harvard, saying, "Dear Dave, so sorry, but we can't sign your contract after all. It would endanger the whole A.I.D. research program.
It has nothing to do with you, or the quality of your research; it's just that we can't do it for political reasons." And he didn't explain what they were, but I found out years later that what had happened was a left-over from the McCarthy period, that is, Sen. Joseph McCarthy, in the 1950s, had been running around finding Communists everywhere in the U.S. government. One of his colleagues had been left in the government as Inspector General in the State Department, a man named Mansfield, and he and his colleagues, or operatives, or whatever you call them, had discovered evidence that they thought indicated that I was anti-Catholic, because I was talking about the Protestant ethic and the rise of capitalism, and that seemed anti-Catholic to them. And we had a Catholic president for the first time, President Kennedy, so Mansfield told Otto Passman, the Chairman of the House Committee on Foreign Affairs - they were having great difficulties anyway with trying to get appropriations for A.I.D. - that here was a project they were about to fund which was anti-Catholic, and he should stop it. So he did; I didn't get any money. I ended up in India without any money, and no visible means of support. Well, I was bailed out a little bit by the Ford Foundation, and a little bit by the Carnegie Foundation. And we did manage to get a small project going in India, much smaller than we had originally planned, through the Ford Foundation office in India. Again, I was able to do that because the head of it was an old friend of mine, Douglas Ensminger, who had worked for the Ford Foundation there since the days when I had worked for the Ford Foundation. That also tells you something about doing field research: you really rely on a network of friends and former colleagues, because it's very difficult to begin from scratch. Well, the Indian branch of the Ford Foundation put me in touch with a small industries extension training institution in Hyderabad. It was a government institution, the purpose of which was to promote the development of small businessmen - just exactly what I was looking for. And it happened that an American engineer by the name of Stapanik, Joe Stapanik, got very sold on my type of training. He thought it was terrific and just what this institution needed, because they had an institute and they didn't know how to train people. So we set up plans to get the institution's personnel to train Indian businessmen in achievement motivation through this institute. So we had an institute to work through, and that was through my connections. Well, then the question was, where are we going to do this? Well, India is a big country, and having an experimental background, I still liked the idea of two comparable cities, about the same size. They should be small and relatively distinct. We picked several possible sites throughout India, and we hired somebody to investigate them, to see if there was anything special about them. We picked two cities in Andhra Pradesh in the end to start out the project. In fact, we had plans for three pairs of cities, but we only really completed the first pair and sort of part of the second pair. The first pair were two cities, Rajahmundry and Kakinada, about forty miles apart.
One is a river port, the other is a seaport; both the same size, the same distribution of working populations, the same language, the same cultural group so far as possible. We thought of it like a controlled experiment: we were going to inject a certain amount of achievement motivation into the leadership of one town, and not the other town. Well, of course, that's okay, but then the problem is how do you get access to the people, how do you get them in the tent, so to speak. Well, now, this is an interesting problem because again, social scientists are not used to being salesmen. They typically go around observing other people. Some of my unkind friends say that social scientists and anthropologists are really just Peeping Toms. They just like to sit around and watch what other people do. Well, in this case, we actually had to recruit people, and sell them on the notion that some kind of training was going to be good for them, to get them in the tent, and we had to get key people in the tent. Well, this is obviously the kind of project that is very difficult to do with university personnel in the first place. You can't do this with graduate students. Graduate students are not credible. They may be good field observers, something like that, flies on the wall. But what we had to do, say, in Kakinada was have people get up in front of the Chamber of Commerce, the Rotary Club, the Lions Club - they have all those organizations, fortunately, in India - and make speeches, and again, fortunately, in English, selling people on the notion of coming to the institute to get this training. Well, we couldn't use graduate students, couldn't use other faculty, so basically, we had to hire people from the real world who had some experience as consultants in training. And we had to hire Indian personnel - we needed both, obviously. If only Indians had done it, the audience, I think, would not have accepted it as something really new and different. If only Americans had done it, they wouldn't have accepted it because they would have felt that we didn't know anything about the Indian context, so we used both types of people. And the long and short of it was we managed to run four training courses and train about fifty people from Kakinada, one town, in achievement motivation; and in the other town, we didn't train anybody. However, in both towns, we interviewed a comparable number of people before the training started. And again, this kind of interviewing requires special skills, because it involves some knowledge of business practices - mostly we were interviewing them about their businesses: what business they were in; something about how many people they hired; what was their turnover, their gross sales per year, and stuff like that. We also were told that we couldn't believe their answers, because they're used to telling the tax collector what they think the tax collector ought to know, and not what is actually going on; we were told that we would not get accurate figures, for fear that they would somehow get around to the tax collector. I was told later that this is one of the indirect effects of Mahatma Gandhi's campaign against the British - or the British overlords. He told them all to keep two sets of books, one for the British, and one for themselves. And they kept right on doing it after the British were thrown out.
So we had to develop rather elaborate techniques for finding out indirectly what the real picture was, how the business had been doing. And in the end, we had to rely not so much on financial figures as on things like the number of people employed, which we figured they wouldn't lie about very much, although there was a tendency to do that. In many cases, the interviewers would actually insist on checking what they were told. You know, if they said, "Well, we have a go-down" - what is a go-down? A shed, I guess, where you'd store goods and materials - the interviewer would ask to go see it, because they would invent go-downs that they didn't actually have, in order to sound as if they were more successful. So field work requires that kind of knowledge of the local situation, and what the customs of the people are, how much they're likely to tell you what they think you want to hear versus what is really the case. Well, we interviewed about an equal number of people from both cities before the training, and then we went back every six months for two years after the training to see what was happening in the two cities. We found, after half a year, a year, and a year and a half, that the people who had been trained were beginning to show many more signs of improved business practices than those who had not been trained, and by the end of the two years, there was no doubt that those who had been trained were doing a much better job, that is, their percentage increase in sales was going up much higher. Again, this was carefully checked so far as possible, by actually looking at the books or observing the number of people employed, what have you. And even so, to this day - this is also important to realize in field research - there are those who don't believe these results because, no matter how carefully we checked - and we were extremely careful - people will say that they were telling us what we wanted to hear, that is, the ones who had been trained knew that they were supposed to be performing better, and so they would tell us that they were performing better. And people would say, "You ought to have an independent audit. You should have hired some other organization that doesn't have a stake in the result." We felt that our measures in some cases were so objective that an independent audit wouldn't make all that much difference; I mean, something like the number of employees could easily be checked. But nevertheless, that is a problem in field research. Well, a further problem we found was, how did we know that this improved business practice really made any difference for the community as a whole? There were Indian critics - and believe me, there were critics; that's another thing you run into in field research - who said, "These crazy Americans are coming over here and making everybody more competitive." And any simple-minded social psychologist knows how bad that is, because there is an old laboratory demonstration in social psychology that this Indian psychologist reminded everybody of, which is that you have two people building houses, let's say, in the laboratory, and you put the pieces of the houses in a glass jar with an open top so that you can reach in the jar and pick the pieces to make your house with. And if you then have two subjects, A and B, and ask them to compete to see who will build the house fastest, they both put their hands in the jar at the same time and interfere with each other. And that means neither one gets ahead very fast. So they said, "India is a country of scarce resources.
McClelland is coming in here making people compete for scarce resources. That means that everybody will be worse off, not better off. Or, it could be that he's making some people better, but other people will then lose business. Typical example: here's a guy who sells saris, an Indian garment that women mostly wear. He's all revved up with achievement motivation, so he sells more saris, but there's a limited market for saris, so the guy across the street who didn't go to the course sells fewer saris. And there's no net gain for the community," they argued. So we had to do a follow-up study over a period of years to see if community indexes showed gains. And they did. And there's no question now. This study was done in the early 1960s; it's now almost twenty years later. And I think there's no question in anybody's mind that the city that we intervened in has shown a faster rate of industrialization. It didn't turn out to be just the people we trained. It created a climate among the leaders in the community so that they went to work to entice other people to invest in their community. It was a declining community, and it's been a growing community since. But it was a long way getting from our lab hypotheses out to testing in the field. Now, I haven't completely finished the story, because there will always be those who'd say that it was chance; it wasn't achievement motivation training. It was the fact that you showed a lot more interest in that community than you did in the other one - which wasn't quite true, because we visited the people in the other community four or five times, too. But we didn't sell them on getting ahead in the same way. So maybe it wasn't exactly what we taught them. Maybe it was just some special interest that we showed in them. Or maybe it was just chance; you know, maybe the two cities were destined, one to grow faster than the other, and it didn't have anything to do with our input. So it's very difficult to prove, in a field setting where you don't have everything under control, that what you did was what made all the difference. But we tried to match things as carefully as we could. Now, the other thing that I want to talk about is that this intervention not only involved dealing with real people in real settings; the whole training itself obviously had to be credible, that is, it had to be built on a tradition that businessmen understand. And it also had to be built on psychological theory. Now, all businessmen know about training courses. They're very training oriented. They're always going off for a short course in selling or accounting or something. To be sure, this is a different kind of course - this is a course in motivation - but everybody thinks that motivation is important, so that's not a problem. But all the evidence that we had up to that time tended in the direction of making psychologists believe that it was very hard to change people, particularly to introduce personality change, because there were big debates about the effectiveness even of psychotherapy, where you see a person maybe three times a week for seven years, and if you're damned lucky, you get some change. And there were people arguing that you don't get any change. And we obviously couldn't see people three times a week for seven years. We had to pack everything that we were going to do into a very short period of time - the length of time that these men could afford to be away from their businesses.
And that meant we had to innovate with what later came to be called "psychological education techniques," and these were much more rational approaches, much more equivalent, I would say, historically, to religious retreats, or pep rallies, or something like that, than they were to psychotherapy. We designed a - well, actually, we stated twelve different propositions about ways in which we thought psychological evidence indicated that you could influence people. Let's say that one of them was that one way you can influence people to change is through prestige suggestion; it's a well-known phenomenon in the psychological literature. So we said, "Okay, we'll translate that into every use of prestige symbols that we can think of." You know, we said, "This is from Harvard"; they had heard of Harvard. "This is science"; science has high value for them. "This is the Ford Foundation," another high value. "This is for your country," another high value, and so on. We tried in every conceivable way to suggest that what we were doing for them was going to help. And then we taught them about the behavior that they were supposed to change, and used very concrete definitions of the behavior, building on the kind of training that the Skinnerians had really developed - behavior modification. Their argument is that the more specific the response that you have to learn, and the more direct the feedback that you give in terms of reinforcement, punishment, or reward, the more likely you are to learn it. So we said, "If you want to think like an achiever, you've got the scoring system. And you've got to be able to produce fantasies that score high in achievement motivation. And you've got to find out where you stand on this mysterious motive that we're trying to give to you. But if you're low on it, which most of them were, this is the way you get more of it. You think like this. You think like a high achiever, then you'll act like a high achiever." And so on. So we designed a very careful course that had twelve different inputs, which we varied systematically. And I must say I learned something here: the application of laboratory methods in the field sometimes leads to some rather peculiar bits of research. We really conceived of this training as something like an agricultural experiment, in which you've got some corn out there, and you've got twelve different kinds of fertilizers that you're going to put on it. And we designed different courses in India with different combinations of fertilizers to see which combination would give the maximum yield. Our yield measure was a rough index of how much better they were doing after two years. And we put a tremendous amount of energy into planning this kind of research, and into designing courses with and without certain inputs, only to discover that the output was a direct function of the size, and not the combination, of the inputs. It's really the total Gestalt, the total picture: the more inputs you have, the more convincing, I guess, the whole picture is. In the book that we wrote on this, I said, "You know, it's much more like putting on a play," and I suppose you could do research which asks the question, "Is the effect on the audience the same if you switch off the lights?" Well, you can have a certain effect without the lights. You can still hear what they're saying. Or if you switch off the sound; or if you put them in costumes or not in costumes. Well, most people will think that is a little bit silly, but that's really what we were doing.
It's obviously the total combination of lights and sounds, and movement, and gestures, and costumes, and the whole thing; and the more of it you have, the more total impact you have. But that's a case, I would say, of negative transfer from experimentation in a laboratory, where you systematically vary one thing at a time, and do analyses of variance to try to pull out what the contribution of each input is separately. And it really isn't very appropriate, I think, for this kind of research. We had the analysis-of-variance model in our minds, and discovered that it really wasn't applicable; it was the combination of elements that worked. Well, what other questions could be asked about field research? I suppose the next question has to do with once you've done it, what then? I don't think it presented any unusual data analysis problems - that is, we analyzed the data the way we would analyze data from any experiment, in terms of differences in yields between the control groups and the experimental groups. We tried to get reliable measures. We checked the reliability of scoring the records, the interview records of how successful the person was - all the usual methodological devices. But there was a different problem, which came in writing it up, writing up the results of the research, because the question is, "Who is the audience? Who is going to read this?" And there we had a problem, because in a sense this research was done by a psychologist - it grew out of some psychological theory applied in the real world - but psychologists aren't interested in economic development, or small business performance, at all. I discovered, when the book we wrote on this experience was reviewed, that the psychologist who reviewed it said, "Well, you know, this doesn't have much to say to psychologists." Well, you could say economists; maybe they're interested. They are interested in economic development, but when you try to talk to economists about training courses in motivation - that seems really weird to them. The kinds of inputs they're talking about are interest rates, capital investments, labor productivity, terms of trade, and stuff like that. So it's not in their field. Well, business people are interested because they know something about small business, although even here we ran into a bit of a problem, because we discovered that business schools know very little about small business, and the reason they know very little about small business is that no small businessmen are going to pay their professors to study small business. The professors at business schools know about big business because that's where the money is. They get paid to study it. So there is usually a course on small business given in most business schools, but very little theory, and less research - I would say almost no research of this type on small business was available at the time. And we found some communication problems there, because the business people were not then as familiar with the methodological niceties of how you measure things and give tests, statistical significance, and so on. Still another possible audience is planners, government planners.
We thought that these findings would be of tremendous interest to people trying to develop countries, because one obvious way to do it, according to this, is to spend more of your money on training small businessmen, because we showed that this is a very economical way, in cost-benefit terms, of increasing employment in the town. It was a very cheap input, and a much cheaper way of creating additional jobs than capital inputs, which was the alternative model that governments were using; and the ratio was much more favorable to training. However, because economists run most planning units, we found that this message made no headway at all. So, in the end, I felt rather frustrated about this research and its reception, because it fell between about four different fields. And everybody sort of opted out, saying, "Well, you know, I really don't know about this. I know some part of it. I don't know the other part of it. I don't know if it would really work, even though they claim it does. Maybe they wouldn't get it if they tried it again. It doesn't fit in with anybody's rational model of how to promote economic development." To this date, I think it hasn't had much effect anywhere, although we had strong hopes fifteen or twenty years ago, when we did the research, that it would be the proof that everybody needed that this was the way to promote economic development.
Peter Blanck: If you want to switch gears now, and move away from those questions a bit, and talk about people who you think are doing good field research, or studies that you've been associated with. Bob Rosenthal mentioned briefly - although he didn't describe it - that you had done a study in the Navy, with officers, which sounded quite interesting. You might want to talk a bit about that. He really didn't elaborate on it.
Dave McClelland: Well, what happened after this was that I had decided, based on this experience, that there was a place in society for a consulting organization which would make use of the latest behavioral science knowledge - psychological, sociological, and so on - rather than just economic knowledge, or seat-of-the-pants wisdom about what makes a good manager. So what grew out of this, about this time, was a company, now called McBer and Company, which - well, it started out doing motivation training, and did quite a bit of it for a while, especially for minorities in this country - Blacks, Hispanics, and so on - who were out of the business stream. It was quite successful here in demonstrating that it really improved their performance. But we got out of it because there was nobody to pay for it. The small businessman can't afford it. Once we demonstrated that it works, the government won't pay for it again. We hoped that the banks might take it over, because obviously they should do better in making loans if they trained the people they were giving the loans to; but that hasn't caught on. However, once we had a collection of skilled people with PhDs in the behavioral sciences doing this kind of work, all kinds of other opportunities to apply psychological knowledge became available. And the Navy work grew directly out of another question that was thrown at us by the Navy, ten or more years ago. They were having race relations problems in the Navy, and they didn't know how to really improve race relations. They had appointed some people called "human relations officers" to mediate disputes and try to improve race relations. But they weren't doing the job, and they were very worried about it.
They came to us and they said, "You're psychologists; you're consultants. How do we train these human relations officers so that they will do a better job?" They were really just asking for some curriculum, and we said, "Well, we can give you some ideas, but we really don't know very much about it. And we think a better way of finding out what to do is to locate some people - there must be one or two - who are doing a really good job, and study them carefully, and see what they're doing, see what characteristics they have, as contrasted to the ordinary run of human relations officers." That seemed like a good idea to them. We got a contract, and we studied them carefully by a very intensive interview method that I've described, called Job Competency Interviewing. And we came up with six or seven competencies that the better ones had that the average ones didn't have. We found them, and we identified certain methods of measuring those competencies with new or old psychological tests. And then they said, "Well, that's fine. You can identify people who might do a good job, but there are so few of them, and we've got so many jobs, that it's much more important to us to train people in these competencies." So we said, "Okay, we think we can train them in these competencies." And so we designed training courses to teach them these competencies. And they were very pleased with the results, and said, "The guys who are trained seem to do a much better job in race relations. If you can do it for the job of human relations officer, which is a relatively unimportant job to us - it's important at that particular moment in time - what about all of the other jobs we've got, like petty officers, or division officers, or executive officers?" And we said, "Oh, we can do the same thing. Just tell us who the really good ones are and we'll see how they do." Well, we did a very extensive project for them, for every type of officer in the Navy, cross-validated across the Pacific and Atlantic fleets. Through this extensive interviewing we developed training courses for every officer job in the Navy, so that by 1984, or something like that, everybody will have been trained in a McBer-type course. And they've just conducted an independent evaluation of officer personnel - I just heard this yesterday. They wanted to know if the officers are really performing better after they've been through our training, and they just concluded that they are performing better. They have all kinds of indexes of how successful the officers are, in terms of complaints, and AWOL, and stuff like that. They have pretty good evidence that this type of training works. Now, in that case, it was a response to a need that they had, but it was adapting a technique of what I would call criterion analysis. It comes out of test theory. It isn't done very often in test theory, but it's certainly known. And it's had tremendous acceptance and wide applicability, not only in the Navy; other companies, big companies, are doing it now, all the time, for all kinds of jobs.
Peter Blanck: Now, are you-? You seem to have a tradition of going from the lab to the field, whereas some people go from the field to the lab. What are your feelings on it? Are different types, or different strategies, appropriate for different questions?
Dave McClelland: I think you can go either way. It depends how curious you are. I think some of this job competency stuff-.
Now, we collected a tremendous amount of data on all kinds of different jobs, and I think we're about ready to go back to the laboratory on some of that, because we're feeling rather dissatisfied; we're ending up with a tremendously long list of competencies, and we sort of had the feeling that we ought to do a better job of sorting and defining them. And we may have to do some laboratory research to do that, because it's difficult to do that kind of research, to get that kind of precision, in the field.
Peter Blanck: But it sounds like what you're doing is experimental field work, and this problem—
Dave McClelland: Yes, yes.
Peter Blanck: Maybe you can talk a little about the difference of experimental field work, as if you're describing it to somebody who's supposed to-, a William F. Whyte walking around the North End.
Dave McClelland: Yes. Right. We're obviously not anthropologists just watching what's going on, or sociologists. We are actually trying to introduce changes. The training experiment I described is a straight experimental-control group type of analysis, and the job competency analysis is, again, a systematic comparison of two groups to see how they differ. It's a straight experimental approach. And then we use those differences for various purposes, to define training objectives, or for selection. And that is quite different from just observing what's going on, and trying to tease out what the main factors are in the situation. Psychologists tend always to compare and contrast, I think.
Peter Blanck: I had heard you talk, about a year ago, about stress and related-, manager stress, maybe, and - I don't remember - I.G.H. or I.G. something?
Dave McClelland: Well, yes. We had been doing some work on health, which I guess also involved a different kind of field work. Again, this came out of laboratory research and then was applied to problems in the area of health. Again, I am primarily a motivational psychologist, and we ran across some evidence that a certain motivational syndrome or constellation might be bad for your health: high power motivation, high self-control, relatively low affiliation motivation or concern for other people. And that these - well, to oversimplify a bit just to make the case - were people who were chronically in a state of arousal, as if ready for fight or flight, under tension or pressure. Something like the people in Type A Behavior and Your Heart, a book that's been popularized by Friedman and Rosenman. In Type A behavior, people are very hurried and feel under pressure and so on. And the idea we have, and that we have got some evidence for, is that in people with this - what we called the inhibited power motive syndrome - this tension really reflects the fact that they're pouring out more adrenalin, the hormone, into the blood stream than other people. And this is what readies them for fight or flight. The heart beats faster and gets glucose to the muscles, so you're ready to fight or flee, and so on. But one of the side effects of all this adrenalin may be that it interferes with white cell function, or lymphocyte function. And the lymphocytes are cells in the blood stream which defend the body against bacterial or viral invasions.
And so we thought that this excess, chronic sympathetic nervous system activity, releasing all the adrenalin, might, on the one hand, make people more susceptible to high blood pressure and cardiovascular disease, and, on the other hand, interfere with lymphocyte function, which makes it more likely that they'd get sick from infectious diseases, like cold viruses, and so on. And we found this to be the case. Now, in this case, instead of working with businessmen, Indians in India, we ended up over at the hospital, because that's where the sick people are. And we're collaborating with physicians, and cell biologists, and immunologists, and hormone specialists. Again, putting a team together that can trace the connection between the psychological variables and the physiological variables to the end-state disease process. And we hope, again, having established these connections, to be able then to introduce a psychologically oriented treatment that will improve the situation. We're probably going to pick some particular disease where the immune system plays a major part, and through different types of psychological treatment - obviously different from achievement motivation training - we're going to relax people, so that they have less adrenalin flowing through their circulatory system and their immune system somehow functions better as a result.
Peter Blanck: Switching gears a little, you were-, it sounds like you were trained as an experimental social psychologist.
Dave McClelland: No, I was never trained as a social psychologist.
Peter Blanck: Well, experimental psychologist.
Dave McClelland: I was an experimental psychologist, yes.
Peter Blanck: How is it that so much of your work ends up so applied? Is that your major interest, applying some of the things you do?
Dave McClelland: Well, I just follow my nose. I don't know that I have any particular applied-. I mean, in the case of The Achieving Society, I had made this whole analysis, which seemed very reasonable to me, and it seemed to me that the final check to see if it was correct was to create an achieving society: that's one way to demonstrate that you're right, you know. I guess I probably have - I do have - a value that knowledge should be applied for human benefit, yes. But I don't know that that has ever been the prime motivation. I think it's been more the nature of my work, that is, I think I've been more one for finding something and then following it all the way out to its logical implications in society.
Peter Blanck: Are there any other points you want to address? Are there any other points on there? If you want to—
Dave McClelland: Well, I suppose the thing to ask here is what kinds of skills are necessary; I think it's obvious from what I said—
Peter Blanck: Maybe if you could talk about some of the people whom you've known through the years, some of their work that had real applied value that you've admired, or some of your associations with people like that. Another question you might want to address is what things you want to do in the future, and…
Dave McClelland: Well, I-. Actually, I think most applied psychology is terrible, because I think that most applied psychologists are not very theoretical. They let themselves be dictated to by the client too much. And they try to do what the client wants, and they're satisfied when they've done something that the client wants. And I don't think it's very high technology at all. I think it's very low technology stuff, for the most part.
Just because a client comes to you with a question doesn't mean you have to do it that way.
Peter Blanck: Well, how could you in good faith go into the field and try to help the person adapt to their natural environment and yet at the same time change their natural—
Dave McClelland: Well, they put the question wrong because they don't understand, is what I would say. And if you limit yourself to the way they put the question-. You know, they came to us and said, "Give us some suggestions as to how to train HROs," and ninety-nine out of one hundred applied consulting firms would have done precisely that. And that, in my view, would have been irresponsible. They would have just gone to the literature and said, "Well, a little T-group training or something like that, or Bales-type training, couldn't do any harm, and it might do a lot of good." But that's not responsible, because there's no real hard evidence that it would do any good at all. It might do a lot of harm. As a matter of fact, they were already applying a kind of T-group approach. Most of the HROs were black, and they had the notion that the best thing was to get together and let it all hang out. You know, just talk about what the problems were. Well, what that meant was that the blacks would come and bitch furiously about how they were mistreated by the white officers. So if the white officers weren't prejudiced when they came in, they sure as hell were when they went out, because they were so sick and tired of sitting there and listening to all this complaining. Well, that's an example, you see, of doing what the client asked. The client only asked for information, and we said, "We won't give it to you, because we think it's irresponsible, until you've done this other analysis first."
Peter Blanck: In your experience with applied work, what is the most difficult issue you've faced - whether it's an ethical issue, or maybe you want to talk a bit about ethics in the field?
Dave McClelland: Well, I… I haven't run into any particular ethical problems. I've run into political problems. I mentioned a lot of those. I think the biggest problem is getting your work done, because people always want to interfere with you. And there's a lot of hostility from all sorts of sources. You're stepping on toes, particularly if you're interdisciplinary; as I said before, we're sort of merging into a field the economists think is theirs. And they just don't understand what we're doing, and they resent it. I mean, when we talk about motivation training, they say, "We can understand that you should train somebody in accounting. Obviously, they've got to keep books, but what is all this motivation nonsense?" That's much more the problem: getting acceptance of the notion that psychology is real hard stuff that makes a very practical difference in outputs. It's very, very difficult. I suppose there are ethical problems. There are some ethical problems that inevitably arise over confidentiality. They have not been major problems. The ethical problem is, whenever you intervene, are you doing any harm to people? You don't know. The question that the social psychologist in India raised was, "Are you really making things worse in India, instead of better? You think you're making it better." I was very sensitive to that, so I was glad to find that the community as a whole had benefited, and not just a few individuals at the expense of others.
But you're always going to be challenged by somebody if you ever try to do anything important towards changing people.
Peter Blanck: How do you-? Another issue that we're trying to touch upon is - obviously this film is an educational aid - what are some of the techniques you have used in training some of your graduate students to be sensitive to these issues, to get involved in applied work with a good theoretical base?
Dave McClelland: I haven't. Mostly, I don't-. Here they don't really get well trained in fieldwork. They get training on the job. They get trained here in straight methodological stuff. I mean, obviously, I talk about it, but there's nothing that I would call real field training in this department.
Peter Blanck: Where is there real training?
Dave McClelland: Ah, there isn't much anywhere. We used to do a little bit more in the old days here, when it was a Social Relations Department, but now that we're a Psychology Department, I think there is even less. People just sort of pick it up, you know. It's obvious what's important: I mean, it's things like entrepreneurial skills - you've got to get out there, and if something fails, you've got to be able to do something else; planning skills - I mean, it is a major job figuring out where you're going to do this; interpersonal skills - you've got to get along with people, or they aren't going to do what you ask them to do; managerial skills. It's all those things, and this department doesn't teach those things. Now, we have courses, paradoxically. People say they want to come study with me, and I say, "Don't do that. You won't learn anything from me, because I'm not teaching that stuff. Take a course at McBer and Company. Take a public seminar in managing motivation. You'll learn more from that in five days than you will in a whole year here, because there, there are specific experiences that are designed to develop certain skills." I'm not sure it's appropriate for a graduate school of Arts and Sciences to give that kind of training. It may be appropriate for business schools. They aren't doing it much. We have actually started what some people would like to think of as a major competitor to business schools, because Jim Hayes, the President of the American Management Associations, thinks that business schools are pretty academic. They're training people in economics, politics, and all kinds of things that don't help them do their job better at all. So he believes in competency-based training. We set up a new Master's program with the AMA, for which we were the consultants. We located all these different competencies that people should have if they're going to be more successful managers. So he set up a competency-based Master's program. When you demonstrate that you can actually display twelve out of sixteen of these competencies, or whatever, you get your Master's degree. Well, that's very different from taking a course in economic theory and passing it. Maybe it's too much like a trade school. I don't think the business schools are going to like this a bit. But it's a different way of looking at it. It's saying, you know, if you're going to be a carpenter, you've got to know how to saw straight. And there's no evidence that anybody who gets out of business school, you know, is better at decision making or something like that. You've just studied a lot of business decisions being made, but that's rather indirect evidence that you're any better at it yourself. That's interesting, and maybe the business schools will introduce some aspects of competency training.
Peter Blanck: What is the future for you? What are you going to be studying?
Dave McClelland: Well, I never know. I'm right now working in the health field, and I have a new grant for working in that area. I think beneath all that is my interest in the fact that these motives, which I've spent all my life studying, may have some rather unique biological basis that I want to investigate. In other words, I spent a lot of my career seeing what their social aspects are; now I want to look at their biological aspects. But that's-. One can only predict two or three years ahead. I mean, six years ago, I would never have predicted that I'd be in physiology, or psychophysiology, today.
Peter Blanck: Well, one final question, which we end most interviews with, is just what do you find - you touched upon it briefly, but if you want to give a summary statement - most satisfying about the research you've done, the type of contribution you've made? Or what do you see as the type of contribution you've made to society?
Dave McClelland: Well, I… I think it's both theoretical and practical. And I think anything that is good theoretically has practical applications. I mean, we talk about theoretical physics, but if physics didn't work, in terms of its engineering applications, physics wouldn't be nearly as important, in my view. So I think that what I have done is to try to marry the two, marry theoretical developments with their practical and social implications, or, in the present work, their biological implications. And I think not everybody can do that as easily. I mean, in some areas of psychology it's quite easy; in other areas, it's quite difficult. The fit is very clear, let's say, if you're studying vision. I mean, the applications of pure theory in vision are very real, even when it comes to what kind of glasses people are going to wear so that they can see better. I think the marriage of the two is very important, and I would say that, on balance, psychologists don't do enough field work. They get so caught up in fine-tuning little variables in the laboratory, and arguing with each other about how much of this or how much of that produces this effect, that - well, it's very discouraging to go back and read the psychological journals, as I've been doing recently for the last twenty or thirty years, and realize how many studies arguing about rather fine points there are, when they might better be considering the applications of some of these ideas in the field, because they'd find, if they went to the field, that a lot of the things that they're talking about just aren't important out there anyway. They got to be important only because one lab did it this way, and another lab did it a little differently. So it's kind of technique-generated research.