Asking Questions: Techniques for Semistructured Interviews

Beth L. Leech, Rutgers University

Source: PS: Political Science and Politics, Vol. 35, No. 4 (December 2002), pp. 665-668. Published by the American Political Science Association. Stable URL: https://www.jstor.org/stable/1554805

In an interview, what you already know is as important as what you want to know. What you want to know determines which questions you will ask. What you already know will determine how you ask them.

Thanks to past jobs as a journalist and as an anthropological researcher, I've had training in both journalistic and ethnographic styles of interviewing. The two are at opposite ends of the interview continuum. The journalistic style tries to verbally pin the respondent down by appearing to know everything already; the questions are direct and directed toward a particular outcome. The ethnographic style of interviewing instead tries to enter into the world of the respondent by appearing to know very little.

There are many types of interviews with many styles of questions, each appropriate in different circumstances. Unstructured interviews, often used by ethnographers, are really more conversations than interviews, with even the topic of conversation subject to change as the interview progresses. These "soaking and poking" experiences are most appropriate when the interviewer has limited knowledge about a topic or wants an insider perspective. But the tendency for such interviews to wander off in unexpected directions (although the wandering may provide fresh ideas) almost guarantees that they will not be a very consistent source of reliable data that can be compared across interviews. Unstructured interviews are best used as a source of insight, not for hypothesis testing.

Sometimes, however, we already have a lot of knowledge about a topic and want very specific answers to very specific questions. When the researcher already knows a lot about the subject matter, the categories and all possible responses are familiar, and the only goal is to count how many people fall into each category of response, structured interviews with closed-ended questions are most appropriate. Political scientists are most familiar with this type of interview because of mass public opinion surveys. Such closed-ended approaches can sometimes backfire, however, if we assume we are familiar with an area but end up asking the wrong questions in the wrong way or omitting an important response choice. We may find ourselves with reliable data that lacks any content validity.
There is a middle ground, however, one that can provide detail, depth, and an insider's perspective, while at the same time allowing hypothesis testing and the quantitative analysis of interview responses. In this essay I focus on that interview style: semistructured interviews with open-ended questions. It is a style that is often used in elite interviewing, and variations on it are discussed in several of the other essays on these pages. My observations and suggestions here come not only from my past experiences as a journalist and an ethnographic researcher, but also from my current research among lobbyists and policymakers in Washington, DC, as part of the Advocacy and Public Policymaking project.[1]

Gaining Rapport

Without rapport, even the best-phrased questions can fall flat and elicit brief, uninformative answers. Rapport means more than just putting people at ease. It means convincing people that you are listening, that you understand and are interested in what they are talking about, and that they should continue talking. There are several ways of doing this within the interview.

Putting Respondents at Ease

Some interviewing textbooks recommend that the interviewer "appear slightly dim and agreeable" (McCracken 1988, 38) or "play dumb" so that respondents do not feel threatened and are not worried that they will lose face in the interview. The danger here, especially when dealing with highly educated, highly placed respondents, is that they will feel they are wasting their time with an idiot, or at least will dumb down their answers and subject the interviewer to a Politics 101 lecture. At the same time, the concern about respondents' feelings is valid. Even highly educated, highly placed respondents do not want to appear stupid in front of a university professor (I have had to reassure vice presidents of large organizations who were worried that they had "babbled" during an interview).

I recommend a middle road. The interviewer should seem professional and generally knowledgeable, but less knowledgeable than the respondent on the particular topic of the interview. In my case, I know a lot about lobbying and a lot about American politics, and I know what has been in the newspaper on a given policy issue, but I present myself as having little or no idea about what happened behind the scenes on the policy issue I am interviewing about. I try to continue this approach even after I have conducted many interviews on the same policy issue. I don't want someone to leave something out because they assume I already knew it.

A second, related point to remember is that many of your subjects will be more nervous than you are. After all, after you've done a couple of these interviews, you are experienced. Your respondents, however, may never have been the subject of an academic study before. Reassure them by being open and avoiding threatening descriptions of your work. "Talk with you" is less threatening than "interview you," for example (Weinberg 1996, 83). It is possible to be honest without being scary. There's no need to make your work sound like a medical procedure (at least until you are getting ready to submit it to a journal, that is). Approach interview subjects with a positive attitude. Act as though it is natural that people would want to talk to you. Appear friendly and curious.
An important way to make an interview subject feel at ease is to explain your project again. This is the one-minute version of your project, which should describe the topic you are interested in and the types of questions you will ask, without tipping your hand as to your hypotheses.[2] At this point you can also remind respondents that their answers are confidential.[3]

Are You Listening?

During the interview itself, before moving on to the next question it often helps to briefly restate what the respondent has just said. (This should take no more than a sentence.) Restating shows that you are interested and have understood; it also provides a chance for the respondent to correct you if you have misunderstood. Avoid reinterpreting what the respondent has just said, as this tends to work against rapport and to leave respondents feeling that the interviewer is trying to put words in their mouths. Use the respondents' own language, if possible, to summarize what has just been said.

Anthropologist James Spradley suggests that when an interviewer does not understand a particular point, it is better to ask for use rather than meaning (1979, 82). That is, "When would you do that?" or "What would you use that for?" are usually better questions for building rapport than "What do you mean by that?" The latter tends to shift respondents out of their own verbal world and to begin speaking to you as an outsider.

Question Order

Question order is important for substantive reasons (order effects occur in interviews, just as they do in surveys), but order is also important as a means of gaining rapport. As any journalist will tell you, in an interview you should always move from the nonthreatening to the threatening (Weinberg 1996, 85). That is, ask the easy questions first. In the interviews I have conducted among lobbyists and policymakers, I find that it usually works better to ask things like age, background, title, and other personal details last. That way the interview doesn't come off as if it is about my respondent personally, but rather about the political issue or organization we are talking about. This question order works for me because my other questions are not personal, and are therefore even less threatening than the demographic information I collect at the end. On the other hand, if your interview questions focus on an individual's own political and philosophical beliefs, then questions about education, background, and title would be less threatening and would provide a good place to start.

When and how should you ask sensitive questions? It's usually best to wait until the middle or toward the end of the interview. Don't wait until the last minute, though; you may run out of time. Don't hem, haw, or make it seem as though any normal person would refuse to answer the question. Just ask. Then be quiet, and give the respondent time to answer. Most people will try to fill the silence, and you will get your answer.

A second thing to remember about sensitive questions (or any question, for that matter) is to use nonjudgmental, nonthreatening wording. For instance, asking a respondent, "What kinds of help do you give to members of Congress as they are going about their work or daily lives?" is likely to gain you more information than asking, "Do you do favors for members of Congress?"[4]
Likewise, I know that nonprofit organizations with 501(c)(3) charitable status are skittish about the word "lobbying," since the IRS restricts the amount of lobbying they can do. So I make a habit of referring to "advocacy efforts" or "policy work" instead.

When Did You Stop Beating Your Wife?

What are known as "presuming" questions are common in journalism, but they are usually not good social science. There are circumstances, however, when such questions are necessary to make respondents comfortable enough to answer honestly. When the question is one that the respondent is likely to try to avoid, and it involves a matter that may have a stigma attached to it, a presuming question may be the only way to go.

When I was working as an ethnographic researcher in Kenya, collecting reproductive histories from women, I began simply by asking women to tell me about all of their pregnancies. It was clear from the first few interviews that no one was mentioning miscarriages, stillbirths, or deaths of children, and I knew that could not be accurate in a rural area with nonexistent prenatal care and high child mortality. So I tried probing: "Tell me about any children who died." I used this question only once; it caused a respondent to jump up, mutter that she must go check on the goats, and run out the door. After some help from a language consultant, I did two things. I made my language less threatening, and I asked the question in a presuming way. "How many children are the lost ones?" I asked ("Aja inkera netala?"). My respondents' faces would turn serious, they would sigh, and then they would tell me the details I was seeking.

To return to the political world, instead of asking a lobbyist, "Did you give soft money donations?" it might make the question easier to answer to say, "How much did your organization give in soft money donations?" The latter presumes that it is normal to give soft money donations and that everyone must do it, and it also shifts the onus away from the individual and onto the organization. (I should point out here that you should never ask for information in an interview that you could collect elsewhere, unless you are using the question to double-check the veracity and accuracy of a respondent. Asking for information you could easily collect elsewhere wastes precious interview time and risks insulting your respondents, since you are essentially asking them to do your homework for you.)

Presuming questions are presuming in the sense that they imply that the researcher already knows the answer, or at least part of it. One danger, then, is that the respondent will bluff to save face and make something up. That is why I suggest such questions be used very sparingly and only when they are needed to take the edge off of questions that may otherwise have a stigma attached. In my examples above, a respondent would be relieved, not shamed, to be able to say "None."

Types of Questions

We all know that there are certain types of questions to avoid: loaded questions, double-barreled questions, leading questions, and (usually) presuming questions. But what types of questions should you ask in an open-ended, semistructured interview?
Grand Tour Questions

The single best question I know of for a semistructured interview is what Spradley (1979) calls a grand tour question. As the name suggests, these questions ask respondents to give a verbal tour of something they know well. The major benefit of the question is that it gets respondents talking, but in a fairly focused way. Many good interviewers use this type of question instinctively. Jeff Berry, for example, used one in the research for his book The Interest Group Society when he asked lobbyists to describe an average day (1997, 94).

There are many different types of grand tour questions (see Spradley 1979, 86-88). The most common is probably the typical grand tour question: "Could you describe a typical day in your office?" "Could you describe a typical day on the Hill?" "Could you describe a typical day in a member of Parliament's office?" Such questions have the benefit of giving you a sense of what an average day is like, but the drawback that you are not certain what is being averaged; that is, you do not know how much variation there is or how accurate the respondent's sense of the usual really is. Respondents may tend to focus on the interesting (which may not be usual), or on what they think should happen day to day (although it actually may not).

If you are doing enough interviews to get a sense of the average by comparing across interviews, then you may want to turn to a specific grand tour question. Specific grand tour questions ask for a tour based on some parameter decided by the interviewer: a day, a topic, an event. "Could you walk me through what you did yesterday in your office?" or "Walk me through what your organization did in response to issue X." We used a specific grand tour question to begin our interviews for the Advocacy and Public Policymaking project, asking respondents to describe their organizations' activities on the most recent policy issue in which they were involved.

Not all interviews need to be conducted sitting down, and not all grand tours need to be virtual. A guided grand tour is an actual tour: "The next time you are lobbying on the Hill, could you bring me along and show me what you do?" Related to these are task-related grand tours. Such questions ask the respondent to perform some usual task while verbally walking the interviewer through it. For instance, I could ask a lobbyist to lay out talking points for a meeting with a legislator, or to compile a list of which members of Congress to talk to, explaining the decisions being made at each step of the process.

Example Questions

Example questions are similar to grand tour questions, but still more specific (see Spradley 1979, 87-88). They take some single act or event identified by the respondent and ask for an example: "Can you give me an example of a time that you used grassroots lobbying?" A related type of question is the native language question, which asks for an example in the respondent's own words. These can be direct language questions ("How do you refer to these lobbying activities? What do you call them?") or hypothetical interaction questions ("If you were talking to another lobbyist, what would you call that?" or "If I were to sit in on that meeting, how would I hear people referring to that?"). Hypothetical interaction questions are sometimes easier to answer than direct language questions, because they help put the respondent in the mindset of talking to other experts, and can help shake them out of Politics 101.
Ethnographers use many other types of questions, many of which are of diminishing usefulness for most political scientists. However, the less you know about an area, the more important such questions become, because they add direction to what otherwise would be a random conversational walk. Structural questions, for example, ask respondents to semantically structure their world through such exercises as listing all the different types of something and explaining how they relate to each other (Spradley 1979; Werner and Schoepfle 1987). So, hypothetically, if I did not already know the different ways in which interest groups can lobby, instead of simply asking "What has your organization done in relation to this issue?" I could ask something like this:

"We've been talking about your advocacy efforts on this issue, and you have mentioned that you sent a letter to members on the committee, visited with members of the congressional delegation from your district, and put information on your website. Now I want to ask you a slightly different kind of question. I'm interested in getting a list of all the different types of advocacy activities your organization has undertaken in relation to this issue. This might take a little time, but I'd like to know all the different types and what you would call them." (Adapted from Spradley 1979, 122)

Note that the second question would get you a lot more information than the first. It starts off by showing that the interviewer has been listening, then asks for more information in a specific way.[5] Be aware that if you really want a complete list, you may need to repeat the last part of this question many times: "And are there any other types of advocacy efforts your group uses?" This is an example of a prompt, which leads me to my final type of question.

Prompts

Prompts are as important as the questions themselves in semistructured interviews. Prompts do two things: they keep people talking, and they rescue you when responses turn to mush. Take the introductory question from the Advocacy and Public Policymaking project: "Could you take the most recent issue you've been spending time on and describe what you're trying to accomplish on this issue and what type of action are you taking to make that happen?" One of my respondents answered, "Well, we've been talking to some people on the Hill and trying to get our message out." He had just described the activities of every lobbyist in Washington. If I had stopped there, the interview would have been useless. Luckily, my interview protocol included numerous prompts, based on what we wanted to be able to code from this question, including who the targets of lobbying were and what lobbying tactics were used. So at this point, possible prompts would include: "Who have you been talking to on the Hill?" and "What are you doing to try to get your message out?"

McCracken (1988, 35-36) identifies several different types of prompts. Prompts like the ones I just mentioned are planned prompts, prompts that are formally included in the interview protocol. At the end of each formal question we ask as part of the Advocacy and Public Policymaking project, there is an italicized list of specifics that the interviewer is supposed to probe for if the respondent doesn't bring them up.
For example:

    probe about coalition partners (formal or informal)
    probe about who they are speaking with about this issue

One difference between a prompt and a question is that prompts are not scripted the way the initial questions are. The reason is that every interview is different, and the list of possible probe situations could go on for dozens of pages. That makes it important for the interviewer to have a plan for how the interviews eventually will be coded, so that the interviewer can make sure the responses have covered the necessary points.

Probably the most instinctive type of prompt is the informal prompt. This is an unscripted prompt that may be nothing more than the reassuring noises and interjections that people make during any conversation to show that they are listening and interested: "Uh-huh." "Yes." "How interesting." But the well-trained interviewer has a variety of informal prompts to use. Floating prompts, for example, are used to clarify (McCracken 1988, 35). These may be nothing more than raising an eyebrow and cocking one's head, or they may be specific questions: "How?" "Why?" "And then...?" One way to ask for clarification and at the same time build rapport is to repeat the key term of the respondent's last remark as a question:

    Respondent: "And the bill was completely whitewashed in committee."
    Interviewer: "Whitewashed?"

McCracken warns against leading respondents by putting words in their mouths ("Do you mean the bill was gutted?"). You risk losing rapport or having the respondent go along with your definition ("Oh, yeah, sort of") rather than clarifying further. The goal is to listen for key terms and to prompt the respondent to say more about them.

Enough is Enough

One of the most important rules about asking questions has to do with shutting up. Give your respondent room to talk. If respondents get off topic, let them finish, then bring them gently back to the issue you are interested in. But don't try to control too much, or you may miss important, unexpected points.

Conclusion

Used in combination, grand tour questions and floating prompts are sometimes enough to elicit almost all of the information you need in a semistructured interview (with planned prompts ready in case the floating prompts don't work!).
I know that in many of my interviews for the Advocacy and Public Policymaking project, the answer to the first grand tour question took up half of the interview hour and rendered many of the subsequent questions on the protocol virtually unnecessary. I would, of course, check: "You have mentioned X and Y as people you worked with on this issue. Was anyone else involved in this issue?" But often the answer was no, and we would quickly move on to the next question on the protocol. This was the best of both worlds, because it collected the information we wanted and provided it in the respondent's own language and framework.

Some of the question styles that semistructured interviewing borrows from anthropology may not seem very useful if you seek very specific information about a known topic and are not planning to write an ethnography of lobbyists, elected officials, or civil servants. On the other hand, if you take the time to ask these kinds of questions, you sometimes get surprising answers and learn something new. It's true that the type of interview you use depends on what you already know, but if you already knew everything, there would be little reason to spend time in a face-to-face interview. Semistructured interviews allow respondents the chance to be the experts and to inform the research.

Notes

1. My collaborators on the Advocacy and Public Policymaking project are Frank R. Baumgartner, Marie Hojnacki, David C. Kimball, and Jeffrey M. Berry. Research has been supported by National Science Foundation grants SBR-9905195 and SES-0111224. For more information on this project, including the complete interview protocol, see our website. Also see Leech et al. 2002.

2. Elite interviewing subjects often are quite savvy about social science research, and it is not uncommon for an interviewee to ask, "So what is your working hypothesis here?" I respond to questions like these by explaining that if I told them I would risk biasing my results, but that I would be happy to send them information about the project and its hypotheses after the interview is over.

3. An excellent way to convince your respondents that you really are serious about confidentiality is to decline to give them any information about the people you already have interviewed. A respondent may ask, "So who else have you talked to?" The interviewer can answer, "Oh, several people, although I can't reveal exactly who without their permission."

4. These questions also raise an elementary point about interviewing: don't ask a yes-or-no question unless you want a yes-or-no answer. "How," "why," "what kinds of," and "in what way" usually are much better ways to begin a question in a semistructured interview.

5. This question also demonstrates that expanding the length of the question tends to expand the length of the response (Spradley 1979, 85). Be aware, however, that long questions can lead people off point or confuse them. If you want a specific answer, ask a specific question.

References

Berry, Jeffrey M. 1997. The Interest Group Society. Third ed. New York: Longman.

Leech, Beth L., Frank R. Baumgartner, Jeffrey M. Berry, Marie Hojnacki, and David C. Kimball. 2002. "Organized Interests and Issue Definition in Policy Debates." In Interest Group Politics, eds. Allan J. Cigler and Burdett A. Loomis. Washington, DC: CQ Press.

McCracken, Grant. 1988. The Long Interview. Newbury Park, CA: Sage.

Spradley, James P. 1979. The Ethnographic Interview. New York: Holt, Rinehart and Winston.

Weinberg, Steve. 1996. The Reporter's Handbook: An Investigator's Guide to Documents and Techniques. Third ed. New York: St. Martin's.

Werner, Oswald, and G. Mark Schoepfle. 1987. Systematic Fieldwork: Foundations of Ethnography and Interviewing. Vol. 1. Newbury Park, CA: Sage.