11 Developing Data Analysis

Chapter 10 stressed the importance of early data analysis and showed how to kick-start such analysis. In this chapter, we will examine how you can develop your research after these beginnings. Although we will focus here just on observational and tape-recorded data, many of the suggestions equally apply to other kinds of qualitative data. However, a checklist of 'suggestions' can appear somewhat anaemic and without substance. This chapter begins, therefore, with an account of how data analysis developed in one qualitative study.

A CASE STUDY: OBSERVING HEART CLINICS

In the early 1980s (see Silverman, 1987: Chs 1-6) I was directing a group of researchers studying a paediatric cardiology (child heart) unit. Much of our data derived from tape-recordings of an outpatient clinic that was held every Wednesday. It was not a coincidence that we decided to focus on this clinic rather than upon, say, interaction on the wards. Pragmatically, we knew that the clinic, as a scheduled and focused event, lasting between two and four hours and tied to particular outcomes, would be likely to give us a body of good quality data. By contrast, on the ward, tape-recording would be much more intrusive and produce tapes of poorer quality because of multiple conversations and background noise. Even if these technical problems could be overcome, the (apparently) unfocused character of ward life meant that it would be far harder to see order than in the outpatient clinic. For instance, unlike the latter, there would be no obvious repetitive structures like scheduled meetings by appointment, physical examinations and announcements of diagnosis and prognosis. Of course, this does not mean that a researcher should never study apparently unfocused encounters - from the hospital ward to the street corner.
But it does mean that, if you do, you must be prepared for long vigils and apparently unpromising data before researchable ideas start to gel.

At our hospital clinic, we became interested in how decisions (or 'disposals') were organized and announced. It seemed likely that the doctor's way of announcing decisions was systematically related not only to clinical factors (like the child's heart condition) but to social factors (such as what parents would be told at various stages of treatment). For instance, at a first outpatients' consultation, doctors would not normally announce to parents the discovery of a major heart abnormality and the necessity for life-threatening surgery. Instead, they would suggest the need for more tests and only hint that major surgery might be needed. They would also collaborate with parents who produced examples of their child's apparent 'wellness'.

This step-by-step method of information giving was avoided in only two cases. If a child was diagnosed as 'healthy' by the cardiologist, the doctor would give all the information in one go and would engage in what we called a 'search and destroy' operation, based on eliciting any remaining worries of the parent(s) and proving that they were mistaken. In the case of a group of children with the additional handicap of Down's syndrome, as well as suspected cardiac disease, the doctor would present all the clinical information at one sitting, avoiding a step-by-step method. Moreover, atypically, the doctor would allow parents to make the choice about further treatment, while encouraging them to dwell on non-clinical matters like their child's 'enjoyment of life' or friendly personality.

We then narrowed our focus to examine how doctors talked to parents about the decision to have a small diagnostic test on their children. In most cases, the doctor would say something like:

    What we propose to do, if you agree, is a small test.
No parent disagreed with an offer which appeared to be purely formal - like the formal right (never exercised) of the Queen not to sign legislation passed by the British Parliament. For Down's syndrome children, however, the parents' right to choose was far from formal. The doctor would say things to them like the following:

    I think what we would do now depends a little bit on parents' feelings.
    Now it depends a little bit on what you think.
    It depends very much on your own personal views as to whether we should proceed.

Moreover, these consultations were longer and apparently more democratic than elsewhere. A view of the patient in a family context was encouraged and parents were given every opportunity to voice their concerns and to participate in decision-making. In this subsample, unlike the larger sample, when given a real choice, parents refused the test - with only one exception. Yet this served to reinforce rather than to challenge the medical policy in the unit concerned. This policy was to discourage surgery, all things being equal, on such children. So the democratic form coexisted with (and was indeed sustained by) the maintenance of an autocratic policy.

TABLE 11.1 Four ways to develop data analysis
- Focus on data of high quality which are easiest to collect (tape-recordings of clinics)
- Focus on one process within those data (how medical 'disposals' are organized)
- Narrow down to one part of that process (announcing a small diagnostic test)
- Compare different subsamples of the population

    ... (Coffey and Atkinson, 1996: 46)

As Miles and Huberman (1984) point out, qualitative data come in the form of words rather than in numbers. The issue, then, is how we move from these words to data analysis. They suggest that data analysis consists of 'three concurrent flows of activity: data reduction, data display and conclusion drawing/verification' (1984: 21):
- Data reduction 'refers to the process of selecting, focusing, simplifying, abstracting, and transforming ... "raw" data' (1984: 21). Data reduction involves making decisions about which data chunks will provide your initial focus.
- Data display is 'an organized assembly of information that permits conclusion drawing and action taking' (1984: 21). It involves assembling your data into displays such as matrices, graphs, networks and charts which clarify the main direction (and missing links) of your analysis.
- Conclusion drawing means 'beginning to decide what things mean, noting regularities, patterns, explanations, possible configurations, causal flows and propositions' (1984: 22).
- Verification means testing our provisional conclusions for 'their plausibility, their sturdiness, their "confirmability" - that is, their validity' (1984).

Miles and Huberman demonstrate that in field studies, unlike much quantitative research, we are not satisfied with a simple coding of data. As I argued in Chapter 3, this means that qualitative researchers have to show how the (theoretically defined) elements that they have identified are assembled or mutually laminated. The distinctive contribution qualitative research can make is by utilizing its theoretical resources in the deep analysis of small bodies of publicly shareable data.

This means that coding your data according to some theoretical scheme should only be the first stage of your data analysis. You will then need to go on to examine how these elements are linked together. At this second stage, lateral thinking can help. For instance, you can attempt to give your chosen concept or issue a new twist, perhaps by pursuing a counter-intuitive idea or by noting an additional feature little addressed in the literature. In any event, as I show below, one way of achieving better data analysis is by a steadily more narrow focus.
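Miles and Huberman's idea of a data display can be sketched as a simple case-by-code matrix built from coded data chunks. The sketch below is only an illustration: the clinic names and codes are invented, not taken from any actual study.

```python
# Sketch of a Miles-and-Huberman-style data display: a case-by-code matrix
# built from coded data chunks. The cases and codes here are invented examples.
from collections import Counter

# Each coded chunk after data reduction: (case identifier, code assigned)
coded_chunks = [
    ("clinic_A", "step-by-step info"),
    ("clinic_A", "parental choice"),
    ("clinic_A", "step-by-step info"),
    ("clinic_B", "all-at-once info"),
    ("clinic_B", "parental choice"),
]

# Data display: tabulate how often each code appears in each case
counts = Counter(coded_chunks)
cases = sorted({case for case, _ in coded_chunks})
codes = sorted({code for _, code in coded_chunks})

# Print the matrix, one row per case, one column per code
header = "".join(f"{code:>20}" for code in codes)
print(f"{'':<10}{header}")
for case in cases:
    row = "".join(f"{counts[(case, code)]:>20}" for code in codes)
    print(f"{case:<10}{row}")
```

A matrix like this does not replace the qualitative analysis; it simply makes the main direction (and missing links) of the coding visible at a glance, as Miles and Huberman suggest.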
Progressive focusing in fieldwork

We only come to look at things in certain ways because we have adopted, either tacitly or explicitly, certain ways of seeing. This means that, in observational research, data collection, hypothesis construction and theory building are not three separate things but are interwoven with one another. This process is well described by using an analogy with a funnel:

    Ethnographic research has a characteristic 'funnel' structure, being progressively focused over its course. Progressive focusing has two analytically distinct components. First, over time the research problem is developed or transformed, and eventually its scope is clarified and delimited and its internal structure explored. In this sense, it is frequently only over the course of the research that one discovers what the research is really 'about', and it is not uncommon for it to turn out to be about something quite remote from the initially foreshadowed problems. (Hammersley and Atkinson, 1983: 175)

Atkinson (1992) gives an example of such a redefinition of a research problem. Many years after completing his PhD, Atkinson returned to his original fieldnotes on medical education. He shows how the original data can be reread in a quite different way. Atkinson's earlier method had been to fragment his fieldnotes into relatively small segments, each with its own category. For instance, a surgeon's description of post-operative complications to a surgical team was originally categorized under such headings as 'unpredictability', 'uncertainty', 'patient career' and 'trajectory'. When Atkinson returns to it, it becomes an overall narrative which sets up an enigma ('unexpected complications') which is resolved in the form of a 'moral tale' ('beware, unexpected things can happen'). A second case of progressive focusing arose in my own comparison of fee-for-service and state-provided medicine (Silverman, 1981; 1987).
These two cases had three features in common:

1 The switch of focus - through the 'funnel' - as a more defined topic arose.
2 The use of the comparative method as an invaluable tool of theory building and testing.
3 The generation of topics with a scope outside the substantive area of the research.

Thus the 'ceremonial orders' found in the cancer clinics are not confined to medicine, while the 'democratic' decision-making found with the Down's syndrome children had unexpected effects of power with a significance far beyond medical encounters.

As I have noted elsewhere (Silverman, 1993), working this way parallels Glaser and Strauss's (1967) famous account of grounded theory. A simplified model of this involves these stages:

- an initial attempt to develop categories which illuminate the data
- an attempt to 'saturate' these categories with many appropriate cases in order to demonstrate their relevance
- developing these categories into more general analytic frameworks with relevance outside the setting.

Glaser and Strauss use their research on death and dying as an example. They show how they developed the category of 'awareness contexts' to refer to the kinds of situations in which people were informed of their likely fate. The category was then saturated and finally related to non-medical settings where people learn about how others define them (e.g. schools).

'Grounded theory' has been criticized for its failure to acknowledge implicit theories which guide work at an early stage. It is also clearer about the generation of theories than about their testing. Used unintelligently, it can also degenerate into a fairly empty building of categories or into a mere smokescreen used to legitimize purely empiricist research (see my critique of four qualitative studies in Chapter 26; and see Bryman, 1988: 83-7).
At best, 'grounded theory' offers an approximation of the creative activity of theory building found in good observational work, compared with the dire abstracted empiricism present in the most wooden statistical studies.

However, quantification should not be seen as the enemy of good field research. In the next section, I discuss one example of how simple tabulations were used to test an emergent hypothesis in the study of cancer clinics.

Using tabulations in testing fieldwork hypotheses

In the cancer study, I used a coding form which enabled me to collate a number of crude measures of doctor and patient interactions (Silverman, 1984). The aim was to demonstrate that the qualitative analysis was reasonably representative of the data as a whole. Occasionally, the figures revealed that the reality was not in line with my overall impressions. Consequently, the analysis was tightened and the characterizations of clinic behaviour were specified more carefully.

The crude quantitative data I had recorded did not allow any real test of the major thrust of this argument. Nonetheless, they did offer a summary measure of the characteristics of the total sample which allowed closer specification of features of private and NHS clinics. In order to illustrate this, let me briefly show you the kind of quantitative data I gathered on topics like consultation length, patient participation and the scope of the consultation.

My overall impression was that private consultations lasted considerably longer than those held in the NHS clinics. When examined, the data indeed did show that the former were almost twice as long as the latter (20 minutes as against 11 minutes) and that the difference was statistically highly significant. However, I recalled that, for special reasons, one of the NHS clinics had abnormally short consultations.
I felt a fairer comparison of consultations in the two sectors should exclude this clinic and should only compare consultations taken by a single doctor in both sectors. This subsample of cases revealed that the difference in length between NHS and private consultations was now reduced to an average of under 3 minutes. This was still statistically significant, although the significance was reduced. Finally, however, if I compared only new patients seen by the same doctor, NHS patients got 4 minutes more on average - 34 minutes as against 30 minutes in the private clinic. This last finding was not suspected and had interesting implications for the overall assessment of the individual's costs and benefits from 'going private'. It is possible, for instance, that the lighter scheduling of appointments at the private clinic may limit the amount of time that can be given to new patients.

As a further aid to comparative analysis, I measured patient participation in the form of questions and unelicited statements. Once again, a highly significant difference was found; on this measure, private patients participated much more in the consultation. However, once more taking only patients seen by the same doctor, the difference between the clinics became very small and was not significant. Finally, no significant difference was found in the degree to which non-medical matters (e.g. the patient's work or home circumstances) were discussed in the clinics.

These quantitative data were a useful check on over-enthusiastic claims about the degree of difference between the NHS and private clinics. However, as I argued in Chapter 10, my major concern was with the 'ceremonial order' of the three clinics. I had amassed a considerable number of exchanges in which doctors and patients appeared to behave in the private clinic in a manner deviant from what we know about NHS hospital consultations.
The question was: would the quantitative data offer any support to my observations? The answer was, to some extent, positive. Two quantitative measures were helpful in relation to the ceremonial order. One dealt with the extent to which the doctor fixed treatment or attendance at the patients' convenience. The second measured whether patients or doctor engaged in polite small-talk with one another about their personal or professional lives. (I called this 'social elicitation'.) As Table 11.3 shows, both these measures revealed significant differences, in the expected direction, according to the mode of payment.

TABLE 11.3 Private and NHS clinics: ceremonial orders

                                        Private clinic (n = 42)   NHS clinics (n = 104)
Treatment or attendance fixed
  at patients' convenience                     15 (36%)                 10 (10%)
Social elicitation                             25 (60%)                 31 (30%)

Source: Silverman, 1993

Now, of course, such data could not offer proof of my claims about the different interactional forms. However, coupled with the qualitative data, they provided strong evidence of the direction of difference, as well as giving me a simple measure of the sample as a whole which contextualized the few extracts of talk I was able to use. I do not deny that counting can be as arbitrary as qualitative interpretation of a few fragments of data. However, providing researchers resist the temptation to try to count everything, and base their analysis on a sound conceptual basis linked to actors' own methods of ordering the world, then each type of data can inform the analysis of the other.

In Chapter 13, I return to the role of counting as an aid to validity in qualitative research. In the case of observational studies, such counting will often be based on the prior coding of fieldnotes. I now, therefore, turn to the issues that arise in such coding.
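The kind of significance test behind Table 11.3 can be sketched from its published counts. Note an assumption: the study does not name the statistic used, so the Pearson chi-square test for a 2x2 table shown here is only one standard choice for data of this form.

```python
# Sketch: a 2x2 chi-square test applied to the counts reported in Table 11.3.
# Assumption: the original study does not say which statistic was used;
# chi-square is used here purely for illustration.

def chi2_2x2(a, b, c, d):
    """Pearson chi-square for a 2x2 table [[a, b], [c, d]], no continuity correction."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Treatment/attendance fixed at patients' convenience:
# private 15 of 42, NHS 10 of 104
chi2_convenience = chi2_2x2(15, 42 - 15, 10, 104 - 10)

# Social elicitation: private 25 of 42, NHS 31 of 104
chi2_elicitation = chi2_2x2(25, 42 - 25, 31, 104 - 31)

CRITICAL_05 = 3.84  # chi-square critical value at df = 1, p = 0.05

for label, chi2 in [("convenience", chi2_convenience),
                    ("elicitation", chi2_elicitation)]:
    verdict = "significant" if chi2 > CRITICAL_05 else "not significant"
    print(f"{label}: chi2 = {chi2:.2f}, {verdict} at p < 0.05")
```

On both measures the statistic comfortably exceeds the 5% critical value, which is consistent with the 'significant differences' the table is said to reveal.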
Limits in coding fieldnotes

The tabulations used in the cancer study derived from:

    that well-established style of work whereby the data are inspected for categories and instances. It is an approach that disaggregates the text (notes or transcripts) into a series of fragments, which are then regrouped under a series of thematic headings. (Atkinson, 1992: 455)

Such coding by thematic headings has recently been aided by computer-aided qualitative data analysis systems, as discussed in Chapter 12. In larger projects, the reliability of coding is also buttressed by training coders of data in procedures which aim to ensure a uniform approach. However, there remain two problems with coding fieldnotes.

The first, and more obvious, problem is that every way of seeing is also a way of not seeing. As Atkinson points out, one of the disadvantages of coding schemes is that, because they are based upon a given set of categories, they furnish 'a powerful conceptual grid' (1992: 459) from which it is difficult to escape. While this 'grid' is very helpful in organizing the data analysis, it also deflects attention away from uncategorized activities. Therefore, as Clive Seale (personal correspondence) has noted: 'A good coding scheme would reflect a search for "uncategorized activities" so that they could be accounted for, in a manner similar to searching for deviant cases.'

The second, less obvious, problem is that, as I pointed out in Chapter 3, 'coding' is not the preserve of research scientists. All of us 'code' what we hear and see in the world around us. This is what Garfinkel (1967) and Sacks (1992) mean when they say that societal members, like social scientists, make the world observable and reportable. Put at its simplest, this means that researchers must be very careful how they use categories. For instance, Sacks quotes from two linguists who appear to have no problem in characterizing particular (invented) utterances as 'simple', 'complex', 'casual' or 'ceremonial'.
For Sacks, such rapid characterizations of data assume 'that we can know that [such categories are accurate] without an analysis of what it is [members] are doing' (1992, Vol. 1: 429).

At this point, the experienced researcher might respond that Sacks has characterized conventional research as over-naive. In particular, most researchers are aware of the danger of assuming any one-to-one correspondence between their categories and the aspects of 'reality' which they purport to describe. Instead, following Weber (1949), many researchers claim that they are simply using hypothetical constructs (or 'ideal types') which are only to be judged in relation to whether they are useful, not whether they are accurate or true. However, Sacks was aware of this argument. As he notes:

    It is a very conventional way to proceed in the social sciences to propose that the machinery you use to analyze some data you have is acceptable if it is not intendedly the analysis of real phenomena. That is, you can have machinery which is a 'valid hypothetical construct', and it can analyze something for you. (1992, Vol. 1: 315)

By contrast, the 'machinery' in which Sacks is interested is not a set of 'hypothetical constructs'. Instead, Sacks's ambitious claim is throughout 'to be dealing with the real world' (1992, Vol. 1: 316). The 'machinery' he sets out, then, is to be seen not as a set of more or less useful categories but as the actual categories and mechanisms that members use. In this sense, he points out:

    I intend that the machinery I use to explain some phenomenon, to characterize how it gets done, is just as real as the thing I started out to explain. (1992, Vol. 1: 315, my emphasis)

How should we respond to Sacks's radical critique of ethnography? The first point is not to panic! Sacks offers a challenge to conventional observational work of which everybody should be aware.
In particular, Sacks's lecture 'Doing "being ordinary"' (1992, Vol. 2: 215-21) is essential reading for every fieldworker. However, awareness does not mean that everybody has to follow Sacks's radical path. So one response is to state something like 'thanks but no thanks'. For instance, 'grounded theory' is an equally respectable (and much more popular) way of theorizing (about) fieldwork. To this effective but essentially defensive manoeuvre, we can add two more ambitious responses. First, we can seek to integrate Sacks's questions about 'how' the social world is constituted with more conventional ethnographic questions about the 'whats' and 'whys' of social life (Gubrium and Holstein, 1997). Or, second, as I describe below, we can make this everyday 'coding' (or 'interpretive practice') the object of inquiry by asking 'how' questions about talk-in-interaction.

TRANSCRIPTS AND DATA ANALYSIS

The two main social science traditions which inform the analysis of transcripts of tapes are conversation analysis (CA) and discourse analysis (DA). For an introduction to CA, see ten Have (1998); for DA, see Potter and Wetherell (1987) and Potter (1997). In this book we are, of course, more concerned with the practicalities of doing qualitative research. In the rest of this chapter I will, therefore, deal with two practical issues:

- the advantages of working with tapes and transcripts
- the elements of how to do analysis of such tapes.

Why work with tapes?

    The kind of phenomena I deal with are always transcriptions of actual occurrences in their actual sequence. (Sacks, 1984: 25)

The earlier ethnographers had generally relied on recording their observations through fieldnotes. Why did Sacks prefer to use an audio recorder? Sacks's answer is that we cannot rely on our recollections of conversations. Certainly, depending on our memory, we can usually summarize what different people said.
But it is simply impossible to remember (or even to note at the time) such matters as pauses, overlaps, inbreaths and the like. Now whether you think these kinds of things are important will depend upon what you can show with or without them. Indeed, you may not even be convinced that conversation itself is a particularly interesting topic. But at least by studying tapes of conversations, you are able to focus on the 'actual details' of one aspect of social life. As Sacks put it:

    My research is about conversation only in this incidental way, that we can get the actual happenings of on tape and transcribe them more or less, and therefore have something to begin with. If you can't deal with the actual detail of actual events then you can't have a science of social life. (1992, Vol. 2: 26)

Tapes and transcripts also offer more than just 'something to begin with'. In the first place, they are a public record, available to the scientific community, in a way that fieldnotes are not. Second, they can be replayed and transcriptions can be improved and analyses taken off on a different tack unlimited by the original transcript. As Sacks told his students:

    I started to play around with tape recorded conversations, for the single virtue that I could replay them; that I could type them out somewhat, and study them extendedly, who knew how long it might take ... It wasn't from any large interest in language, or from some theoretical formulation of what should be studied, but simply by virtue of that: I could get my hands on it, and I could study it again and again. And also, consequentially, others could look at what I had studied, and make of it what they could, if they wanted to disagree with me. (1992, Vol. 1: 622)

A third advantage of detailed transcripts is that, if you want to, you can inspect sequences of utterances without being limited to the extracts chosen by the first researcher.
For it is within these sequences, rather than in single turns of talk, that we make sense of conversation. As Sacks points out:

    having available for any given utterance other utterances around it, is extremely important for determining what was said. If you have available only the snatch of talk that you're now transcribing, you're in tough shape for determining what it is. (1992, Vol. 1: 729)

It should not be assumed that the preparation of transcripts is simply a technical detail prior to the main business of the analysis. The convenience of transcripts for presentational purposes is no more than an added bonus. As Atkinson and Heritage (1984) point out, the production and use of transcripts are essentially 'research activities'. They involve close, repeated listenings to recordings which often reveal previously unnoted recurring features of the organization of talk.

Such listenings can most fruitfully be done in group data sessions. As described by Paul ten Have (1998), work in such groups usually begins by listening to an extract from a tape with a draft transcript and agreeing upon improvements to the transcript. Then:

    the participants are invited to proffer some observations on the data, to select an episode which they find 'interesting' for whatever reason, and formulate their understanding or puzzlement regarding that episode. Then anyone can come in to react to these remarks, offering alternatives, raising doubts, or whatever. (1998)

However, as ten Have makes clear, such group data sessions should be rather more than an anarchic free-for-all:

    participants are, on the one hand, free to bring in anything they like, but, on the other hand, required to ground their observations in the data at hand, although they may also support them with reference to their own data-based findings or those published in the literature.
    (1998: ibid.)

Analysing tapes

There is a strongly inductive bent to the kind of research that ten Have and Sacks describe. As we have seen, this means that any research claims need to be identified in precise analyses of detailed transcripts. It is therefore necessary to avoid premature theory construction and the 'idealization' of research materials which uses only general, non-detailed characterizations. Heritage sums up these assumptions as follows:

    Specifically, analysis is strongly 'data-driven' - developed from phenomena which are in various ways evidenced in the data of interaction. Correspondingly, there is a strong bias against a priori speculation about the orientations and motives of speakers and in favour of detailed examination of conversationalists' actual actions. Thus the empirical conduct of speakers is treated as the central resource out of which analysis may develop. (1984: 243)

In practice, Heritage adds, this means that it must be demonstrated that the regularities described can be shown to be produced by the participants and attended to by them as grounds for their own inferences and actions. Further, deviant cases in which such regularities are absent must be identified and analysed.

However, the way in which CA obtains its results is rather different from how we might intuitively try to analyse talk. It may be helpful, therefore, if I conclude this section by offering a crude set of prescriptions about things to do and things to avoid in CA. These are set out in Tables 11.4 and 11.5.

TABLE 11.4 How to do CA
1 Always try to identify sequences of related talk
2 Try to examine how speakers take on certain roles or identities through their talk (e.g. questioner-answerer or client-professional)
3 Look for particular outcomes in the talk (e.g.
a request for clarification, a repair, laughter) and work backwards to trace the trajectory through which a particular outcome was produced
Source: Silverman, 1998b

TABLE 11.5 Common errors in CA
1 Explaining a turn of talk by reference to the speaker's intentions
2 Explaining a turn of talk by reference to a speaker's role or status (e.g. as a doctor or as a man or woman)
3 Trying to make sense of a single line of transcript or utterance in isolation from the surrounding talk
Source: Silverman, 1998b

If we follow these rules, the analysis of conversations does not require exceptional skills. As Schegloff puts it, in his introduction to Sacks's collected lectures, all we need to do is to 'begin with some observations, then find the problem for which these observations could serve as ... the solution' (Schegloff in Sacks, 1992, Vol. 1: xlviii). This means that doing the kind of systematic data analysis that CA demands is not an impossibly difficult activity. As Harvey Sacks once pointed out, in doing CA we are only reminding ourselves about things we already know. As Sacks remarks:

    I take it that lots of the results I offer, people can see for themselves. And they needn't be afraid to. And they needn't figure that the results are wrong because they can see them ... [It is] as if we found a new plant. It may have been a plant in your garden, but now you see it's different than something else. And you can look at it to see how it's different, and whether it's different in the way that somebody has said. (1992, Vol. 1: 488)

CONCLUDING REMARKS

Using the examples of tapes and fieldnotes, we have seen how data analysis can be developed after the first stages.
However, as I have implied throughout, good data analysis is never just a matter of using the right methods or techniques but is always based on theorizing about data using a consistent model of social reality. This commitment to theorizing about data makes the best qualitative research far superior to the stilted empiricism of the worst kind of quantitative research. However, theorization without methodological rigour is a dangerous brew. In Chapter 12, we consider how computer software can aid qualitative research. Then, in Chapter 13, the issues of validity and reliability are discussed.

SUMMARY

Data analysis can be developed in five ways:

1 Focus on data which are of high quality and are easiest to collect.
2 Focus on one process within those data.
3 Narrow down to one part of that process.
4 Compare different subsamples of the population using the comparative method.
5 Generate topics with a scope outside the substantive area of the research.

Glaser and Strauss's (1967) famous account of grounded theory offers one way of developing analysis of observational data. It involves three stages:

1 an initial attempt to develop categories which illuminate the data
2 an attempt to 'saturate' these categories with many appropriate cases in order to demonstrate their relevance
3 developing these categories into more general analytic frameworks with relevance outside the setting.

Developing simple counting mechanisms can be a further useful way of identifying deviant cases and thereby developing generalizations. However, you should resist the temptation to try to count everything and try to base your analysis on a sound conceptual footing - often linked to actors' own methods of ordering the world.

If you want to try conversation analysis on transcripts, follow these rules:

1 Always try to identify sequences of related talk.
2 Try to examine how speakers take on certain roles or identities through their talk (e.g. questioner-answerer or client-professional).
3 Look for particular outcomes in the talk (e.g. a request for clarification, a repair, laughter) and work backwards to trace the trajectory through which a particular outcome was produced.

Further reading

Miles and Huberman's book Qualitative Data Analysis (Sage, 1984) provides a useful treatment of coding observational data. For a more recent discussion, see Robert Emerson et al.'s Writing Ethnographic Fieldnotes (University of Chicago Press, 1995). Hammersley and Atkinson's Ethnography: Principles in Practice (Tavistock, 1983), Chapters 7-8, is a classic discussion of how to analyse ethnographic data. A development of some of these ideas can be found in Martyn Hammersley's What's Wrong with Ethnography? Methodological Explorations (Routledge, 1992). A relatively recent treatment of 'grounded theory' is to be found in Strauss and Corbin's Basics of Qualitative Research (Sage, 1990). Sacks's work on conversation analysis is discussed in David Silverman, Harvey Sacks: Social Science and Conversation Analysis (Polity, 1998). The case studies of the cancer and heart clinics discussed here are found in David Silverman, Communication and Medical Practice (Sage, 1987), Chapters 6-7.

Exercise 11.1

This exercise is based on the various ways to develop data analysis discussed in this chapter. With reference to your own data:

1 Focus on one process within those data. Now narrow down your focus to one part of that process. Survey your data in terms of this narrow focus. What can you now find?
2 Compare different subsamples of your data in terms of a single category or process. What does this show?
3 Decide what features of your data may properly be counted and tabulate instances of a particular category. What does this tabulation indicate? Identify 'deviant' cases and explain what you will do with them.
4 Attempt to develop your categories into more general analytic frameworks with relevance outside the setting you are studying.