Social identities and risk: expert and lay imaginations on pesticide use

Anders Blok, Mette Jensen and Pernille Kaltoft

Expert-based environmental and health risk regulation is widely believed to suffer from a lack of public understanding and legitimacy. On controversial issues such as genetically modified organisms and food-related chemicals, a “lay–expert discrepancy” in the assessment of risks is clearly visible. In this article, we analyze the relationship between scientific experts and ordinary lay citizens in the context of risks from pesticide usage in Denmark. Drawing on concepts from the “sociology of scientific knowledge” (SSK), we contend that differences in risk perception must be understood at the level of social identities. On the basis of qualitative interviews with citizens and experts, respectively, we focus on the multiple ways in which identities come to be employed in actors’ risk accounts. Empirically, we identify salient characteristics of “typical” imagined experts and lay-people, while arguing that these conceptions vary identifiably between four groups of citizens and experts. On the basis of our findings, some implications for bridging the lay–expert discrepancy on risk issues are sketched out.

1. Introduction: the “lay–expert discrepancy” on environmental risks

During the past 30 to 40 years, environmental risks have emerged as more or less permanent causes of serious concern amongst citizens of Western late-modern “risk societies” such as the Danish. From industrial pollution, chemicals and atomic energy issues of the 1970s through to more “global” issues of ozone depletion, biodiversity and climate change in the 1990s, risks to the environment are never far from media headlines and public consciousness. Meanwhile, political risk management efforts have expanded significantly, relying mostly on various forms of scientifically trained expertise.
Through techniques of risk assessment, experts often come to define the boundaries of environmental problems as well as their proper solution. Inevitably, citizens’ concerns with environmental risks correspondingly involve some form of engagement with environmental experts. Expert-based environmental risk regulation is, however, widely believed to suffer from a lack of public understanding and legitimacy. In particular, numerous studies have documented that scientifically trained experts tend to perceive environmental and health-related risks differently from the way lay-people do (Kraus et al., 1992; Sjöberg, 1998; Hansen et al., 2003). In most cases, experts tend to have lower perceptions of risk than the general public, as also evidenced in public controversies on issues such as bovine spongiform encephalopathy (BSE), genetically modified organisms (GMOs) and food-related chemicals. Experts on their part often attribute public anxieties to people’s lack of relevant knowledge and susceptibility to emotional fears. A “lay–expert discrepancy” in the assessment of environmentally related risks is thus clearly visible, and is increasingly recognized as a significant problem for public risk regulation agencies to consider.

© SAGE Publications. Public Understanding of Science 17 (2008) 189–209. ISSN 0963-6625. DOI: 10.1177/0963662506070176
In this paper, we analyze the relationship between scientific experts and ordinary lay citizens in the context of a particular environmental and health issue: risks from pesticide usage in agriculture and private homes.1 Empirically, the analysis is based on 16 qualitative interviews, eight with pesticides experts and eight with ordinary citizens, as well as two focus group interviews.2 The interviews revolve around issues of risk perception, regulation and general attitudes towards pesticide usage. In this paper, however, we abstract somewhat from these issues, focusing instead on the multiple ways in which identities of “experts” and “lay-people” are used and understood in the interviews. More specifically, we analyze how experts and citizens, respectively, conceive of or model the roles and competencies of “experts” and “lay-people” in the context of pesticides risks. We will talk about these discursive models as “imagined experts” and “imagined lay-people,” and try to map out their characteristics.3 By analyzing how experts and lay-people conceive of each other and themselves, we hope to gain a better understanding of the terms on which lay–expert risk dialogue is presently conducted. On the analytical level, the main assumption is that, in order to understand the lay–expert discrepancy in risk perception and communication, attention must be paid to the question of social identities (Wynne, 1996b: 42). Our analysis thus draws heavily on concepts and insights developed within the tradition of “sociology of scientific knowledge” (SSK), in particular as this connects to environmental sociology and the environmental arena more broadly. Within this framework, we refer to studies of lay-people’s uptake of scientific forms of reasoning, in particular in relation to broader issues of “local” knowledge, identities, and trust in experts and their institutions (e.g. Irwin et al., 1996, 1999).
Additionally, we borrow analytical concepts from studies of experts’ and expert institutions’ implicit understanding of lay-people as non-knowledgeable and emotional, often dubbed the “deficit model” (Wynne, 1989, 1992). While the first strand of studies is usually qualitative in approach, actually interviewing experts in order to understand their views on lay-people, as in this study, is much less common. Through this “methodological equality” approach to experts and lay-people, we aim to address the lay–expert relationship in a direct, non-hierarchical way.4 The paper falls into seven main sections, including this introduction. In the second section, we briefly introduce central theoretical issues relating to SSK, experts, and the “public understanding of science” framework. Continuing from this, the third section develops an analytical matrix for understanding imagined experts and imagined lay-people. Sections 4 and 5 present main empirical findings from interviews with lay-people and experts, respectively, while these strands are drawn together to address the lay–expert relationship directly in Section 6. The final section summarizes the main conclusions.

2. Experts, PUS and the “deficit model”

As an environmental and health issue, risks from pesticides usage belong to the realm of “technical decision-making,” in the sense that science and technology intersect here with political issues of clear relevance to the public (Collins and Evans, 2002). Notably, pesticides may cause groundwater contamination, contribute to losses of biodiversity, cause work-related illnesses to farmers, and present health risks to consumers in the shape of pesticides residues in food. In each of these specific areas, different risk regulation experts rely mainly on various quantitative risk assessment methods, as a basis for risk management and communication (Hansen et al., 2003: 112).
SSK analysts refer to this as “regulatory science”: highly complex institutional networks in which experts conduct technical research in response to the requirements of governments and industries such as the agrochemical sector (Irwin and Rothstein, 2003). Lay-people usually have little or no access to such expert institutional settings, thus mainly encountering experts via the mass media. Experts are clearly ascribed “cognitive authority” (cf. Turner, 2003: 24ff.) to manage pesticides risks on behalf of society, and risk communication is primarily one-way from experts to lay-people. However, faced with apparently growing public resistance to technical and environmental change, public authorities have generally sought new ways of dealing with the wider public. One early and still dominant response has of course been a widening concern with the “public understanding of science” (PUS) (Irwin and Michael, 2003).5 More often than not, PUS research has been informed by a “knowledge gap” view: the main underlying assumption has been that mistrust towards science must be generated by public ignorance (Hansen et al., 2003: 112).6 To combat such public ignorance, the solution has often been straightforward: educating the public and communicating a “realistic,” i.e. science-based, perception of risks (Horst, 2003: 20). As pointed out by critics within the SSK community, most notably Brian Wynne, this approach amounts to a “deficit model” of a public lacking in scientific knowledge (e.g. Wynne, 1989, 1992). As he and others have elaborately argued, this model relies on several problematic normative assumptions. It systematically downgrades other forms of knowledge, such as experiences and skills, seen as “distorted” and subjective (Irwin and Michael, 2003: 26f.), while simply assuming that expert knowledge, i.e. 
of pesticides risks, is “correct” and objective (Hansen et al., 2003).7 While still clearly present, in recent years the deficit view of lay–expert relations may be slowly changing, at least as the sole foundation for risk management. As public confidence in scientific risk management has apparently become more fragile, authorities are increasingly using the language of public “dialogue” and “engagement” with science and technology (Irwin and Michael, 2003: 47ff.).8 In the context of Denmark, where this research project is situated, such tendencies have long been present.9 Recently, in a symptomatic publication on risk communication guidelines for the Danish environmental protection agency, dialogue and public engagement are highlighted as necessary. Within this “dialogue model,” a conception of lay-people different from the deficit view is propagated, in that lay-people’s knowledge and experience are seen to contain valuable insights for problem solving (EPA, 2004: 56).10 In what follows, we stage such a lay–expert dialogue within the specific frame of risk perceptions of pesticides, by exploring in more depth the different models of “lay-people” and “experts.” We do so firstly by establishing an analytical map of lay and expert competencies.

3. Imagined experts, imagined lay-people

Construing lay-people along the lines of the “deficit model” represents a powerful social tool. There are other, very different, ways in which lay-people may come to be defined and enrolled, however. Correspondingly, expert roles and competencies may be construed in various ways, ranging from the cognitively authoritative, non-biased fact-finder of PUS to a more “humble,” if still necessary, partner in public dialogue. The important point here is that, as highlighted by SSK analysts, the very categories of “lay-people” and “expert” are far from unambiguous.
Rather than being fixed and stable, these role positions may be seen as the outcome of processes of negotiation and boundary setting, taking place in social contexts.11 However, relatively little is known about how experts and lay-people conceive of each other’s identities and competencies. This is what we address by the concepts of imagined experts and imagined lay-people. In this context, imagined experts and imagined lay-people refer to conceptions of experts and lay-people, respectively, as they are expressed, displayed or manifested by members of each group. Imagined experts and lay-people need not bear full resemblance to real experts and lay-people. Rather, they describe how real experts and lay-people imagine typical or representative actors in these groups. In other words, imagined experts and imagined lay-people are mental models, as manifested by actors through social and discursive practices. Such practices may take various forms, including the writing of scientific advisory reports, communicating in the mass media, or establishing science exhibitions (Maranta et al., 2003). Since our empirical material consists of qualitative interviews, our use of the concepts refers to discursive manifestations in this context. Our analysis relies, however, on the notion that mental models may carry social and political consequences. For instance, the “deficit model” represents in these terms a particularly dominant example of experts’ imagined lay-people, with strong policy implications. Put schematically, we distinguish between four types of mental models: 1) experts’ imagined lay-people, 2) experts’ imagined experts, 3) lay-people’s imagined experts, and 4) lay-people’s imagined lay-people.
In our analysis, categories 1 and 3 thus represent the “other-directed” ascribed role positions of lay-people and experts, whereas categories 2 and 4 represent the “self-directed” identities of these groups. To organize our analysis in a coherent fashion, we treat each of the four categories according to a common analytical matrix, distinguishing between two fundamental attributes of imagined experts and lay-people: “knowledge” and “values.” We consider each as a continuum, from possessing no to possessing all relevant knowledge and/or values. Distinguishing between “knowledge” and “values” obviously relies on a centuries-long philosophical tradition, and more recently the knowledge–value relationship has become a central aspect of discussions on democratic environmental management (Pellizzoni, 1999).12 More importantly in this context, it proves a valuable heuristic tool for analyzing actors’ conceptions of expert and lay-people competencies. The analytical matrix may be depicted as in Figure 1. As indicated in the matrix, the models of lay and expert identities as found in PUS discussions embody particular ideas on the distribution of knowledge and values. The standard PUS rendering tends to implicitly see experts as possessors of all knowledge relevant to the issue and at the same time as objective and “unbiased” by values, a position termed here the “PUS expert” model. Correspondingly, the “deficit lay” model, as already described, represents lay-people as possessing very limited relevant knowledge, while being subjectively “biased” by emotions and values. These two polar positions may be seen as components of a “classical” lay–expert relationship. Alternative conceptions are illustrated in the matrix, however, by the “dialogue model.” The basic idea of the “dialogue” lay and expert positions is that, as partners in mutual exchange, lay-people and experts are imagined as closer to each other in competencies. 
The exact positioning of these imagined lay-people and experts in the matrix is not the main point here. Rather, the idea is to illustrate the variety of different lay and expert models, and how in broad terms these differ from each other in terms of ascribed knowledge and value attributes. Furthermore, these four models are by no means meant to be exhaustive of the possible positions. Throughout the empirical analysis in the next sections, more positions will be added. This analysis will be an explorative endeavor, pointing to broad tendencies in our material, but obviously unsuited for drawing hard-and-fast, statistically reliable generalizations.13 We turn first to the lay perspective.

4. Lay-people’s imagined experts and lay–expert relations

Throughout the early 1990s, environmental and health risks from pesticides became a much-debated issue in the Danish mass media, as well as amongst government experts and regulators.14 Since then, mass-mediated “story-lines” (cf. Hajer, 1995) on pesticides-related groundwater contamination and pesticide residues in food have been a recurrent phenomenon. In this context, it comes as little surprise that lay-people in our interviews would, generally, express some or strong concerns with the environmental and health implications of present levels of pesticides usage in society. However, given the concomitant “politicization” of pesticides-related risk sciences, it is likewise unsurprising that doubts and concerns would be expressed as to the trustworthiness and competencies of pesticides experts in general. Regardless of differences on other issues, all lay respondents made comments on the limitations, uncertainties and/or ignorance associated with experts’ knowledge of risks from pesticides.
Lay-people tend to see knowledge as something that will never be one hundred percent certain, even if we should strive for as much knowledge as possible:

I really think one ought to put a lot of research effort into this [pesticides risk], so that we are better prepared. I mean, one really needs to take precautions to know as much as possible. But obviously we cannot know everything. (Jens)

Knowledge and expertise are seen by most respondents as principally limited, notably by the fact that you cannot predict the long-term consequences of current activities. Individual experts, on the other hand, are seen as capable of possessing only limited knowledge or knowledge of a limited aspect of the wider issue. In short, in terms of the knowledge dimension, lay-people’s imagined experts clearly fail to meet the “PUS expert” level of all-encompassing competence. Even when displaying high levels of trust, lay respondents do not expect total or unrestricted knowledge from experts.

[Figure 1. Matrix of analytically derived lay and expert positions: a two-by-two space defined by knowledge (+/–) and values (+/–) axes, locating the “PUS expert,” “Dialogue expert,” “Dialogue lay” and “Deficit lay” positions.]

This, however, in no way entails a general dismissal of the need for experts amongst lay respondents. On the contrary, lay-people in our interviews clearly see themselves as in important ways dependent upon experts, widely seen as the kind of people who perform the task of keeping citizens safe from undue risks of pesticides. When talking about the category of “expert,” lay-people generally tend to think of government or industry employed regulators, who perform control functions for the state and, by implication, for lay citizens. In many cases, lay respondents make no clear distinction between experts, regulators and politicians, viewing them all as part of a “system” of regulation.
Notably, very few respondents seem to think of “experts” only as the stereotypical, university-employed specialist scientist. Rather, lay-people seem to have grown accustomed to the reality of “regulatory science,” with much less distinct boundaries between science, industry and government regulation. As a corollary of this conceptualization of “experts,” lay-people clearly view the institutional affiliation of different experts as relevant to the credibility of their knowledge claims. Typically, respondents make a distinction between privately and publicly employed experts, who differ in the extent to which lay-people ascribe material interests, and thus problematic value biases, to them. The most obviously “interested” expert is the economically dependent industry expert, whom most lay-people seem to distrust somewhat. However, several respondents tend to see environmental non-governmental organizations (NGOs) as too one-sided and value-committed as well. When dealing with such interest-biased experts, lay-people mostly become skeptical and put more emphasis on their own judgments:

Especially when we are dealing with things like interest organizations and such, then I become skeptical, really. Then you need to use your common sense and ask: is this necessary or what. (Ulrik)

Following the same line of reasoning on interests, most respondents see publicly employed experts as more trustworthy than others, since they are more responsive to public needs and wishes, even if they are not infallible. These three important characteristics of lay-people’s imagined experts—their principally limited knowledge claims, their involvement in government regulation, and the untrustworthiness of “interested” experts—emerge with little variation from almost all the interviews. In other words, they seem to point to a widely held or “dominant” model of imagined experts amongst the Danish public at large.
When analyzing in more detail, however, marked differences between lay-people become apparent. In the following sections, we will distinguish between two dominant strands in how our lay respondents construe the lay–expert relationship, which we will refer to as the “pro-scientific” and “fatalist” lay positions, respectively (Horst, 2004).15 Notably, this pattern reflects differences in education or “cultural capital” (cf. Bourdieu, 1984), with the more “pro-scientific” lay respondents generally possessing stronger educational qualifications than the “fatalistic” group does. We turn now to a description of these two “lay cultures of expertise.”

“Pro-science” and “fatalist” views: trustworthy experts?

So far, our picture of lay-people’s imagined experts seems broadly consistent with previous sociological discussions on lay criteria for judgments of science and expertise (e.g. Lidskog, 1996; Wynne, 1996b). In this section, we add further layers to this picture, by distinguishing between “pro-scientific” and “fatalist” imagined experts. Put schematically, we argue that more “pro-scientific” lay-people generally express greater trust in experts, are more optimistic towards cognitive progress in science, and construe more “active” forms of lay identities in relation to expert systems. Conversely, more “fatalist” lay-people show limited trust in experts, tend to see expert disagreements as cognitively “illegitimate,” and express a sense of negative dependency on expert systems. We will go through these points in turn, highlighting some of the main differences. As has been frequently documented in the literature, lay-people’s trust in experts is a main factor in explaining their risk perceptions and environmental attitudes (Coppin et al., 2002; Siegrist et al., 2000). In general terms, our interviews confirm this assertion, although in complex ways.
At one end of the spectrum, we encounter an almost unconditional trust in the regulatory science system:

Yes, in my view we have a good and well-functioning regulatory apparatus in this country, which actually works, contrary to many other places I think. I am fairly confident about this system … We have politicians to deal with this stuff. (Claus)

While this level of trust is unusual, respondents in the “pro-scientific” group would generally express at least some conditional forms of trust, particularly in governmentally employed experts. This would be true even of respondents who combine this sense of conditional trust with a perception of serious risks from pesticide usage. In contrast to this basic position, amongst the “fatalistic” lay respondents, a clear concern about pesticide risks combines with very limited forms of trust in experts as such:

I am probably a skeptical person generally. I don’t know. It’s probably also that I’m 64 years old now, so I have experienced a bit of everything. You gain some experience, so I’m difficult to fool into believing this or that. I want to see it before I believe it. (Lise)

In most instances, mistrust in expertise is set against common experiences, shared by all lay-people, of experts disagreeing or saying different things in public over time in connection with environmental risks.16 One “fatalist” lay response to this experience is to stress a critical identity, as in the quote above: one should not accept all one hears from experts, since more often than not, it will change later on. Others in the “fatalistic” group even question the possibility of ever trusting experts whom you do not know personally.17 Most lay respondents, including both “pro-scientific” and “fatalist,” see expert disagreements as a source of personal uncertainty: you do not know what to believe and you start doubting things. How lay-people make sense of and deal with expert disagreements, however, varies widely.
Amongst the more “fatalist” lay-people, expert disagreements are seen as a reason for distrusting experts in general, since disagreeing experts cannot all be right, and since as a social group, experts change opinions over time. Characteristically, these viewpoints tend to imagine experts as a homogeneous and collective social category. In contrast, more “pro-scientific” lay respondents tend to construe expert disagreements as more cognitively “sensible” outcomes. This lay group relates disagreements to the imagined plurality and heterogeneity of experts: experts work from different assumptions and perspectives, or they research different specialized aspects of a larger problem. Changing expert claims may thus even be construed as signs of “progress”:

But they [experts] have not told any lies. So if in a month from here, someone comes along and says that this [low-fat food product] causes cancer, well, then it still only contains small amounts of fat, but you also get cancer from it. In that sense, they are not lying. But you just gain new knowledge all the time. (Stine)

As this quote shows, perceiving sensible reasons for conflicting expert claims can make such disagreements less “destabilizing.” Such interpretations are quite common amongst the “pro-scientific” group, compared to the “fatalist” group. Furthermore, lay judgments on experts’ trustworthiness entail an element of reflexive self-identity: how you “perform” trust in experts in part relates to how you imagine your own role (Michael, 1996).
In most of our interviews, respondents would express some form of reservation as to their own understanding of the scientific aspects of pesticide risks, and in this way position themselves in relation to scientific expertise.18 Broadly speaking, “fatalistic” lay-people seem to combine a sense of dependency on experts with a privately held, experience-based and “intuitive” sense of mistrust towards experts’ cognitive authority.19 In contrast, the “pro-scientific” lay group seems to employ their understanding of “scientific” risk reasoning—including its limitations—in imagining more active forms of lay identities in relation to experts. Several respondents talk, for instance, about lay responsibilities in connection with political consumption. In this sense, “pro-scientific” lay-people may be said to imagine lay–expert relations in cooperative terms: experts are conditionally trusted to do the primary risk reduction work, while as lay-people, you contribute by taking responsibility for your personal choices.

Summing up: plurality of imagined experts

To summarize this discussion on lay-people’s imagined experts, one may point to three important features of the “dominant” model. Firstly, lay respondents imagine experts’ knowledge as being principally limited and fallible, thus falling short of all-encompassing competence. Secondly, the very category of “expert” is imagined as an integral part of a “regulatory science” system, implying some form of social commitment and thus falling short of complete value-neutrality. Finally, the institutional affiliations and social interests of experts are seen as crucial, with lay-people trusting government rather than NGO or industry experts owing to the material or value interests of the latter two. Turning then to the two outlined lay cultures of expertise, one should highlight the crucial role of trust.
While the “pro-scientific” lay position retains some conditional trust in the scientific knowledge of experts, “fatalistic” lay-people are less convinced of its usefulness. Still, both groups acknowledge some “ignorance” on the part of lay-people in relation to scientific risk reasoning, making for some form of cognitive dependency on experts. Within the “pro-scientific” group, however, lay actors are imagined as more active and capable of handling risk issues than is the case amongst the “fatalistic” group.

5. Experts’ imagined lay-people and publics

In our study, experts were selected for interviewing on the basis of two primary criteria: firstly, their specialization in health or environmental risks from pesticides, and secondly, their institutional affiliations, with representatives of academia, governmental research, governmental regulation, the agrochemical industry, agriculture, water supply and environmental NGOs.20 Accordingly, these experts represent a broad variety of perspectives on, and social commitments to, the issue of pesticides-related risks. As we will set out in this section, they also embody a variety of views on lay–expert relations, embedded as they all are in forms of risk communication. None of the experts of our study would defend the view that, as an essentially “technical” matter, what lay-people think about pesticides risk is irrelevant. More often than not, however, lay-people’s lack of insight would emerge as a barrier to rational decision-making. One important way of approaching these issues is to note how experts talk about “lay” or “common” people in relation to various different “imagined” public constituencies (Michael and Brown, 2000).21 When referring to imagined lay-people, experts usually do not mean to address the entire population in an unspecified way. Rather, they refer to imagined sub-publics, which seem to be largely functional constructs of the institutional setting of the expert.
Thus, to the environmental protection agency expert, lay-people are largely understood as either farmers, subjects of occupational risks, or consumers, subjects of food-related risks:

But lay-people are also very different here, I suppose, so it depends on who you consider to be lay-people. If you also include the users … the farmers themselves, then their attitudes will obviously be very, very different from asking the one who is going to eat the grain, right. I’m absolutely sure. (Anita)

The interesting thing here is not so much the assertion of value differences between farmers and consumers. Rather, the point is that, from this institutional expert position, farmers and consumers constitute the only imaginable lay publics, and these are in turn imagined as internally homogeneous groups. Quite similar points can be made about all the experts, relating their definitions of lay-people to their institutional setting. As will be explored below, this becomes highly relevant to the kind of “ignorance” or “competence” ascribed to lay-people by experts. Unsurprisingly, we find across most of the expert interviews certain references to lay-people’s lack of relevant pesticides-related knowledge. For instance, the environmental protection agency expert comments on all the things “regular consumers” do not know about production processes, and others refer to “people” not knowing much about health risks from chemicals in general. Other experts, particularly the researchers and the industry representative, make more general statements on the lack of cognitive competencies amongst all or most lay-people. For instance, the academic researcher comments that “most people” do not form perceptions on the basis of knowledge but on “attitudes.” The industry expert likewise believes that lay-people generally behave “irrationally” in risk affairs.
Clearly, the “deficit” view of cognitive non-competence emerges as a fairly dominant model of experts’ imagined lay-people. As a corollary of this “deficit” model, several expert statements point to the view that lay-people, while insufficiently knowledgeable, are excessively emotional and value-laden in risk issues. Being too emotional implies being vulnerable to false impressions, in particular as portrayed in the mass media:

Well, I would say we have a continuum from the most ignorant part, or ignorant is not a nice word, but let’s call it the most impressionable … group of lay-people, and then up to people in the expert field who can potentially relate to it [pesticides] in a critical way. (Flemming)

Experts commonly imagine lay-people as easily frightened in risk matters. As one governmental researcher states, the easiest thing is to “create fear” amongst the lay public. Other attributes frequently associated with imagined lay-people are words such as “scared,” “panic,” “easily concerned” and “confused.” Implicitly, such images of unstable lay-people form a contrast to the clear-headed, objective expert. Importantly, however, this model is not all-dominating in expert discourses. In almost all interviews, certain traces can be found of more competently imagined lay-people, contradicting somewhat the “dominant” view. In most cases, this involves ascribing more cognitive competencies to specific sub-groups of lay-people. Importantly, more competently imagined lay-people are usually sub-groups with whom expert respondents have some actual experience in interacting. Thus, the agricultural consultant clearly ascribes more competence to farmers and rural dwellers than to urbanites. In particular, she talks about farmers in terms implying cooperation, mutual benefit and understanding, giving this particular sub-group a kind of “lay expert” status.
The same is true for the governmental researcher (Finn), when he comments on the “highly reasonable” (his wording) viewpoints of farmers and local municipality officials with whom he interacts regularly. In short, while the “deficit” view dominates, other imagined lay-people are also present in experts’ discourses, reflecting a distinction between “purely imagined” and “actually experienced” lay-people. Likewise, some expert respondents move beyond the “deficit” view of imagined lay-people by ascribing certain “moral” or “value-based” competencies to at least sub-groups of lay-people. Thus, in the case of the water supply expert, she reflects on her own difficulties in arguing against lay-people’s reasonable objections to some specific aspects of the groundwater limit-value regulatory system.22 She thus hints at the possibility that lay-people may in specific instances be more competent in making value judgments than she herself is as an expert. Even more explicitly, the academic researcher, generally critical towards pesticide usage, identifies with the “healthy criticism” of lay-people towards the often too self-assured public statements of industry and agriculture experts:

If you just read your newspaper carefully, say a teacher who has to follow the news, right, then you will say: what is this? Something is rotten here. These are the kinds of things moving you, right, and then of course you become critical. … I think it’s quite understandable that lay-people reach this conclusion. (Flemming)

In these cases, specific lay-people are imagined as carriers of “good sense,” making use of their value orientations, in sharp contrast to the “deficit” picture of excessive emotionality.
Summing up from the discussion so far, expert respondents’ imagined lay-people can be said to possess certain “dominant” characteristics: they are functional constructs of experts’ institutional position, they are seen as cognitively non-competent, and they emerge as emotional, vulnerable to media manipulation and irrational scares. However, this “dominant” model is modified by the same experts when they make a distinction between “purely imagined” and “actually experienced” sub-groups of lay-people. The latter group is ascribed both more cognitive and more “value-based” competence for risk judgments. On these points, variations between experts’ views are matters of degree rather than kind. However, when including the issue of experts’ imagined experts, more marked differences between positions become apparent. Analogous to the lay side, we will distinguish two such overall patterns of experts’ imagined lay–expert relations, referred to as the “bureaucrat” and “partisan” expert positions, respectively.23 These two “expert cultures of lay–expert relations” are described next.

“Bureaucrat” and “partisan” experts: the place of values

Importantly, the way we distinguish between experts corresponds to a distinction in their institutional affiliations. “Bureaucrat” experts are seen here as occupying positions within, or closely affiliated to, the pesticides regulatory system, which in our study covers the majority of interviewed experts.24 In contrast, “partisan” experts occupy the more “marginal” positions of the NGO and academic contexts. In overall terms, our study confirms the assertion that institutional affiliation is an important predictor of experts’ risk perceptions (Slovic et al., 1995; Slovic, 1999). Expert respondents associated here with the “bureaucrat” position, with some connection to the regulatory system, all express variations of the overall view that, in the present Danish context, risks from pesticides are under strict, reliable control.
Most poignantly, the environmental protection agency expert refers to the present use of pesticides as “acceptable” and “safe.” Contrary to this overall view, “partisan” experts express much more pesticide-critical perceptions, relating mostly to biodiversity loss and nature conservation in general. In this section, we argue that this overall pattern of institutional affiliation and risk perceptions is connected to differences in experts’ imagined lay–expert relations, and in particular to different self-conceptions as experts. Put briefly, whereas “bureaucrat” experts are somewhat reluctant to talk about experts as carriers of values, “partisan” experts are more inclined to explicitly reflect on values embedded in expert knowledge (cf. Sundqvist, 1992: 215ff.). Furthermore, “bureaucrat” experts in particular maintain a sharp boundary between “research” as what they do and “politics” as left to others. For instance, as one governmental expert advisor expresses it, you are not supposed to engage in “political” issues as an expert:

But regarding drinking water, then it is controlled by the EU, because we have a directive on drinking water. And for instance with the pesticides, or the pesticides limit values, we [expert colleagues] do not discuss them, because they are decided politically. (Niels)

Needless to say, such maintenance of the research–politics boundary amongst “bureaucrat” experts is reinforced by strong institutional commitments.25 As a corollary, “bureaucrat” experts imagine experts as at least ideally, if not always practically, value-free.
In contrast, the more “partisan” expert respondents explicitly challenge this sharp boundary drawing, pointing instead to a more diffuse intertwining of knowledge and values. As trained biologists, both self-consciously comment on the importance of their disciplinary socialization and make telling references to the shaping influence of Rachel Carson’s Silent Spring back in the early 1960s. As the academic researcher relates, regulating pesticides is based to some extent on science, but at a certain point it becomes a political question of risk acceptance, where you must rely on “sophisticated viewpoints.” Even more fundamentally, the NGO expert talks about values as deeply embedded in experts’ disciplines and languages:

So, toxicologists they write, this is really something I notice as a biologist, they write that tumors are benign. Then I say, you cannot write that. I mean, you can write that they are not malignant, but no one benefits from having a tumor. (Bente)

From her perspective, biologists are more sensitive to complexities, uncertainties and a holistic way of thinking than are toxicologists. In other words, to the partisan experts, imagined experts may differ fundamentally, not just in cognitive terms, but in the entire intertwined knowledge–value perspective. Overall, while all our expert respondents acknowledge the existence of differences in risk perceptions in-between experts themselves, the consequences drawn differ between the groups. To the extent that more “bureaucratic” experts talk about such differences, they are mostly ascribed to purely “cognitive” differences stemming from differences in specialization. In some cases, differences may be seen in more “political” terms, as when several “bureaucratic” experts distance themselves from more “environmentally committed” colleagues. Characteristically, however, these various “bureaucrat” positions do not really lead to any criticism of current lay–expert relationships in regard to pesticide regulation.
In contrast, imagining experts as intricately value-committed leads both “partisan” experts to express certain criticisms of this relationship. Thus, to the NGO expert, the closed nature of the regulatory system, with no public scrutiny, is a serious concern. Likewise, according to the academic researcher, publicly signaling the institutional affiliation of experts would help lay-people in navigating experts’ value-commitments.

Summing up: deficit and non-deficit lay-people

Summarizing this discussion on experts’ imagined lay-people, one can talk about a “dominant” model that is supplemented by two different, less common, positions. These altogether three versions of experts’ imagined lay-people are meant to capture some of the discussed ambiguities in expert discourses. First of all, there is a distinction between “purely imagined” and “actually experienced” lay-people that re-emerges in almost all expert interviews. While the “purely imagined” model, closely resembling the “deficit” position in PUS, may be called “dominant,” most experts at the same time ascribe more cognitive and “value-based” competence to lay-people with whom they have some experience in interacting. Furthermore, what may be called a “good sense lay” position emerges in the, admittedly rare, instances in which experts see lay-people as carriers of well-founded criticism of what is then seen as an “illogical” regulatory system. Turning next to the two outlined expert cultures of lay–expert relations, the main point we want to convey is the overall difference between the imagined experts of “bureaucrat” and “partisan” experts when it comes to the value dimension. Whereas more “bureaucratic” experts strive for value-freedom, and hence see expert “political” commitments as signs of illegitimate boundary crossing, more “partisan” experts see knowledge and values as intimately linked.
Consequently, being a “sophisticated” imagined expert to “partisans” means to strike a reasonable knowledge–value relation.

6. Bridging the lay–expert discrepancy?

Having outlined the main characteristics of lay and expert respondents’ mental models of “experts” and “lay-people,” respectively, we now return to the issue of a lay–expert discrepancy. In this context, the term signifies at least two different, but interrelated, perspectives. Most intuitively, it refers to differences in risk perception between lay-people and experts. As has been briefly noted throughout, our study points in the familiar direction: overall, most lay-people perceive greater levels of risk from pesticides than do most experts. Secondly, and more importantly in this context, the lay–expert discrepancy refers to the level of social identities, or in other words how lay-people and experts are imagined differently amongst members of each group. We focus here on this second perspective, but the analytical assumption is that the two senses of lay–expert discrepancy are interconnected. Thus, for instance, imagining experts as more or less trustworthy clearly matters to how lay-people perceive risks from pesticides. In this section, we compare lay and expert versions of imagined lay-people and imagined experts directly, in order to gauge the extent of a lay–expert discrepancy at the level of social identities.

Lay ignorance, values and risk judgments

Starting with imagined lay-people, one can point to an important, if somewhat superficial, degree of convergence between lay and expert respondents on the issue of lay “ignorance” of relevant scientific risk knowledge. Generally speaking, lay respondents express an awareness of the fundamental inequalities in cognitive abilities separating them from experts, and thus acknowledge their need for, and dependency on, experts. However, lay-people and experts seem to differ widely on the implications to be drawn from this situation.
First of all, most experts seem to assume that, lacking pesticide risk “knowledge,” lay-people are unable to form independent, reasonable opinions on the subject. Instead, experts mainly imagine lay-people as highly influenced or even “manipulated” by the mass media. To the lay respondents, however, lacking scientific knowledge does not entail an inability to form reasoned opinions. Scientific aspects are clearly only part of the picture in lay-people’s risk perceptions, while to experts, possessing science-based evidence is crucial. Indeed, amongst the more “fatalistic” lay-people, intuitive forms of risk reasoning, based on experience and the senses, are used to question the basis for expert authority. Overall, possession of scientific knowledge emerges as much less central to “real” lay-people, when compared to the all-importance ascribed to lay “ignorance” in experts’ imagined lay-people (cf. Lidskog, 1996: 41f.). Similarly, “real” lay-people in our study do not resemble to any great extent the excessively “emotional,” scared or anxious actors emerging from experts’ reliance on the “deficit” model of lay-people. When lay-people worry about or take seriously the risks from pesticide usage, the kinds of rationales they provide are not, strictly speaking, emotional. Rather, they mix together, as a matter of course, certain scientific issues with broader social, ecological and moral commitments. Lay-people possess a much broader understanding of “risks” than do experts (e.g. Slovic, 1999).26 But value judgments need not be purely “irrational” or “emotional.” In the “mixed” lay–expert focus group interview, this point is raised reflexively by one of the lay-people:

I think it is obvious here that when you are a scientist, you cannot jump to hasty conclusions. Then you need to research things and make experiments and test if things are really what they seem. But as lay-people, we are thankfully free to both think and believe and feel what we do, right. … So, as a lay person, I am allowed to think: well, and in 30 years, this [pesticides] will have an impact on me as a human being. (Lotte)

What is noteworthy is that a strong self-identity as “lay” is given a positive interpretation, in terms of overcoming some perceived limitations in experts’ risk reasoning. Importantly, beyond the “deficit” model, “real” lay respondents seem to resemble much more some of the less dominant models of experts’ imagined lay-people, such as the “actually experienced lay” and “good sense lay.”27 However, experts seem unable or unwilling to ascribe such enhanced competencies to “general lay-people.” One particularly striking illustration of this emerges in the same focus group interview, where “imagined” and “real” lay-people can be said to meet. Here, Anita as expert and Lotte as lay actor negotiate the identity of the “general lay-people”:

Anita: Well, there may be some few interested, like you maybe, right, who have an opinion towards wanting to buy organic. But I do not think, that the main part of the general population, which is primarily interested in what they put in their mouth. I do not think they will read it, if I put something in the newspaper from an expert perspective.

Lotte: But in some way we are here today as members of the general population?

Anita: Yes but, but I think … To me at least it sounds as if you are all more interested than perhaps quite a few others, right [laugh].

Lotte: The people I know, right, my friends and my family, they are interested in this in the same way as I am. They really are.

Paul: Yes, that’s right.

The expert in this example belongs to the “bureaucratic” group, and as previously argued, experts’ imagined lay-people are largely functional constructs of their institutional setting.
As such, this and other lay–expert negotiations point to the limitations inherent in these institutionalized ways of dealing with lay actors (cf. Wynne, 2005). Hence, they seem to challenge experts to revise somewhat their models of imagined lay-people.

Expert roles, institutions and disagreements

Moving on to the level of imagined experts, we may once again start by noting a superficial similarity between lay and expert respondents, in that both groups tend to acknowledge some degree of “cognitive” limitations and uncertainties in experts’ knowledge. However, the degree of uncertainties and the consequences drawn from this situation seem to differ widely between the groups. Overall, as might be expected from studies on experts’ sense of control over risks (e.g. Sjöberg, 1998), lay-people tend to imagine expert knowledge as more uncertain, limited and partial than do experts themselves. Several lay-people point to the in-built limitations of expert knowledge, such as the inability to predict long-term consequences, while “bureaucrat” experts in particular are less prone to comment on this. Most significantly, however, most lay-people and most experts differ in regard to the implications of experts’ “cognitive” uncertainties. Put simply, lay-people tend to imagine limitations in experts’ knowledge as part of a wider “risky” social context, where choices concerning pesticides must be made under conditions of uncertainty. In contrast, while acknowledging uncertainties in principle, “bureaucrat” experts still assume expert knowledge to be sufficiently certain to generate “safe” outcomes. In interpreting these differences in how lay-people and experts model their imagined experts, our findings suggest a common underlying pattern.
Thus, while most expert respondents tend to focus on the content of expert knowledge, its validity, and the research methods employed, most lay respondents focus instead on the institutional dimensions of expert-based “regulatory science” systems. This overall pattern is consistent with Brian Wynne’s studies on lay criteria for judging scientific claims (1996b: 38). Accordingly, the justification for experts’ political influence seems somewhat different between lay-people and experts. While for experts this justification is clearly related to their possession of “objective” risk assessment competencies, to lay-people it is much more strongly related to the task of maintaining acceptable risk levels on behalf of citizens. Again, the institutional dimension of expertise is here highlighted. Indeed, as has been argued by Wynne (1991: 120), what is often taken to be public misunderstanding of science, in the sense of cognitive content, may in fact mask a public understanding of science, in its institutional dimensions. Concomitantly, experts, in particular those in “bureaucratic” positions, may themselves show limited ability, or opportunity, to reflect on their own institutional commitments.

Imagining sensible lay–expert relations?

In order to summarize these discussions on imagined lay-people and experts, it proves important to keep in mind some more general similarities and differences in how both groups view the lay–expert relation. Thus, as previously mentioned, most experts clearly experience a need for some form of “public understanding,” connected to their need for social legitimacy and acceptance of risk decisions. Likewise, and importantly, most lay respondents actually do express a need for trustworthy, knowledgeable experts, even if “fatalistic” lay-people experience this more as a negative dependency.28 As such, our study does not confirm ideas of a generalized mistrust in experts amongst lay-people (cf.
Lidskog, 1996).29 Accordingly, neither experts nor lay-people in our study generally express any strong visions for changing the decision-making system concerning pesticides in terms of its expert reliance. Amongst experts, however, the more “partisan” respondents do raise some criticisms, pointing towards the closed, “elitist” and non-transparent expert-based system, thus implicitly calling for improved lay–expert dialogue. Similarly, to the extent that lay-people voice ideas on how to improve lay–expert relations, it usually involves the idea that dialogue should be strengthened amongst the relevant stakeholders in risk affairs. In this sense, most lay respondents clearly see experts as part of political negotiations, which ought to be based on mutual dialogue. Improving such lay–expert dialogue, as some experts and lay-people suggest, hinges on a better understanding of the mutual social identities of these groups. In our analysis, we have distinguished four such groups: “pro-scientific” and “fatalist” lay-people and “bureaucrat” and “partisan” experts. We end our discussion by comparing their various imagined lay-people and experts, to illustrate some barriers and potentials for strengthening lay–expert dialogue.

Significant implications can be drawn from this juxtaposition of lay and expert models of imagined lay-people and experts. Firstly, concerning imagined lay-people, it is significant to note how experts’ “purely imagined” lay-people resemble the self-understanding of neither “fatalist” nor “pro-science” lay publics (Figure 2). Compared to the self-conception of “fatalist” lay-people, experts’ concern about the excessive emotionality of lay-people seems exaggerated, as this group emerges as much less inclined to trust expert warnings anyhow. As one respondent put it: they want to see it before believing it.
Concerning the “pro-scientific” lay group, the experts’ “deficit” model risks creating a sense of lay alienation, as these lay-people feel themselves both more cognitively capable and more genuinely interested. Generally speaking, the mixed lay–expert focus group proved a vivid illustration of this point. Judging from the quality of these discussions, face-to-face interaction between lay-people and experts is both productive and desirable, while also likely to highlight the incommensurable experiences and knowledge of the two groups.30 To the experts, operating on the models of “actually experienced,” and more competent, lay-people seems here a more sensible starting point for dialogue. Turning to the character of imagined experts, we might start by noting how the “bureaucrat” expert ideal seems somewhat at odds with lay-people’s mental models (Figure 3). Compared even to the most trustworthy of lay respondents’ imagined experts, this ideal self-conception of “bureaucrat” experts ends up overestimating the cognitive authority and underestimating the value commitments of experts. As such, operating on this expert model might risk creating a sense of “expert arrogance” amongst lay-people, leading to feelings of mistrust. This would presumably be the case, for instance, if regulatory experts fail to acknowledge long-term uncertainties in their knowledge claims. In this sense, the “partisan” experts’ self-conception in many ways looks more compatible with lay-people’s sense of the social and value commitments embedded in expert knowledge.31 As for the “fatalist” lay model of imagined experts, little basis for improving lay–expert relations is evident. Persistent expert disagreements have led to a deep-seated mistrust. What is noticeable, however, is that this “fatalist” view is premised on seeing experts as one homogeneous social group.
In so far as one may speak here of an institutional “misunderstanding of science,” expert institutions may find it beneficial to publicly communicate “sensible” reasons for heterogeneity of expert views. Thus, as evidenced in the “pro-scientific” lay interviews, assuming experts to work on different assumptions and perspectives may lead lay-people to more overall trust in expert knowledge.

[Figure 2. Matrix of imagined lay-people, lay and expert models. Axes: knowledge and values; positions: “purely imagined lay,” “actually experienced lay,” “fatalist lay” and “pro-science lay.”]

7. Conclusions

In this paper, we have looked empirically and in some detail at the concepts of “expert” and “lay-people,” as they emerge in lay and expert discourses on a particular health and environmental risk issue: agricultural and home-use pesticides. Starting from the often-noted lay–expert discrepancy on environmentally related risks, we explore how and in what ways this discrepancy is predicated on different conceptions of these social categories. “Expert” and “lay-people” are essentially contested concepts, evolving through negotiations in specific social contexts. Drawing on concepts and insights from the “sociology of scientific knowledge” (SSK), we show how this level of social identities may well provide a key to understanding the lay–expert discrepancy in risk perception and communication (Wynne, 1996b). To do this, we develop an analytical matrix, functioning as a heuristic tool for analyzing actors’ mental models or “imaginings” of lay-people and experts. The matrix distinguishes between “knowledge” and “values” as two fundamental attributes of imagined lay-people and experts, viewing both as continua. Notably, the “classical” model of lay–expert relations sees experts as possessors of all knowledge relevant to the issue, while at the same time “unbiased” by values.
Correspondingly, lay-people are seen as cognitively incompetent and excessively emotional or “value-biased.” We refer to this model as the “PUS expert”–“deficit lay” relationship. Overall, our analyses largely serve to question this clear-cut PUS lay–expert relationship model. At one level, this is certainly no new endeavor. From within the SSK community, authors such as Wynne, Irwin and Michael have persistently pointed to shortcomings and questionable normative commitments embedded in this model (Irwin and Wynne, 1996). What is novel in our analysis, however, is the attempt to approach these questions empirically, and to compare lay and expert discourses directly, in a methodologically “equal” approach using qualitative interviews.

[Figure 3. Matrix of imagined experts, lay and expert models. Axes: knowledge and values; positions include the “bureaucrat: ideal,” “partisan: sophisticated,” “pro-science (government) expert” and “fatalist expert” models.]

Put briefly, our findings point to certain “dominant” patterns of imagined lay-people and experts, which broadly confirm assertions drawn from previous SSK studies. Thus, at the level of experts’ imagined lay-people, what we term the “deficit model,” following Brian Wynne (e.g. Wynne, 1992, 1996b), is clearly present in experts’ discourses. Lay-people are assumed to know little about chemicals in general, to act “irrationally” in risk matters, and to lack overall cognitive competencies. Likewise, they are assumed to be excessively emotional and susceptible to media “scares.” At the level of lay-people’s imagined experts, our findings confirm the assertion that the institutional affiliations of experts matter greatly to their trustworthiness amongst citizens (e.g. Wynne, 1996b). Generally, governmental experts are seen as more trustworthy than both NGO and industry experts.
Likewise, as stressed for instance by Beck (1992) and Lidskog (1996), experiences of persistent expert disagreements make public trust at best variable and conditional. Importantly, however, our findings also point to various inconsistencies and internal variations in expert views, challenging the sometimes “monolithic” perspective on expert knowledge in Wynne’s account. Thus, while the “deficit” view is somewhat dominant in expert discourses, various alternative models also emerged. In general, experts’ experiences of actually interacting with various institutionally relevant sub-publics usually lead to more positive views on lay competencies. In bounded cases, experts even imagine specific lay actors as competent to pass sensible judgments on “irrational” aspects of the expert-based regulatory system. Additionally, experts are not a homogeneous group. While “bureaucrat” experts employed within the regulatory system draw a sharp boundary between research and politics, “partisan” experts within academia and NGOs are more inclined to acknowledge the values embedded in scientific risk knowledge. This presumably makes “partisan” experts more congruent with lay views on experts, who are generally seen as necessarily part of a value-laden regulatory system. Similarly, on the lay side, two main groups emerge, distinguished mainly by their level of trust in experts. While “pro-scientific” lay-people generally express conditional trust in, and a positive need for, expert knowledge, more “fatalist” lay-people express limited levels of trust, combined with a negative sense of dependency on experts. Within the more “pro-scientific” group, imagining experts as heterogeneous, working from different assumptions and perspectives, makes their views more congruent with experts’ self-conceptions. Overall, our conclusions are less pessimistic than those of Brian Wynne and others, whose analytic approach tends to depict experts as insufficiently reflexive and hence “sociologically deficient” (e.g.
Wynne, 2005; Michael and Brown, 2000).32 Contrary to this, we take our study to suggest that most lay-people and most experts in Denmark, and most likely in other late-modern countries as well, possess the necessary resources, in the form of reflexive identities, to engage in mutually beneficial dialogues on risk issues. While more research is needed to ascertain the generalizability of our findings, we do claim to have described widely dispersed patterns of lay–expert relations at least in the Danish context. Certain of these patterns, including the differences between “pro-scientific” and “fatalist” lay-people and “bureaucrat” and “partisan” experts, are likely to resemble those of other late-modern, “Western” risk societies. However, we would hesitate to make too strong claims of cross-national validity, seeing that patterns of lay–expert relations, or “civic epistemology,” need to be studied in context (cf. Jasanoff, 2005). Finally, claiming the existence of reflexive identities is not to claim their institutionalization; whether or not expert-based regulatory institutions will use these resources, by engaging in public dialogues, remains an altogether open question.

Notes

1 The background for this discussion is an ongoing research project on lay and expert perceptions of risks from pesticide usage, conducted in Denmark on behalf of the environmental protection agency.
2 Focus group interviews were conducted with two different compositions of lay-people and experts. One focus group consisted of six lay-people, the other of three lay-people and three experts. Later on in this paper, we will refer in particular to the latter “mixed” focus group interview.
3 The concept of “imagined lay-people” is borrowed from Maranta et al. (2003), who talk about “imagined lay persons” in connection to public displays of science.
They define “imagined lay persons” as “conceptions of lay persons as they are manifested in the products and actions of … experts” (p. 151). See Section 3 of this paper for further clarification of these concepts.
4 As argued by Tutton et al., the choice of methods, such as using focus groups for lay-people and individual interviews for experts, may itself reinforce an implicit hierarchy of views (2005: 103ff.).
5 As has been pointed out by Hansen (2005: 195), debates on the “public understanding of science” have featured most forcefully in the British context, compared with the German and the Danish contexts. However, aspects of these debates have been taken up beyond the UK, including in Denmark.
6 As Elam and Bertilsson point out (2002: 13ff.), over time PUS has gradually modified itself, moving closer to a position of “experimenting” with new expert–lay dialogue forums. As we employ the concepts, they refer to the “classical” PUS tradition.
7 Within the arena of risk research, for instance, the so-called psychometric paradigm has long explained the lay–expert discrepancy in risk perception by referring to “biases” in lay-people’s risk reasoning (e.g. Slovic, 1987). It should be noted that, in the 1990s, the psychometric paradigm increasingly expanded into studies of biases in experts’ risk perception as well (e.g. Slovic, 1999).
8 Irwin and Michael focus their analysis of this new science-policy agenda mainly on the UK and the European Union. However, the tendencies they describe would seem to be visible in most European contexts.
9 As evidenced for instance by the participatory approach to technology assessment in the Danish Board of Technology (Hansen, 2005: 155ff.).
10 Similarly, a recent Danish governmental think tank on public understanding of science emphasized the need for two-way “dialogue on science” (Horst, 2004).
11 Thus, in the SSK tradition, new identity positions such as the “lay expert,” i.e.
citizens gaining recognition in expert-dominated arenas, have been identified and analyzed (e.g. Epstein, 1995; Callon, 1999).
12. As Pellizzoni (1999: 103) points out, most recent sociological analyses of science-based environmental management, including Robert Dahl’s, Funtowicz and Ravetz’s and “reflexive modernization” theory, point to interconnections between knowledge claims and value commitments in this field. By distinguishing analytically between knowledge and values, we do not want to claim their “real-world” separation. On the contrary, the deployed analytical matrix opens up the space of possible interconnections between the two.
13. We return to the implications of national context in the conclusion. For now, one small linguistic note: throughout the analysis, we use the terms “lay respondents” and “lay-people” (and “expert respondents” and “experts”) more or less interchangeably. In other words, “lay-people” is shorthand for “lay-people in our study.” This is mainly a matter of presentational style, and should not be seen as a claim to fully represent these “real-world” categories. Still, while this precaution applies, we do of course claim to represent broad patterns of thinking actually found within these groups.
14. One significant event in this regard was a much-publicized “moral panic” in 1994, caused by the suspicion of a possible connection between local pesticide-related groundwater contamination and cases of human spinal disorder in the village of Ejstrupholm.
15. Although these positions are to be understood partly as “ideal types,” they still refer to and emerge from patterns in specific lay-people interviews. As such, we consider five of our respondents broadly exponents of the “pro-science” position, while three respondents generally embody the “fatalist” position.
16. As pointed out by several analysts, the role of experts’ disagreement in the creation of risk anxieties is a major part of the “reflexive modernization” theories of Giddens and of Beck (Wynne, 1996a; Yearley, 2000). Our analysis below to some extent confirms their assertions at the general level, but it also points to inaccuracies and exaggerations in their diagnosis. Space, however, precludes any detailed discussion of these issues here.
17. This is true of a now-retired farmer, who instead expressed trust towards a former personal agricultural consultant to the family business. This is clearly a case of a “local” expert, whom one knows on a personal level, as distinct from mass-mediated experts (cf. Irwin et al., 1999).
18. Mike Michael (1996: 113) dubs this phenomenon “discourses of ignorance.”
19. “Fatalistic” lay-people rely to a large extent on what might be termed “intuitive toxicology” as a form of risk reasoning. When commenting on risks, they often refer to concrete, sensory-based experiences: pesticides smell bad, the water turns brown, local fish are dying, or tomatoes stay fresh for excessively long periods, implying some chemical manipulation. Such intuitions often serve to question expert claims to knowledge. For instance, if you cannot taste the difference between conventional and organic carrots, then why pay extra for the organic?
20. One analytical reference point for this methodology is the concept of regulatory science; see Blok et al. (2006) for an elaboration of this perspective on pesticide expertise.
21. Turner (2001, 2003) argues that one may distinguish different types of experts by looking at their various constitutive publics. Although we do not follow his proposed distinctions and limit ourselves to scientifically accredited experts, our empirical argument here is quite similar.
22. Danish water supply policies entail a commitment to maintaining a very strict level of non-pollution from chemicals such as pesticides in groundwater reserves, which are used as drinking water with only minimal prior purification.
23. We borrow these terms from Cultural Theory, in particular as this paradigm has been developed by the Swedish sociologist Göran Sundqvist. In his theory of expertise, the bureaucrat expert corresponds to the high-hierarchy, high-integration position in the grid-group framework; the partisan expert corresponds to the low-hierarchy, high-integration position (Sundqvist, 1992: 155). In our context, we give the concepts a slightly different meaning, but we nonetheless want to retain these associations.
24. Specifically, we take this to include the following: the environmental protection agency, the agrochemical industry, the governmental researchers, the agricultural consultant, and the water supply agency.
25. As seen, for instance, in the strict separation of “risk assessment” from “risk management,” a point stressed by several experts occupying regulatory science positions. From an SSK perspective, such “boundary maintenance” has been exposed as somewhat fragile and problematic (e.g. Bal and Halffman, 1998).
26. Intermingling knowledge and values contrasts sharply with experts’ self-understandings, as “bureaucratic” experts in particular routinely draw sharp boundaries between knowledge and value judgments. This may be one reason for experts’ general failure to grasp lay forms of risk reasoning.
27. This is hardly surprising, given that experts’ real experiences of interacting with specific sub-groups of lay-people form the basis for these mental models.
28. Noticeably, however, dealing with risks from pesticides seems at the same time to be a largely taken-for-granted aspect of lay-people’s everyday life, which experts are not imagined to be basically able to eradicate (cf. Lupton and Tulloch, 2002).
29. Assumptions about a generalized mistrust in experts can be said to form part of Ulrich Beck’s “risk society” thesis (1992). However, empirical research has generally failed to confirm this assumption. For recent Danish quantitative data, showing strong or some trust in environmental experts amongst the large majority of the population, see Kaae and Madsen (2003).
30. The three lay-people present during the mixed focus group interview all come closer to the “pro-scientific” ideal type than to the “fatalistic.” Presumably, however, more “fatalist” lay-people would likewise engage in sensible, if perhaps more critical, interaction with experts.
31. Still, “partisan” experts may likewise risk overestimating the cognitive authority of their public claims, as evidenced, for instance, in some lay-people’s skepticism towards environmental NGOs.
32. Michael and Brown refer to this as Wynne’s “lay political science,” pointing out how this in a sense “reverses,” rather than overcomes, the lay–expert discrepancy.

References

Bal, R. and Halffman, W., eds (1998) The Politics of Chemical Risk: Scenarios for a Regulatory Future. Dordrecht: Kluwer Academic Publishers.
Beck, U. (1992) Risk Society. London: SAGE.
Blok, A., Jensen, M. and Kaltoft, P. (2006) “Regulating Pesticide Risks in Denmark: Expert and Lay Perspectives,” Journal of Environmental Policy and Planning (forthcoming).
Bourdieu, P. (1984) Distinction: A Social Critique of the Judgement of Taste. Cambridge, MA: Harvard University Press.
Callon, M. (1999) “The Role of Lay People in the Production and Dissemination of Scientific Knowledge,” Science, Technology and Society 4(1): 81–94.
Collins, H.M. and Evans, R. (2002) “The Third Wave of Science Studies: Studies of Expertise and Experience,” Social Studies of Science 32(2): 235–96.
Coppin, D.M., Eisenhauer, B.W. and Krannich, R.S. (2002) “Is Pesticide Use Socially Acceptable? A Comparison between Urban and Rural Settings,” Social Science Quarterly 83(1): 379–94.
Elam, M. and Bertilsson, M. (2002) “Consuming, Engaging and Confronting Science: The Emerging Dimensions of Scientific Citizenship,” STAGE Discussion Paper One, URL: http://www.stage-research.net (accessed July 2005).
EPA (Miljøstyrelsen) (2004) Risikohåndtering og risikokommunikation [Risk Management and Risk Communication]. Miljøprojekt nr. 893. URL: www.mst.dk (in Danish).
Epstein, S. (1995) “The Construction of Lay Expertise: AIDS Activism and the Forging of Credibility in the Reform of Clinical Trials,” Science, Technology, & Human Values 20(4): 408–37.
Hajer, M.A. (1995) The Politics of Environmental Discourse: Ecological Modernization and the Policy Process. Oxford: Oxford University Press.
Hansen, J. (2005) “Framing the Public: Three Case Studies in Public Participation in Governance of Agricultural Biotechnology,” Doctoral thesis, European University Institute, Florence.
Hansen, J., Holm, L., Frewer, L., Robinson, P. and Sandøe, P. (2003) “Beyond the Knowledge Deficit: Recent Research into Lay and Expert Attitudes to Food Risks,” Appetite 41: 111–21.
Horst, M. (2003) “Controversy and Collectivity: Articulations of Social and Natural Order in Mass Mediated Representations of Biotechnology,” Ph.D. series no. 2003-28, Copenhagen Business School, Copenhagen.
Horst, M. (2004) “Science Communication: A Critical Discussion of the Report from the Danish Think Tank on Public Understanding of Science,” Paper presented at the Copenhagen Business School Conference on Science Communication, 3 June, Copenhagen.
Irwin, A. and Michael, M. (2003) Science, Social Theory & Public Knowledge. Maidenhead: Open University Press.
Irwin, A. and Rothstein, H. (2003) “Regulatory Science in an International Regime,” in F. den Hond, P. Groenewegen and N.M. van Straalen (eds) Pesticides: Problems, Improvements, Alternatives, pp. 77–86. Oxford: Blackwell Science.
Irwin, A. and Wynne, B.
(1996) “Introduction,” in A. Irwin and B. Wynne (eds) Misunderstanding Science? The Public Reconstruction of Science and Technology, pp. 1–18. Cambridge: Cambridge University Press.
Irwin, A., Dale, A. and Smith, D. (1996) “Science and Hell’s Kitchen: The Local Understanding of Hazard Issues,” in A. Irwin and B. Wynne (eds) Misunderstanding Science? The Public Reconstruction of Science and Technology, pp. 47–64. Cambridge: Cambridge University Press.
Irwin, A., Simmons, P. and Walker, G. (1999) “Faulty Environments and Risk Reasoning: The Local Understanding of Industrial Hazards,” Environment and Planning A 31: 1311–26.
Jasanoff, S. (2005) Designs on Nature: Science and Democracy in Europe and the United States. Princeton: Princeton University Press.
Kaae, B.C. and Madsen, L.M. (2003) “Holdninger og ønsker til Danmarks natur” [Attitudes and Wishes for Nature in Denmark]. By- og Landsplanserien 21. Copenhagen: Skov & Landskab.
Kraus, N., Malmfors, T. and Slovic, P. (1992) “Intuitive Toxicology: Expert and Lay Judgment of Chemical Risks,” Risk Analysis 12(2): 215–32.
Lidskog, R. (1996) “In Science We Trust? On the Relation between Scientific Knowledge, Risk Consciousness and Public Trust,” Acta Sociologica 39(1): 31–58.
Lupton, D. and Tulloch, J. (2002) “‘Risk is Part of Your Life’: Risk Epistemologies among a Group of Australians,” Sociology 36(2): 317–34.
Maranta, A., Guggenheim, M., Gisler, P. and Pohl, C. (2003) “The Reality of Experts and the Imagined Lay Person,” Acta Sociologica 46(2): 150–65.
Michael, M. (1996) “Ignoring Science: Discourses of Ignorance in the Public Understanding of Science,” in A. Irwin and B. Wynne (eds) Misunderstanding Science? The Public Reconstruction of Science and Technology, pp. 107–25. Cambridge: Cambridge University Press.
Michael, M. and Brown, N. (2000) “From the Representation of Publics to the Performance of ‘Lay Political Science,’” Social Epistemology 14(1): 3–19.
Pellizzoni, L.
(1999) “Reflexive Modernisation and Beyond: Knowledge and Value in the Politics of Environment and Technology,” Theory, Culture & Society 16(4): 99–125.
Siegrist, M., Cvetkovich, G. and Roth, C. (2000) “Salient Value Similarity, Social Trust, and Risk/Benefit Perception,” Risk Analysis 20(3): 353–62.
Sjöberg, L. (1998) “Risk Perception: Experts and the Public,” European Psychologist 3(1): 1–12.
Slovic, P. (1987) “Perception of Risk,” Science 236(17 April): 280–5.
Slovic, P. (1999) “Trust, Emotion, Sex, Politics, and Science: Surveying the Risk-Assessment Battlefield,” Risk Analysis 19(4): 689–701.
Slovic, P., Malmfors, T., Krewski, D., Mertz, C.K., Neil, N. and Bartlett, S. (1995) “Intuitive Toxicology II: Expert and Lay Judgments of Chemical Risks in Canada,” Risk Analysis 15(6): 661–75.
Sundqvist, G. (1992) “Vetenskapen och miljöproblemen: en expertsociologisk studie” [Science and the Environment: A Study in the Sociology of Expertise], Doctoral dissertation, Department of Sociology, University of Gothenburg, Gothenburg (in Swedish).
Turner, S. (2001) “What is the Problem with Experts?,” Social Studies of Science 31(1): 123–49.
Turner, S.P. (2003) Liberal Democracy 3.0: Civil Society in an Age of Experts. London: SAGE.
Tutton, R., Kerr, A. and Cunningham-Burley, S. (2005) “Myriad Stories: Constructing Expertise and Citizenship in Discussions of the New Genetics,” in M. Leach, I. Scoones and B. Wynne (eds) Science and Citizens, pp. 101–12. London: Zed Books.
Wynne, B. (1989) “Frameworks of Rationality in Risk Management: Towards the Testing of Naïve Sociology,” in J. Brown (ed.) Environmental Threats: Perception, Analysis and Management, pp. 33–47. London: Belhaven Press.
Wynne, B. (1991) “Knowledge in Context,” Science, Technology, & Human Values 16: 111–21.
Wynne, B. (1992) “Risk and Social Learning: Reification to Engagement,” in S. Krimsky and D.
Golding (eds) Social Theories of Risk, pp. 275–300. Westport: Praeger.
Wynne, B. (1996a) “May the Sheep Safely Graze? A Reflexive View of the Expert–Lay Knowledge Divide,” in S. Lash, B. Szerszynski and B. Wynne (eds) Risk, Environment & Modernity: Towards a New Ecology, pp. 47–83. London: SAGE.
Wynne, B. (1996b) “Misunderstood Misunderstandings: Social Identities and Public Uptake of Science,” in A. Irwin and B. Wynne (eds) Misunderstanding Science? The Public Reconstruction of Science and Technology, pp. 19–46. Cambridge: Cambridge University Press.
Wynne, B. (2005) “Risk as Globalizing ‘Democratic’ Discourse? Framing Subjects and Citizens,” in M. Leach, I. Scoones and B. Wynne (eds) Science and Citizens, pp. 66–82. London: Zed Books.
Yearley, S. (2000) “Making Systematic Sense of Public Discontents with Expert Knowledge: Two Analytical Approaches and a Case Study,” Public Understanding of Science 9: 105–22.

Authors

Anders Blok is research assistant in Environmental Sociology at the Department of Policy Analysis, Danish Environmental Research Institute. His academic background is in sociology, focusing on political and environmental sociology as well as the sociology of scientific knowledge. His previous work has mainly been in the sociology of risk perception, environmental problems and expertise. He is a member of the steering committee of the Danish Sociological Association. Currently, he is a visiting postgraduate researcher at Tohoku University, Japan.

Mette Jensen is senior researcher in Environmental Sociology within the Department of Policy Analysis, Danish Environmental Research Institute. Her main research themes are in the fields of mobility, risk perception, modernity and environment.
She is a member of the steering committee of the Danish social science environmental research network “Misonet,” of the Nordic sociological research network “Environment & Risk” under the Nordic Sociological Association, and of two research networks, both called “Environment & Society,” under the European Sociological Association (ESA) and the International Sociological Association (ISA) respectively. She teaches Environmental and Risk Sociology at the Department of Sociology, University of Copenhagen.

Correspondence: Danish Environmental Research Institute, Department of Policy Analysis, Frederiksborggade 399, Box 358, 4000 Roskilde, Denmark; e-mail: mje@dmu.dk

Pernille Kaltoft is senior researcher in Environmental Sociology within the Department of Policy Analysis, Danish Environmental Research Institute. Her main research themes are in the fields of rural sociology, organic farming and risk perception. She is a member of the Danish social science environmental research network “Misonet,” and a member of the Scientific Advisory Board for the Research School for Organic Agriculture and Food Systems. She teaches primarily at the Royal Veterinary and Agricultural University in Denmark, and has also organized a Ph.D. course, “Modernization Processes in Organic Food Networks” (January 2004), within the Research School for Organic Agriculture and Food Systems.