CHAPTER 3 Ethics in Social Research

Introduction
Why Be Ethical?
Scientific Misconduct
Unethical but Legal
Power Relations
Ethical Issues Involving Research Participants
Origins of Research Participant Protection
Physical Harm, Psychological Abuse, and Legal Jeopardy
Other Harm to Participants
Deception
Informed Consent
Special Populations and New Inequalities
Privacy, Anonymity, and Confidentiality
Mandated Protections of Research Participants
Ethics and the Scientific Community
Ethics and the Sponsors of Research
Whistle-Blowing
Arriving at Particular Findings
Suppressing Findings
Concealing the True Sponsor
Politics of Research
Value-Free and Objective Research
Conclusion

INTRODUCTION Ethics include the concerns, dilemmas, and conflicts that arise over the proper way to conduct research. Ethics help to define what is or is not legitimate to do, or what "moral" research procedure involves. This is not as simple as it may appear, because there are few ethical absolutes and only agreed-upon broad principles. These principles require judgment to apply, and some may conflict with others in practice. Many ethical issues ask you to balance two values: the pursuit of knowledge and the rights of research participants or of others in society. Social researchers balance potential benefits—such as advancing the understanding of social life, improving decision making, or helping research participants—against potential costs—such as loss of dignity, self-esteem, privacy, or democratic freedoms. Social researchers confront many ethical dilemmas and must decide how to act. They have a moral and professional obligation to be ethical, even if research participants are unaware of or unconcerned about ethics. Many areas of professional practice have ethical standards (e.g., journalism, police departments, business corporations), but the ethical standards for doing social research are often stricter. To do professional social research, you must both know the proper research techniques (e.g., sampling) and be sensitive to ethical concerns. This is not always easy. For centuries, moral, legal, and political philosophers have debated the issues researchers regularly face. It is difficult to appreciate fully the ethical dilemmas experienced by researchers until you actually begin to do research, but waiting until the middle of a study is too late. You need to prepare yourself ahead of time and consider ethical concerns as you design a study so that you can build sound ethical practices into a study's design. In addition, by developing sensitivity to ethical issues, you will be alert to potential ethical concerns that can arise as you make decisions while conducting a study. Also, an ethical awareness will help you better understand the overall research process. Ethics begin and end with you, the individual social researcher. A strong personal moral code by the researcher is the best defense against unethical behavior. Before, during, and after conducting a study, a researcher has opportunities to, and should, reflect on the ethics of research actions and consult his or her conscience. Ultimately, ethical research depends on the integrity of an individual researcher.

WHY BE ETHICAL? Given that most people who conduct social research are genuinely concerned about others, you might ask, Why would any researcher ever act in an ethically irresponsible manner? Most unethical behavior is due to a lack of awareness and pressures on researchers to take ethical shortcuts.
Researchers face pressures to build a career, publish new findings, advance knowledge, gain prestige, impress family and friends, hold on to a job, and so forth. Ethical research will take longer to complete, cost more money, be more complicated, and be less likely to produce unambiguous results. Plus, there are many opportunities in research to act unethically, the odds of getting caught are small, and written ethical standards are in the form of vague, loose principles. The ethical researcher gets few rewards and wins no praise. The unethical researcher, if caught, faces public humiliation, a ruined career, and possible legal action. The best preparation for ethical behavior is to internalize a sensitivity to ethical concerns, to adopt a serious professional role, and to interact regularly with other researchers. Moreover, the scientific community demands ethical behavior without exceptions.

Scientific Misconduct The research community and the agencies that fund research oppose a type of unethical behavior called scientific misconduct; it includes research fraud and plagiarism. Scientific misconduct occurs when a researcher falsifies or distorts the data or the methods of data collection, or plagiarizes the work of others. It also includes significant, unjustified departures from the generally accepted scientific practices for doing and reporting on research. Research fraud occurs when a researcher fakes or invents data that he or she did not really collect, or fails to honestly and fully report how he or she conducted a study. Although rare, it is considered a very serious violation. The most famous case of research fraud was that of Sir Cyril Burt, the father of British educational psychology. Burt died in 1971 as an esteemed researcher who was famous for his studies with twins that showed a genetic basis of intelligence. In 1976, it was discovered that he had falsified data and the names of coauthors. Unfortunately, the scientific community had been misled for nearly 30 years. More recently, a social psychologist was discovered to have fabricated data for several experiments on sex bias conducted at Harvard University in the 1990s. Plagiarism occurs when a researcher "steals" the ideas or writings of another or uses them without citing the source. Plagiarism also includes stealing the work of another researcher, an assistant, or a student, and misrepresenting it as one's own. These are serious breaches of ethical standards.1

Unethical but Legal Behavior may be unethical but legal (i.e., not break any law). A plagiarism case illustrates the distinction between legal and ethical behavior. The American Sociological Association documented that a 1988 book without any footnotes by a dean from Eastern New Mexico University contained large sections of a 1978 dissertation that a sociology professor at Tufts University wrote.
Copying the dissertation was not illegal; it did not violate copyright law because the sociologist's dissertation did not have a copyright filed with the U.S. government. Nevertheless, it was clearly unethical according to standards of professional behavior.2 (See Figure 3.1 for relations between legal and moral actions.)

FIGURE 3.1 Typology of Legal and Moral Actions in Social Research

                         ETHICAL
                  Yes                     No
LEGAL   Yes   Moral and Legal         Legal but Immoral
        No    Illegal but Moral       Immoral and Illegal

POWER RELATIONS A professional researcher and the research participants or employee-assistants are in a relationship of unequal power and trust. An experimenter, survey director, or research investigator has power over participants and assistants, and in turn, they trust his or her judgment and authority. The researcher's credentials, training, professional role, and the place of science in modern society legitimate the power and make it into a form of expert authority. Some ethical issues involve an abuse of power and trust. A researcher's authority to conduct social research and to earn the trust of others is always accompanied by an unyielding ethical responsibility to guide, protect, and oversee the interests of the people being studied. When looking for ethical guidance, researchers are not alone. They can turn to a number of resources: professional colleagues, ethical advisory committees, institutional review boards or human subjects committees at a college or institution (discussed later), codes of ethics by professional associations (discussed later in this chapter), and writings on ethics in research. The larger research community firmly supports and upholds ethical behavior, even if an individual researcher is ultimately responsible to do what is ethical in specific situations.

ETHICAL ISSUES INVOLVING RESEARCH PARTICIPANTS Have you ever been a participant in a research study? If so, how were you treated? More attention is focused on the possible negative effects of research on those being studied than on any other ethical issue, beginning with concerns about biomedical research. Acting ethically requires that a researcher balance the value of advancing knowledge against the value of noninterference in the lives of others. Either extreme causes problems. Giving research participants absolute rights of noninterference could make empirical research impossible, but giving researchers absolute rights of inquiry could nullify participants' basic human rights. The moral question becomes: When, if ever, are researchers justified in risking physical harm or injury to those being studied, causing them great embarrassment or inconvenience, violating their privacy, or frightening them? The law and codes of ethics recognize some clear prohibitions: Never cause unnecessary or irreversible harm to subjects; secure prior voluntary consent when possible; and never unnecessarily humiliate, degrade, or release harmful information about specific individuals that was collected for research purposes. In other words, you should always show respect for the research participant. These are minimal standards and are subject to interpretation (e.g., What does unnecessary mean in a specific situation?).

Origins of Research Participant Protection Concern over the treatment of research participants arose after the revelation of gross violations of basic human rights in the name of science.
The most notorious violations were the "medical experiments" conducted on Jews and others in Nazi Germany, and similar experiments to test biological weapons by Japan in the 1940s. In these experiments, terrible tortures were committed. For example, people were placed in freezing water to see how long it took them to die, people were purposely starved to death, people were intentionally infected with horrible diseases, and limbs were severed from children and transplanted onto others.3 Such human rights violations did not occur only long ago. In a famous case of unethical research, the Tuskegee Syphilis Study, also known as Bad Blood, the President of the United States admitted wrongdoing and formally apologized in 1997 to the participant-victims. Until the 1970s, when a newspaper report caused a scandal to erupt, the U.S. Public Health Service sponsored a study in which poor, uneducated African American men in Alabama suffered and died of untreated syphilis, while researchers studied the severe physical disabilities that appear in advanced stages of the disease. The unethical study began in 1932, before penicillin was available to treat the disease, but it continued long after treatment was available. Despite their unethical treatment of the people, the researchers were able to publish their results for 40 years. The study ended in 1972, but a formal apology took another 25 years.4 Unfortunately, the Bad Blood scandal is not unique. During the Cold War era, the U.S. government periodically compromised ethical research principles for military and political goals. In 1995, reports revealed that the government authorized injecting unknowing people with radioactive material in the late 1940s. In the 1950s, the government warned Eastman Kodak and other film manufacturers about nuclear fallout from atomic tests to prevent fogged film, but it did not warn nearby citizens of health hazards. In the 1960s, the U.S. Army gave unsuspecting soldiers LSD (a hallucinogenic drug), causing serious trauma. Today, researchers widely recognize these to be violations of two fundamental ethical principles: Avoid physical harm and obtain informed consent.5

Physical Harm, Psychological Abuse, and Legal Jeopardy Social research can harm a research participant in several ways: physical, psychological, and legal harm, as well as harm to a person's career, reputation, or income. Different types of harm are more likely in different types of research (e.g., in experiments versus field research). It is a researcher's responsibility to be aware of all types of potential harm and to take specific actions to minimize the risk to participants at all times.

Physical Harm. Physical harm is rare. Even in biomedical research, where the intervention into a person's life is much greater, only 3 to 5 percent of studies involved any person who suffered any harm.6 A straightforward ethical principle is that researchers should never cause physical harm. An ethical researcher anticipates risks before beginning a study, including basic safety concerns (e.g., safe buildings, furniture, and equipment). This means that he or she screens out high-risk subjects (those with heart conditions, mental breakdown, seizures, etc.) if great stress is involved and anticipates possible sources of injury or physical attacks on research participants or assistants.
The researcher accepts moral and legal responsibility for injury due to participation in research and terminates a project immediately if he or she can no longer fully guarantee the physical safety of the people involved (see the Zimbardo study in Box 3.1).

Psychological Abuse, Stress, or Loss of Self-Esteem. The risk of physical harm is rare, but social researchers can place people in highly stressful, embarrassing, anxiety-producing, or unpleasant situations. Researchers want to learn about people's responses in real-life, high-anxiety-producing situations, so they might place people in realistic situations of psychological discomfort or stress. Is it unethical to cause discomfort? The ethics of the famous Milgram obedience study are still debated (see Box 3.1). Some say that the precautions taken and the knowledge gained outweighed the stress and potential psychological harm that research participants experienced. Others believe that the extreme stress and the risk of permanent harm were too great. Such an experiment could not be conducted today because of heightened sensitivity to the ethical issues involved. Social researchers have created high levels of anxiety or discomfort. They have exposed participants to gruesome photos; falsely told male students that they have strong feminine personality traits; falsely told students that they have failed; created a situation of high fear (e.g., smoke entering a room in which the door is locked); asked participants to harm others; placed people in situations where they face social pressure to deny their convictions; and had participants lie, cheat, or steal.7 Researchers who study helping behavior often place participants in emergency situations to see whether they will lend assistance. For example, Piliavin and associates (1969) studied helping behavior in subways by faking someone's collapse onto the floor. In the field experiment, the riders in the subway car were unaware of the experiment and did not volunteer to participate in it. Only very experienced researchers who take all necessary precautions should even consider conducting a study that purposely induces great stress or anxiety in research participants. They should consult with others who have conducted similar studies and with mental health professionals as they plan the study. They should screen out high-risk populations (e.g., those with emotional problems or weak hearts) and arrange for emergency interventions or termination of the research if dangerous situations arise. They must always obtain written informed consent (to be discussed) before the research and debrief the people immediately afterward (i.e., explain any deception and what actually happened in the study). Researchers should never create unnecessary stress (i.e., beyond the minimal amount needed to create the desired effect) or stress that lacks a very clear, legitimate research purpose.

Box 3.1 Three Cases of Ethical Controversy

Stanley Milgram's obedience study (Milgram, 1963, 1965, 1974) attempted to discover how the horrors of the Holocaust under the Nazis could have occurred by examining the strength of social pressure to obey authority. After signing "informed consent forms," subjects were assigned, in a rigged random selection, to be a "teacher" while a confederate was the "pupil." The teacher was to test the pupil's memory of word lists and increase the electric shock level if the pupil made mistakes.
The pupil was located in a nearby room, so the teacher could hear but not see the pupil. The shock apparatus was clearly labeled with increasing voltage. As the pupil made mistakes and the teacher turned switches, the pupil also made noises as if in severe pain. The researcher was present and made comments such as "You must go on" to the teacher. Milgram reported, "Subjects were observed to sweat, tremble, stutter, bite their lips, groan and dig their fingernails into their flesh. These were characteristic rather than exceptional responses to the experiment" (Milgram, 1963:375). The percentage of subjects who would shock to dangerous levels was dramatically higher than expected. Ethical concerns arose over the use of deception and the extreme emotional stress experienced by subjects.

In Laud Humphreys's (Humphreys, 1975) tearoom trade study (a study of male homosexual encounters in public restrooms), about 100 men were observed engaging in sexual acts as Humphreys pretended to be a "watchqueen" (a voyeur and lookout). Subjects were followed to their cars, and their license numbers were secretly recorded. Names and addresses were obtained from police registers when Humphreys posed as a market researcher. One year later, in disguise, Humphreys used a deceptive story about a health survey to interview the subjects in their homes. Humphreys was careful to keep names in safety deposit boxes, and identifiers with subject names were burned. He significantly advanced knowledge of homosexuals who frequent "tearooms" and overturned previous false beliefs about them. There has been controversy over the study: The subjects never consented; deception was used; and the names could have been used to blackmail subjects, to end marriages, or to initiate criminal prosecution.

In the Zimbardo prison experiment (Zimbardo, 1972, 1973; Zimbardo et al., 1973, 1974), male students were divided into two role-playing groups: guards and prisoners. Before the experiment, volunteer students were given personality tests, and only those in the "normal" range were chosen. Volunteers signed up for two weeks, and prisoners were told that they would be under surveillance and would have some civil rights suspended, but that no physical abuse was allowed. In a simulated prison in the basement of a Stanford University building, prisoners were deindividualized (dressed in standard uniforms and called only by their numbers) and guards were militarized (with uniforms, nightsticks, and reflective sunglasses). Guards were told to maintain a reasonable degree of order and served 8-hour shifts, while prisoners were locked up 24 hours per day. Unexpectedly, the volunteers became too caught up in their roles. Prisoners became passive and disorganized, while guards became aggressive, arbitrary, and dehumanizing. By the sixth day, Zimbardo called off the experiment for ethical reasons. The risk of permanent psychological harm, and even physical harm, was too great.

Knowing what "minimal amount" means comes with experience. It is better to begin with too little stress, risking a finding of no effect, than to create too much. It is always wise to work in collaboration with other researchers when the risk to participants is high, because the involvement of several ethically sensitive researchers reduces the chances of making an ethical misjudgment. Research that induces great stress and anxiety in participants also carries the danger that experimenters will develop a callous or manipulative attitude toward others. Researchers
Researchers CHAPTER 3 / ETHICS IN SOCIAL RESEARCH 53 have reported feeling guilt and regret after conducting experiments that caused psychological harm to people. Experiments that place subjects in anxiety-producing situations may produce significant personal discomfort for the ethical researcher. Legal Harm. A researcher is responsible for protecting research participants from increased risk of arrest. If participation in research increases the risk of arrest, few individuals will trust researchers or be willing to participate in future research. Potential legal harm is one criticism of Humphreys's 1975 tearoom trade study (see Box 3.1). A related ethical issue arises when a researcher learns of illegal activity when collecting data. A researcher must weigh the value of protecting the researcher-subject relationship and the benefits to future researchers against potential serious harm to innocent people. The researcher bears the cost of his or her judgment. For example, in his field research on police, Van Maanen (1982:114—115) reported seeing police beat people and witnessing illegal acts and irregular procedures, but said, "On and following these troublesome incidents I followed police custom: I kept my mouth shut." Field researchers in particular can face difficult ethical decisions. For example, when studying a mental institution, Taylor (1987) discovered the mistreatment and abuse of inmates by the staff. He had two choices: Abandon the study and call for an immediate investigation, or keep quiet and continue with the study for several months, publicize the findings afterwards, and then become an advocate to end the abuse. After weighing the situation, he followed the latter course and is now an activist for the rights of mental institution inmates. In some studies, observing illegal behavior may be central to the research project. If a researcher covertly observes and records illegal behavior, then supplies the information to law-enforcement authorities, he or she is violating ethical standards regarding research participants and is undermining future social research. At the same time, a researcher who fails to report illegal behavior is indirectly permitting criminal behavior. He or she could be charged as an accessory to a crime. Cooperation with law-enforcement officials raises the question, Is the researcher a professional scientist who protects research participants in the process of seeking knowledge, or a free-lance undercover informant who is really working for the police trying to "catch" criminals? Other Harm to Participants Research participants may face other types of harm. For example, a survey interview may create anxiety and discomfort if it asks people to recall unpleasant or traumatic events. An ethical researcher must be sensitive to any harm to participants, consider precautions, and weigh potential harm against potential benefits. Another type of harm is a negative impact on the careers, reputations, or incomes of research participants. For example, a researcher conducts a survey of employees and concludes that the supervisor's performance is poor. As a consequence, the supervisor loses her job. Or, a researcher studies homeless people living on the street. The findings show that many engage in petty illegal acts to get food. As a consequence, a city government "cracks down" on the petty illegal acts and the homeless people can no longer eat. What is the researcher's responsibility? 
The ethical researcher considers the consequences of research for those being studied. The general goal is not to cause any harm simply because someone was a research participant. However, there is no set answer to such questions. A researcher must evaluate each case, weigh potential harm against potential benefits, and bear the responsibility for the decision.

Deception Has anyone ever told you a half-truth or lie to get you to do something? How did you feel about it? Social researchers follow the ethical principle of voluntary consent: Never force anyone to participate in research, and do not lie to anyone unless it is necessary and the only way to accomplish a legitimate research purpose. The people who participate in social research should explicitly agree to participate. A person's right not to participate can be a critical issue whenever the researcher uses deception, disguises the research, or uses covert research methods. Social researchers sometimes deceive or lie to participants in field and experimental research. A researcher might misrepresent his or her actions or true intentions for legitimate methodological reasons. For example, if participants knew the true purpose, they would modify their behavior, making it impossible to learn of their real behavior. Another situation occurs when access to a research site would be impossible if the researcher told the truth. Deception is never preferable if the researcher can accomplish the same thing without using deception. Experimental researchers often deceive subjects to prevent them from learning the hypothesis being tested and to reduce "reactive effects" (see Chapter 8). Deception is acceptable only if a researcher can show that it has a clear, specific methodological purpose, and even then, the researcher should use it only to the minimal degree necessary. Researchers who use deception should always obtain informed consent, never misrepresent risks, and always explain the actual conditions to participants afterwards. You might ask, How can a researcher obtain prior informed consent and still use deception? He or she can describe the basic procedures involved and conceal only specific information about hypotheses being tested. Sometimes field researchers use covert observation to gain entry to field research settings. In studies of cults, small extremist political sects, illegal or deviant behavior, or behavior in a large public area, it may be impossible to conduct research if a researcher announces and discloses her or his true purpose. If a covert stance is not essential, a researcher should not use it. If he or she does not know whether covert access is necessary, then a strategy of gradual disclosure may be best. When in doubt, it is best to err in the direction of disclosing one's true identity and purpose. Covert research remains controversial, and many researchers feel that all covert research is unethical. Even those who accept covert research as ethical in certain situations say that it should be used only when overt observation is impossible. Whenever possible, the researcher should inform participants of the observation immediately afterwards and give them an opportunity to express concerns. Deception and covert research may increase mistrust and cynicism as well as diminish public respect for social research. Misrepresentation in field research is analogous to being an undercover agent or government informer in nondemocratic societies. The use of deception has a long-term negative effect.
It increases distrust among people who are frequently studied and makes doing social research more difficult in the long term.

Informed Consent A fundamental ethical principle of social research is: Never coerce anyone into participating; participation must be voluntary at all times. Permission alone is not enough; people need to know what they are being asked to participate in so that they can make an informed decision. Participants can become aware of their rights and what they are getting involved in when they read and sign a statement giving informed consent—an agreement by participants stating they are willing to be in a study and they know something about what the research procedure will involve. Governments vary in the requirement for informed consent. The U.S. federal government does not require informed consent in all research involving human subjects. Nevertheless, researchers should get written informed consent unless there are good reasons for not obtaining it (e.g., covert field research, use of secondary data, etc.) as judged by an institutional review board (IRB) (see the later discussion of IRBs). Informed consent statements provide specific information (see Box 3.2).

Box 3.2 Informed Consent
Informed consent statements contain the following:
1. A brief description of the purpose and procedure of the research, including the expected duration of the study
2. A statement of any risks or discomfort associated with participation
3. A guarantee of anonymity and the confidentiality of records
4. The identification of the researcher and of where to receive information about subjects' rights or questions about the study
5. A statement that participation is completely voluntary and can be terminated at any time without penalty
6. A statement of alternative procedures that may be used
7. A statement of any benefits or compensation provided to subjects and the number of subjects involved
8. An offer to provide a summary of findings

A general statement about the kinds of procedures or questions involved and the uses of the data is sufficient for informed consent. Studies suggest that participants who receive a full informed consent statement do not respond differently from those who do not. If anything, people who refused to sign such a statement were more likely to guess or answer "no response" to questions. It is unethical to coerce people to participate, including offering them special benefits that they cannot otherwise attain. For example, it is unethical for a commanding officer to order a soldier to participate in a study, for a professor to require a student to be a research subject in order to pass a course, or for an employer to expect an employee to complete a survey as a condition of continued employment. It is unethical even if someone other than the researcher (e.g., an employer) coerces people (e.g., employees) to participate in research. Full disclosure with the researcher's identification helps to protect research participants against fraudulent research and to protect legitimate researchers. Informed consent lessens the chance that a con artist in the guise of a researcher will defraud or abuse people. It also reduces the chance that someone will use a bogus researcher identity to market products or obtain personal information on people for unethical purposes. Legally, a signed informed consent statement is optional for most survey, field, and secondary data research, but it is often mandatory for experimental research.
Informed consent is impossible to obtain in existing statistics and documentary research. The general rule is: The greater the risk of potential harm to research participants, the greater the need to obtain a written informed consent statement from them. In sum, there are many sound reasons to get informed consent and few reasons not to get it.

Special Populations and New Inequalities Some populations or groups of research participants are not capable of giving true voluntary informed consent. Special populations are people who lack the necessary cognitive competency to give valid informed consent or people in a weak position who might cast aside their freedom to refuse to participate in a study. Students, prison inmates, employees, military personnel, the homeless, welfare recipients, children, and the developmentally disabled may not be fully capable of making a decision, or they may agree to participate only because they see their participation as a way to obtain a desired good—such as higher grades, early parole, promotions, or additional services. It is unethical to involve "incompetent" people (e.g., children, mentally disabled, etc.) in research unless a researcher meets two minimal conditions: (1) a legal guardian grants written permission and (2) the researcher follows all standard ethical principles to protect participants from harm. For example, a researcher wants to conduct a survey of high school students to learn about their sexual behavior and drug/alcohol use. If the survey is conducted on school property, school officials must give official permission. For any research participant who is a legal minor (usually under 18 years old), written parental permission is needed. It is best to ask permission from each student, as well. The use of coercion to participate can be a tricky issue, and it depends on the specifics of a situation. For example, a convicted criminal faces the alternative of imprisonment or participation in an experimental rehabilitation program. The convicted criminal may not believe in the benefits of the program, but the researcher may believe that it will help the criminal. This is a case of coercion. A researcher must honestly judge whether the benefits to the criminal and to society greatly outweigh the ethical prohibition on coercion. This is risky. History shows many cases in which a researcher believed he or she was doing something "for the good of" someone in a powerless position (e.g., prisoners, students), but it turned out that the "good" was for the researcher or a powerful organization in society, and it did more harm than good to the research participant. Have you ever been in a social science class in which the instructor required you to participate as a subject in a research project? This is a special case that is usually considered ethical. Teachers offer three arguments in favor of requiring student participation: (1) it would be difficult and prohibitively expensive to get participants otherwise, (2) the knowledge created from research using students as subjects benefits future students and society, and (3) students will learn more about research by experiencing it directly in a realistic research setting. Of the three arguments, only the third justifies limited coercion, and only when students have a meaningful choice of alternative activities. Research can also create a new inequality when one group of people is denied a service or benefit as a result of participating in a study, for example, when a control group does not receive a new treatment that may affect participants' survival or well-being. Researchers can reduce such new inequalities in three ways. First, the people who do not receive the new treatment continue to receive the best previously acceptable treatment instead of being denied all treatment; this ensures that no one suffers in absolute terms, even if some temporarily fall behind in relative terms. Second, researchers can use a crossover design, in which a study group that receives no treatment in the first phase of the experiment receives the treatment in the second phase, and vice versa. Finally, the researcher carefully and continuously monitors results; if it appears early in the experiment that the new treatment is highly effective, the researcher should offer it to those in the control group.
Also, in high-risk experiments with medical treatments or possible physical harm, researchers may use animal or other surrogates for humans.

Privacy, Anonymity, and Confidentiality How would you feel if private details about your personal life were shared with the public without your knowledge? Because social researchers sometimes transgress the privacy of people in order to study social behavior, they must take several precautions to protect research participants' privacy.

Privacy. Survey researchers invade a person's privacy when they probe into beliefs, backgrounds, and behaviors in a way that reveals intimate private details. Experimental researchers sometimes use two-way mirrors or hidden microphones to "spy" on subjects. Even if people know they are being studied, they are unaware of what the experimenter is looking for. Field researchers may observe private aspects of behavior or eavesdrop on conversations. In field research, privacy may be violated without advance warning. When Humphreys (1975) served as a "watchqueen" in a public restroom where homosexual contacts took place, he observed very private behavior without informing subjects. When Piliavin and colleagues (1969) had people collapse on subways to study helping behavior, those in the subway car had the privacy of their ride violated. People have been studied in public places (e.g., in waiting rooms, walking down the street, in classrooms, etc.), but some "public" places are more private than others (consider, for example, the use of periscopes to observe people who thought they were alone in a public toilet stall). Eavesdropping on conversations and observing people in quasi-private areas raises ethical concerns. To be ethical, a researcher violates privacy only to the minimum degree necessary and only for legitimate research purposes. In addition, he or she takes steps to protect the information on participants from public disclosure.

Anonymity. Researchers protect privacy by not disclosing a participant's identity after information is gathered. This takes two forms, both of which require separating an individual's identity from his or her responses: anonymity and confidentiality. Anonymity means that people remain anonymous or nameless. For example, a field researcher provides a social picture of a particular individual, but gives a fictitious name and location, and alters some characteristics. The subject's identity is protected, and the individual remains unknown or anonymous. Survey and experimental researchers discard the names or addresses of subjects as soon as possible and refer to participants by a code number only to protect anonymity. If a researcher uses a mail survey and includes a code on the questionnaire to determine which respondents failed to respond, he or she is not keeping respondents anonymous during that phase of the study. In panel studies, researchers track the same individuals over time, so they do not uphold participant anonymity within the study. Likewise, historical researchers use specific names in historical or documentary research. They may do so if the original information was from public sources; if the sources were not publicly available, a researcher must obtain written permission from the owner of the documents to use specific names. It is difficult to protect research participant anonymity. In one study, reported in Small Town in Mass Society (Vidich and Bensman, 1968), the researchers gave the town the fictitious name "Springdale," yet it was easy to identify the town and specific individuals in it.
Town residents became upset about how the researchers portrayed them and staged a parade mocking the researchers. People often recognize the towns studied in community research. Yet, if a researcher protects the identities of individuals with fictitious information, the gap between what was studied and what is reported to others raises questions about what was found and what was made up. A researcher may breach a promise of anonymity unknowingly in small samples. For example, let us say you conduct a survey of 100 college students and ask many questions on a questionnaire, including age, sex, religion, and hometown. The sample contains one 22-year-old Jewish male born in Stratford, Ontario. With this information, you could find out who the specific individual is and how he answered very personal questions, even though his name was not directly recorded on the questionnaire.

Confidentiality. Even if a researcher cannot protect anonymity, he or she always should protect participant confidentiality. Anonymity means protecting the identity of specific individuals from being known. Confidentiality can include information with participant names attached, but the researcher holds it in confidence or keeps it secret from public disclosure. The researcher releases data in a way that does not permit linking specific individuals to responses and presents it publicly only in an aggregate form (e.g., as percentages, statistical means, etc.). A researcher can provide anonymity without confidentiality, or vice versa, although they usually go together. Anonymity without confidentiality occurs if all the details about a specific individual are made public, but the individual's name is withheld. Confidentiality without anonymity occurs if detailed information is not made public, but a researcher privately links individual names to specific responses. Attempts to protect the identity of subjects from public disclosure have resulted in elaborate procedures: eliciting anonymous responses, using a third-party custodian who holds the key to coded lists, or using the random-response technique. Past abuses suggest that such measures may be necessary. For example, Diener and Crandall (1978:70) reported that during the 1950s, the U.S. State Department and the FBI requested research records on individuals who had been involved in the famous Kinsey sex study. The Kinsey Sex Institute refused to comply with the government and threatened to destroy all records rather than release any. Eventually, the government agencies backed down. The moral duty and ethical code of the researchers obligated them to destroy the records rather than give them to government officials. Confidentiality can sometimes protect research participants from legal or physical harm. In a study of illegal drug users in rural Ohio, Draus and associates (2005) took great care to protect the research participants. They conducted interviews in large multiuse buildings, avoided references to illegal drugs in written documents, did not mention names of drug dealers or locations, and did not affiliate with drug rehabilitation services, which had ties to law enforcement. They noted, "We intentionally avoided contact with local police, prosecutors, or parole officers" and "surveillance of the project by local law enforcement was a source of concern" (p. 169).
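The small-sample risk described earlier in this subsection can be checked directly before a file of responses is ever shared. The sketch below is illustrative only, written in plain Python with hypothetical data and field names: it counts how many respondents share each combination of background characteristics, flags any combination held by a single person (and therefore potentially identifying even without a name), and shows the kind of aggregate-only figure that can be released safely.

# Illustrative sketch (hypothetical data and field names): auditing a small,
# de-named survey file for potentially identifying combinations before release.
from collections import Counter

responses = [
    {"age": 22, "sex": "M", "religion": "Jewish", "hometown": "Stratford", "answer": "yes"},
    {"age": 19, "sex": "F", "religion": "Catholic", "hometown": "Toronto", "answer": "no"},
    {"age": 19, "sex": "F", "religion": "Catholic", "hometown": "Toronto", "answer": "yes"},
    # ... remaining respondents
]

quasi_identifiers = ("age", "sex", "religion", "hometown")

# Count how many respondents share each combination of background characteristics.
combo_counts = Counter(tuple(r[k] for k in quasi_identifiers) for r in responses)

# A combination held by only one respondent could identify that person,
# even though no name was recorded on the questionnaire.
unique_combos = [combo for combo, n in combo_counts.items() if n == 1]
print("Potentially identifying combinations:", unique_combos)

# Releasing only aggregate results (percentages, means) avoids linking
# any specific individual to his or her responses.
share_yes = sum(r["answer"] == "yes" for r in responses) / len(responses)
print(f"Percent answering 'yes': {share_yes:.0%}")

In practice, a researcher might suppress or coarsen such rare combinations (for example, reporting age in broad ranges rather than exact years) before any file leaves his or her control.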
In other situations, other principles may take precedence over protecting research participant confidentiality. For example, when studying patients in a mental hospital, a researcher discovers that a patient is preparing to kill an attendant. The researcher must weigh the benefit of confidentiality against the potential harm to the attendant. Social researchers can pay high personal costs for being ethical. Although he was never accused or convicted of breaking any law and he closely followed the ethical principles of the American Sociological Association, Professor Rik Scarce spent 16 weeks in a Spokane jail for contempt of court because he refused to testify before a grand jury and break the confidentiality of social research data. Scarce had been studying radical animal liberation groups and had already published one book on the subject. He had interviewed a research participant who was suspected of leading a group that broke into animal facilities and caused $150,000 damage. Two judges refused to acknowledge the confidentiality of social research data.8 A special concern with anonymity and confidentiality arises when a researcher studies "captive" populations (e.g., students, prisoners, employees, patients, and soldiers). Gatekeepers, or those in positions of authority, may restrict access unless they receive information on subjects.9 For example, a researcher studies drug use and sexual activity among high school students. School authorities agree to cooperate under two conditions: (1) students need parental permission to participate and (2) school officials get the names of all drug users and sexually active students in order to assist the students with counseling and to inform the students' parents. An ethical researcher will refuse to continue rather than meet the second condition. Even though the officials claim to have the participants' best interests in mind, the privacy of participants would be violated and they could face legal harm as a result of disclosure. If the school officials really want to assist the students and not use researchers as spies, they could develop an outreach program of their own.

Mandated Protections of Research Participants Many governments have regulations and laws to protect research participants and their rights. In the United States, legal restraint is found in rules and regulations issued by the U.S. Department of Health and Human Services Office for the Protection from Research Risks. Although this is only one federal agency, most researchers and other government agencies look to it for guidance. The National Research Act (1974) established the National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research, which significantly expanded regulations and required informed consent in most social research. The responsibility for safeguarding ethical standards was assigned to research institutes and universities. The Department of Health and Human Services issued regulations in 1981, which are still in force. Federal regulations follow a biomedical model and protect subjects from physical harm. Other rules require institutional review boards (IRBs) at all research institutes, colleges, and universities to review all use of human subjects. An IRB is a committee of researchers and community members that oversees, monitors, and reviews the impact of research procedures on human participants and applies ethical guidelines by reviewing research procedures at a preliminary stage when first proposed.
Some forms of research, such as educational tests, normal educational practice, most nonsensitive surveys, most observation of public behavior, and studies of existing data in which individuals cannot be identified, are exempt from institutional review board review.

ETHICS AND THE SCIENTIFIC COMMUNITY Physicians, attorneys, family counselors, social workers, and other professionals have a code of ethics and peer review boards or licensing regulations. The codes formalize professional standards and provide guidance when questions arise in practice. Social researchers do not provide a service for a fee, receive limited ethical training, and are rarely licensed. They incorporate ethical concerns into research because it is morally and socially responsible, and to protect social research from charges of insensitivity or of abusing people. Professional social science associations have codes of ethics that identify proper and improper behavior. They represent a consensus of professionals on ethics. Not all researchers agree on all ethical issues, and ethical rules are subject to interpretation, but researchers are expected to uphold ethical standards as part of their membership in a professional community. Codes of research ethics can be traced to the Nuremberg code adopted during the Nuremberg Military Tribunal on Nazi war crimes held by the Allied Powers immediately after World War II. The code, developed as a response to the cruelty of concentration camp experiments, outlines ethical principles and rights of human subjects. These include the following:

■ The principle of voluntary consent
■ Avoidance of unnecessary physical and mental suffering
■ Avoidance of any experiment where death or disabling injury is likely
■ Termination of research if its continuation is likely to cause injury, disability, or death
■ The principle that experiments should be conducted by highly qualified people using the highest levels of skill and care
■ The principle that the results should be for the good of society and unattainable by any other method

The principles in the Nuremberg code dealt with the treatment of human subjects and focused on medical experimentation, but they became the basis for the ethical codes in social research. Similar codes of human rights, such as the 1948 Universal Declaration of Human Rights by the United Nations and the 1964 Declaration of Helsinki, also have implications for social researchers. Box 3.3 lists some of the basic principles of ethical social research.

Box 3.3 Basic Principles of Ethical Social Research
■ Ethical responsibility rests with the individual researcher.
■ Do not exploit subjects or students for personal gain.
■ Some form of informed consent is highly recommended or required.
■ Honor all guarantees of privacy, confidentiality, and anonymity.
■ Do not coerce or humiliate subjects.
■ Use deception only if needed, and always accompany it with debriefing.
■ Use the research method that is appropriate to a topic.
■ Detect and remove undesirable consequences to research subjects.
■ Anticipate repercussions of the research or publication of results.
■ Identify the sponsor who funded the research.
■ Cooperate with host nations when doing comparative research.
■ Release the details of the study design with the results.
■ Make interpretations of results consistent with the data.
■ Use high methodological standards and strive for accuracy.
■ Do not conduct secret research.

Professional social science associations have committees that review codes of ethics and hear about possible violations, but there is no formal policing of the codes. The penalty for a minor violation rarely goes beyond a letter of complaint. If laws have not been violated, the most extreme penalty is the negative publicity surrounding a well-documented and serious ethical violation. The publicity may result in the loss of employment, a refusal to publish the researcher's findings in scholarly journals, and a prohibition from receiving funding for research—in other words, banishment from the community of professional researchers. Codes of ethics do more than codify thinking and provide individual researchers with guidance; they also help universities and other institutions defend ethical research against abuses.
For example, after interviewing 24 staff members and conducting observations, a researcher in 1994 documented that the staff at the Milwaukee Public Defenders Office were seriously overworked and could not effectively provide legal defense for poor people. Learning of the findings, top officials at the office contacted the university and demanded to know who on their staff had talked to the researcher, with implications that there might be reprisals. The university administration defended the researcher and refused to release the information, citing widely accepted codes that protect human research participants.10

ETHICS AND THE SPONSORS OF RESEARCH

Whistle-Blowing You might find a job where you do research for a sponsor—an employer, a government agency, or a private firm that contracts with a researcher to conduct research. Special ethical problems arise when a sponsor pays for research, especially applied research. Researchers may be asked to compromise ethical or professional research standards as a condition for receiving a contract or for continued employment. Researchers need to set ethical boundaries beyond which they will refuse the sponsor's demands. When confronted with an illegitimate demand from a sponsor, a researcher has three basic choices: loyalty to an organization or larger group, exiting from the situation, or voicing opposition.11 These present themselves as caving in to the sponsor, quitting, or becoming a whistle-blower. The researcher must choose his or her own course of action, but it is best to consider ethical issues early in a relationship with a sponsor and to express concerns up front. Whistle-blowing involves the researcher who sees an ethical wrongdoing, and who cannot stop it after informing superiors and exhausting internal avenues to resolve the issue. He or she then turns to outsiders and informs an external audience, agency, or the media.
The whistle-blowing researcher must be convinced that the breach of ethics is serious and approved of in the organization. It is risky. The outsiders may or may not be interested in the problem or able to help. Outsiders often have their own priorities (making an organization look bad, sensationalizing the problem, etc.) that differ from the researcher's primary concern (ending the unethical behavior). Supervisors or managers may try to discredit or punish anyone who exposes problems and appears disloyal. Under the best of conditions, the issue may take a long time to resolve and create great emotional strain. By doing what is moral, a whistle-blower needs to be prepared to make sacrifices—loss of a job or no promotions, lowered pay, an undesirable transfer, abandonment by friends at work, or incurring legal costs. There is no guarantee that doing the ethical-moral thing will stop the unethical behavior or protect the honest researcher from retaliation. Applied social researchers in sponsored research settings need to think seriously about their professional roles. They may want to maintain some independence from an employer and affirm their membership in a community of dedicated professionals. Many find a defense against sponsor pressures by participating in professional organizations (e.g., the Evaluation Research Society), maintaining regular contacts with researchers outside the sponsoring organization, and staying current with the best research practices. The researcher least likely to uphold ethical standards in a sponsored setting is someone who is isolated and professionally insecure. Whatever the situation, unethical behavior is never justified by the argument that "If I didn't do it, someone else would have."

Arriving at Particular Findings What should you do if a sponsor tells you, directly or indirectly, what results you should come up with before you do a study? An ethical researcher will refuse to participate if he or she is told to arrive at specific results as a precondition for doing research. Legitimate research is conducted without restrictions on the possible findings that a study might yield. An example of pressure to arrive at particular findings is in the area of educational testing. Standardized tests to measure achievement by U.S. school children have come under criticism. For example, children in about 90 percent of school districts in the United States score "above average" on such tests. This was called the Lake Wobegon effect after the mythical town of Lake Wobegon, where, according to radio show host Garrison Keillor, "all the children are above average." The main reason for this finding was that the researchers compared scores of current students with those of students many years ago. Many teachers, school principals, superintendents, and school boards pressured for a type of result that would allow them to report to parents and voters that their school district was "above average."12

Limits on How to Conduct Studies. Is it ethically acceptable for a sponsor to limit research by defining what a researcher can study or by limiting the techniques used? Sponsors can legitimately set some conditions on research techniques used (e.g., survey versus experiment) and limit costs for research. However, the researcher must follow generally accepted research methods. Researchers must give a realistic appraisal of what can be accomplished for a given level of funding.
The issue of limits is common in contract research, when a firm or government agency asks for work on a particular research project. There is often a tradeoff between quality and cost. Plus, once the research begins, a researcher may need to redesign the project, or costs may be higher. The contract procedure makes midstream changes difficult. A researcher may find that he or she is forced by the contract to use research procedures or methods that are less than ideal. The researcher then confronts a dilemma: complete the contract and do low-quality research, or fail to fulfill the contract and lose money and future jobs. A researcher should refuse to continue a study if he or she cannot uphold generally accepted standards of research. If a sponsor demands a biased sample or leading survey questions, the ethical researcher should refuse to cooperate. If a legitimate study shows a sponsor's pet idea or project to be a disaster, a researcher may anticipate the end of employment or pressure to violate professional research standards. In the long run, the sponsor, the researcher, the scientific community, and society in general are harmed by the violation of sound research practice. The researcher has to decide whether he or she is a "hired hand" who always gives the sponsors whatever they want, even if it is ethically wrong, or a professional who is obligated to teach, guide, or even oppose sponsors in the service of higher moral principles. A researcher should ask: Why would sponsors want the social research conducted if they are not interested in using the findings or in the truth? The answer is that some sponsors are not interested in the truth and have no respect for the scientific process. They see social research only as "a cover" to legitimate a decision or practice that they already plan to carry out, using research to justify their action or deflect criticism. They abuse the researcher's professional status and undermine the integrity of science to advance their own narrow goals. They are being deceitful by trying to "cash in" on social research's reputation for honesty. When such a situation occurs, an ethical researcher has a moral responsibility to expose and stop the abuse.

Suppressing Findings What happens if you conduct a study, the findings make the sponsor look bad, and the sponsor then does not want to release the results? This is a common situation for many applied researchers. For example, a sociologist conducted a study for a state government lottery commission on the effects of state government-sponsored gambling. After she completed the report, but before releasing it to the public, the commission asked her to remove sections that outlined the many negative social effects of gambling and to eliminate her recommendations to create social services to help the anticipated increase of compulsive gamblers. The researcher found herself in a difficult position and faced two conflicting values: do what the sponsor requested and paid for, or reveal the truth to the public and then suffer the consequences.13 Government agencies may suppress scientific information that contradicts official policy or embarrasses high officials. Retaliation against social researchers employed by government
In 2004, leading scientists, Nobel laureates, prominent medical experts, former federal agency directors, and university chairs and presidents signed a statement voicing concern over the misuse of science by the George W. Bush administration. Major accusations included suppressing research findings and stacking scientific advisory committees with ideologically committed advocates rather than impartial scientists. Other complaints included limiting the public release of studies on auto-safety data, negative data about pharmaceuticals, and studies on pollution; these involved industries that were major political campaign supporters of the administration. Additional criticisms concerned removing a government fact sheet citing studies that showed no relationship between abortion and breast cancer, removing study results about the positive effects of condom use in pregnancy prevention, holding back information on positive aspects of stem cell research, and requiring researchers to revise their study findings on the dangers of arctic oil drilling and endangered species so they would conform to the administration's political agenda. An independent 2005 survey of 460 biologists who worked for the Fisheries Service found that about one-third said they had been directed to suppress findings for nonscientific reasons or to inappropriately exclude or alter technical information in an official scientific document. In June 2005, it was discovered that a political appointee without scientific training, who had previously been an oil industry lobbyist, had edited official government reports to play down research findings that documented linkages between greenhouse gas emissions and global warming.14

In sponsored research, a researcher can negotiate conditions for releasing findings prior to beginning the study and sign a contract to that effect. It may be unwise to conduct the study without such a guarantee, although competing researchers who have fewer ethical scruples may do so. Alternatively, a researcher can accept the sponsor's criticism and hostility and release the findings over the sponsor's objections. Most researchers prefer the first choice, since the second one may scare away future sponsors.

Social researchers sometimes self-censor or delay the release of findings. They do this to protect the identity of informants, to maintain access to a research site, to hold on to their jobs, or to protect the personal safety of themselves or family members.15 This is a less disturbing type of censorship because it is not imposed by an outside power; it is done by someone who is close to the research and knowledgeable about the possible consequences. Researchers shoulder the ultimate responsibility for their research. Often they can draw on many different resources, but they face many competing pressures as well.

Concealing the True Sponsor

Is it ethical to keep the identity of a sponsor secret? For example, an abortion clinic funds a study on members of religious groups who oppose abortion, but it tells the researcher not to reveal to participants who is funding the study. The researcher must balance the ethical rule that it is usually best to reveal a sponsor's identity to participants against both the sponsor's desire for confidentiality and the risk of reduced cooperation from participants in the study. In general, an ethical researcher will tell subjects who is sponsoring a study unless there is a strong methodological reason for not doing so.
When reporting or publishing results, the ethical mandate is very clear: A researcher must always reveal the sponsor who provided the funds for a study.

POLITICS OF RESEARCH

Ethics largely address moral concerns and standards of professional conduct in research that are under the researcher's control. Political concerns also affect social research, but many are beyond the control of researchers. The politics of research usually involve actions by organized advocacy groups, powerful interests in society, governments, or politicians trying to restrict or control the direction of social research. Historically, political influence over social research has included preventing researchers from conducting a study, cutting off or redirecting funds for research, harassing individual researchers, censoring the release of research findings, and using social research as a cover or guise for covert government intelligence or military actions. For example, members of the U.S. Congress targeted and eliminated funding for research projects that independent panels of scientists had recommended because they did not like the topics to be studied, and politically appointed officials shifted research funds to support more studies on topics consistent with their political views while ending support for studies on topics that might contradict those views. A large company threatened an individual researcher with a lawsuit for delivering expert testimony in public about research findings that revealed its past bad conduct. Until about a decade ago, some social researchers who appeared to be independent were actually conducting covert U.S. government intelligence activities.16

Most uses of political or financial influence to control social research share a desire to limit knowledge creation or to restrict the autonomous scientific investigation of controversial topics. Attempts at control seem motivated by a fear that researchers might discover something damaging if they have freedom of inquiry. This shows that free scientific inquiry is connected to fundamental political ideals of open public debate, democracy, and freedom of expression. Attempts to block and steer social research arise for three main reasons. First, some people defend or advance positions and knowledge that originate in deeply held ideological, political, or religious beliefs, and they fear that social researchers might produce knowledge that contradicts them. Second, powerful interests want to protect or advance their political-financial position, and they fear that social research might yield findings showing that their actions harm the public or some sectors of society. Third, some people do not respect the ideal of science as the pursuit of truth and knowledge, and instead view scientific research only as a cover for advancing private interests (see Box 3.4).

VALUE-FREE AND OBJECTIVE RESEARCH

You have undoubtedly heard about "value-free" research and the importance of being "objective" in research. This is not as simple as it might first appear, for several reasons. First, there are different meanings of the terms value free and objective. Second, the different approaches to social science (positivist, interpretive, critical) hold different views on the issue. And last, even researchers who agree that social research should be value free and objective do not believe it needs to be totally devoid of all values.
There are two basic ways the term value free is used: research that is free from any prior assumptions, theoretical stance, or value position, and research that is conducted free of influence from an individual researcher's personal prejudices or beliefs. Likewise, objective can mean focusing only on what is external or visible, or it can mean following clear and publicly accepted research procedures rather than haphazard, personal ones.

The three approaches to social science that you read about in Chapter 2 hold different positions on the importance of value-free, objective research. Positivism puts a high value on such research. An interpretive approach seriously questions whether it is possible, since human values and beliefs pervade all aspects of human activities, including research. Instead of eliminating values and the subjective dimension, it suggests a relativist stance: no single value position is better than any other. A critical approach also questions value-free research, but often sees it as a sham. Value free means free of everyone's values except those of science, and objective means following established rules or procedures that some people created, without considering who they represent and how they created the rules. In other words, a critical approach sees all research as containing some values, so those who claim to be value free are simply hiding theirs.

What Is Public Sociology?

Michael Burawoy (2004, 2005) distinguished among four ideal types of social research: policy, professional, critical, and public. The aim of public sociology (or social science, more generally) is to enrich public debate over moral and political issues by infusing such debate with social theory and research. Public sociology frequently overlaps with action-oriented research. Burawoy argued that the place of social research in society centers on how one answers two questions: Knowledge for whom? and Knowledge for what? The first question focuses on the sources of research questions and how results are used. The second question looks at the source of research goals. Are they handed down by some external sponsor or agency, or are they concerned with debates over larger societal political-moral issues? Public social science tries to generate a conversation or debate between researchers and the public. By contrast, policy social science focuses on finding solutions to specific problems as defined by sponsors or clients. Both rely on professional social science for theories, bodies of knowledge, and techniques for gathering and analyzing data. Critical social science, as discussed in Chapter 2, emphasizes demystifying and raising questions about basic conditions. The primary audience for professional and critical social science is members of the scientific community, whereas the main audience for public and policy research is nonexperts and practitioners. Both critical and public social science seek to infuse a moral, value dimension into social research, and they try to generate debates over moral-political values. Professional and policy social science are less concerned about debates over moral or value issues and may avoid them. Instead, their focus is on being effective in providing advances in basic knowledge or specific solutions to practical problems. Both public and policy social science are applied research and have relevance beyond the community of scientific researchers.
Those who follow an interpretive or critical approach and reject value-free research do not embrace sloppy and haphazard research, research procedures that follow a particular researcher's whims, or a study with a foregone conclusion that automatically supports a specific value position. They believe that a researcher should make his or her own value position explicit, reflect carefully on the reasons for doing a study and the procedures used, and communicate in a candid, clear manner exactly how the study was conducted. In this way, other researchers can see the role of the researcher's values and judge for themselves whether those values unfairly influenced the study's findings.

Even highly positivist researchers who advocate value-free and objective studies admit a limited place for some personal, moral values. Many hold that a researcher's personal, moral position can enter when it comes to deciding what topic to study and how to disseminate findings. Being value free and objective refers only to actually conducting the study. This means that you can study the issues you believe to be important and, after completing a study, you can share the results with specific interest groups in addition to making them available to the scientific community.

CONCLUSION

In Chapter 1, we discussed the distinctive contribution of science to society and how social research is a source of knowledge about the social world. The perspectives and techniques of social research can be powerful tools for understanding the world. Nevertheless, with the power to discover comes responsibility: a responsibility to yourself, a responsibility to your sponsors, a responsibility to the community of scientific researchers, and a responsibility to the larger society. These responsibilities can conflict with each other. Ultimately, you personally must decide to conduct research in an ethical manner, to uphold and defend the principles of the social science approach you adopt, and to demand ethical conduct by others. The truthfulness of knowledge produced by social research, and its use or misuse, depends on individual researchers like you reflecting on their actions and on the serious role of social research in society.

In the next chapter, we examine basic design approaches and issues that appear in both qualitative and quantitative research.

Key Terms

anonymity
confidentiality
crossover design
informed consent
institutional review board (IRB)
plagiarism
principle of voluntary consent
public sociology
research fraud
scientific misconduct
special populations
whistle-blower

Endnotes

1. For a discussion of research fraud, see Broad and Wade (1982), Diener and Crandall (1978), and Weinstein (1979). Hearnshaw (1979) and Wade (1976) discuss the Cyril Burt case; see Holden (2000) on the social psychologist case. Kusserow (1989) discusses the concept of scientific misconduct.
2. See Blum (1989) and D'Antonio (1989) for details on this case. Also see Goldner (1998) on legal versus scientific views of misconduct. Gibelman (2001) discusses several cases and the changing definition of misconduct.
3. See Lifton (1986) on the Nazi experiments; Williams and Wallace (1989) discuss the Japanese experiments. Harris (2002) argues that the Japanese experiments were more horrific, but that the United States did not prosecute the Japanese scientists as it did the Germans because the U.S. military wanted the results to develop its own biological warfare program.
4. See Jones (1981) and Mitchell (1997) on "Bad Blood."
5. Diener and Crandall (1978:128) discuss examples.
6. A discussion of physical harm to research participants can be found in Kelman (1982), Reynolds (1979, 1982), and Warwick (1982).
7. For a discussion, see Diener and Crandall (1978:21-22) and Kidder and Judd (1986:481-484).
8. See Monaghan (1993a, 1993b, 1993c).
9. Broadhead and Rist (1976) discuss gatekeepers.
10. See "UW Protects Dissertation Sources," Capital Times (Madison, Wisconsin), December 19, 1994, p. 4.
11. See Hirschman (1970) on loyalty, exit, or voice.
12. See Edward Fiske, "The Misleading Concept of 'Average' on Reading Test Changes, More Students Fall Below It," New York Times (July 12, 1989). Also see Koretz (1988) and Weiss and Gruber (1987).
13. See "State Sought, Got Author's Changes of Lottery Report," Capital Times (Madison, Wisconsin), July 28, 1989, p. 21.
14. Andrew Revkin, "Bush Aide Edited Climate Reports," New York Times (June 8, 2005). "White House Calls Editing Climate Files Part of Usual Review," New York Times (June 9, 2005). Union of Concerned Scientists, "Politics Trumps Science at U.S. Fish and Wildlife Service" (February 9, 2005). "Specific Examples of the Abuse of Science," www.ucsusa.org/global_environment/rsi/page.cfm?pageID=1398, downloaded August 3, 2005. "Summary of National Oceanic & Atmospheric Administration Fisheries Service Scientist Survey," Union of Concerned Scientists (June 2005). E. Shogren, "Researchers Accuse Bush of Manipulating Science," Los Angeles Times (July 9, 2004). Jeffrey McCracken, "Government Bans Release of Auto-Safety Data," Detroit Free Press (August 19, 2004). Gardiner Harris, "Lawmaker Says FDA Held Back Drug Data," New York Times (September 10, 2004). James Glanz, "Scientists Say Administration Distorts Facts," New York Times (February 19, 2004). Dylan O. Krider, "The Politicization of Science in the Bush Administration," Skeptic, Vol. 11, Number 2 (2004), at www.skeptic.com. C. Orstein, "Politics Trumps Science in Condom Fact Sheet," New York Times (December 27, 2002). "Scientist Says Officials Ignored Advice on Water Levels," Washington Post (October 29, 2002).
15. See Adler and Adler (1993).
16. See Neuman (2003, Chapter 16) for a discussion of political issues in social research.