3.2 Interests, (implicit) biases, and uncertainty

Much of the discussion about conflicts of interest in academia-industry collaboration focuses on researchers’ financial connections with industry. Financial interest, while important, is not the only way in which researchers’ own interests and biases influence their interpretation of data. As human beings, we are all “biased” in the sense that we view the world through particular perspectives shaped by our life history, education, and cultural traditions. Researchers are no exception. For example, studies of organizational culture find that teams in Asian cultures assign higher priority to avoiding confrontation than teams in the United States (Glinow, Shapiro, and Brett 2004). While this finding does not suggest that Asian and American researchers interpret the same data set differently, it is plausible to assume that research group meetings in Asian countries and in the U.S. have different dynamics, and such differences might affect the way a group collectively reviews its data collection and analysis (e.g., one team might be more skeptical and critical of the data handling than the other).

Sometimes researchers’ unexamined beliefs, or the assumptions they make unconsciously, can also affect the way they process data and information. This happens even when researchers do not deliberately accept certain prejudices or stereotypes. Psychologists and philosophers have examined “implicit bias,” a subtle cognitive process through which stereotypes and cultural images affect one’s understanding and judgment at an unconscious level. For example, a study of over two hundred physicians found that while the physicians reported no “explicit preference for white versus black patients,” a statistically significant proportion of them implicitly believed black Americans are “less cooperative,” and this belief was correlated with an increased likelihood of not treating black patients with thrombolysis (Green et al., 2007). In addition to human biases, boyd (2016) reminds us that data itself can be biased, incomplete, and misleading; as a result, analysis based on biased data sets can lead to distorted conclusions. Recently, Microsoft had to pull its chatbot Tay offline after it learned a plethora of racist and hateful language within 24 hours of its public debut. Biased data led to biased learning.

Although as humans we are all characterized by our own particularities and limitations, as researchers we have a duty not to let our interests and biases drive our interpretation of research data. Recognizing and disclosing our stances, motivations, and partiality is a first step toward managing their impact on data interpretation: such reflection helps us examine the possible ways in which our own biases shape the perspectives we take. In addition, frank disclosure of researchers’ interests and standpoints allows the reviewers and readers of research publications to evaluate the authors’ claims more comprehensively. Beyond self-reflection, proper training can also help mitigate the impact of implicit bias.

It is also important to keep in mind that uncertainty is an inherent part of research. Despite our best intentions and efforts, experiments may produce erroneous results, equipment has limited accuracy, and many scientific theories are merely our best approximations of reality at present. The principle of integrity requires that we not overstate the certainty of our results. Being honest about the uncertainty and limitations of our research is especially important when we report findings to non-academic audiences. In such cases, acknowledging the uncertainty of our findings not only reduces misinterpretation of the research but also protects the credibility of the research community.

 

(This case is included in Brian Schrag, ed., Research Ethics: Cases and Commentaries, Volume 5, Bloomington, Indiana: Association for Practical and Professional Ethics, 2001. The full case study is featured by the National Academy of Engineering’s Online Ethics Center. Use of this case is permitted by the Association for Practical and Professional Ethics.)

 

A pHish Tale

Tom is a postdoc participating in a government-funded project to study the pH levels in a series of lakes scattered throughout an area of 100 square miles. The study was commissioned because the number of fish in some of the lakes had been dropping, and the EPA wanted to know what was causing the fish to die. Data from Tom’s study indicate that a number of lakes have alarmingly low pH levels, although some have normal pH levels. High acidity (low pH) is known to be deadly to many fish species.
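For readers unfamiliar with the measure, a brief note on the scale the case turns on: pH is the negative base-10 logarithm of the hydrogen-ion concentration,

\[
\mathrm{pH} = -\log_{10}\left[\mathrm{H^{+}}\right]
\]

so a lower pH means a more acidic lake, and each one-unit drop in pH corresponds to a tenfold increase in hydrogen-ion concentration. This is why even seemingly modest declines in pH can be lethal to sensitive fish species.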

Because of the large area affected, Tom believes that the contamination must be traveling through the air. He is almost certain that the low pH levels are due to acid rain caused by emissions from power plants in the surrounding region. However, the data from his study are not sufficient to show that the power plant emissions are causing the lakes’ acidity. Another five-year project is planned to determine the causes of the acidity.

Unfortunately, some fish species that are sensitive to pH levels have died off in the lakes. If pH levels continue to fall, most fish will disappear, harming not only the ecosystem, but the local economies of some lakeside villages, where fishermen rely on the fish from the lakes for their livelihood. Tom is concerned that if something is not done about the pollution source immediately, the lakes may suffer permanent damage.

One of Tom’s long-time friends is a member of a local environmental group that wants the power plants to move. His friend suggests Tom meet with Susan, the leader of the group, but Tom is not sure whether it is appropriate to become involved in local politics, especially since data to determine the actual cause of the pollution have not yet been collected.

Tom and Richard, a senior research scientist on the study, are publishing their findings in a national journal; it is unlikely that the locals will see this publication. They have discussed the next phase of the research, and Tom knows that Richard also believes that the power plants are the most likely cause of the contamination. Tom decided to discuss his concerns about the fish with Richard and ask his advice on whether he should help the environmental group by speaking out against the power plants.

When Tom talked to Richard, Richard expressed concern about any involvement with the environmental groups. “Tom,” he said, “I’ve seen how many of those groups operate. They have no use for science unless it fits into their agenda. Many of the so-called leaders of those groups just want to get their name in the newspaper.”

“But Richard, I know some of these people, and they’re not like that,” Tom replied.

“I don’t know, Tom. We have some responsibility as scientists to be objective and stay neutral in such a debate. If we start to take sides, our work will be questioned, and we risk not being taken seriously. I’ve known a few scientists who have become activists, and if they hadn’t already established a strong reputation in their field, their reputations among scientists were often tainted by their perceived subjectivity. Sometimes, their ‘cause’ was even harmed and their activism backfired because their work was painted as biased. What happens if you speak out against the power plants and we find out that there is another cause for the acidity?”

Tom replied, “I see what you mean. I don’t want to be seen as biased. Still, I feel I have some responsibility to try to save the fish for the sake of the people who rely on them and for the ecosystems that support them. Do you really think that there might be another cause for the acidity?”

“No,” said Richard. “I think it’s pretty unlikely. Still, your reputation may be damaged whether you’re wrong or not.”

Tom thanked Richard for his advice, but he still felt that he had some responsibility to the fish and the fishermen.

 

Questions for Case Analysis

  1. How should Tom fulfill his ethical responsibilities to the local community and to the research community?
  2. Use the 12-step approach or the Design-Based Framework to analyze this case.

 

It might be more difficult to maintain our cognitive humility in the era of big data, when our ability to collect and analyze data seems boundless. Yet the very power of big data provides a strong reason for questioning and caution. Instead of following a dominant methodology, thoughtful researchers carefully evaluate their projects and choose the methods most appropriate for their research questions. Morozov (2014) reminds us that over-reliance on big data might bias our attention toward “data-rich” questions, ones that are more likely to be solved by data analysis, while more important questions that are less “data-rich” get ignored. For example, some smartphone apps use our physical data to predict the health effects of individual lifestyle choices, such as diet and exercise, but such apps pay no attention to broader contextual factors like environmental and social influences on individual health. As a result, developers of these apps arguably “steer” users’ attention toward individual choices and promote an individualist conception of health (Morozov, 2014). In this case, one might question whether the app developers have a duty to disclose this “steering effect.” Furthermore, although the development of smartphone apps is not subject to the Common Rule for human subjects research, the spirit of “respect for persons” might still apply here. That is, users have a right to know about the potential effects of such apps on their decisions and actions.