
Combating impressions of ideological bias in public-policy research

'Naive realists' and 'attitude attributors' among the general public assume researcher bias when they don't care for a study's findings

05 February 2009

Robert MacCoun says American social scientists should try to understand how their work is perceived in order to root out hidden assumptions and communicate better. (Peg Skorpinski photo)

When people learn about research findings that conflict with their own beliefs about politically controversial topics, they not only doubt the conclusions but question the researcher's objectivity.

"Findings that support our political beliefs are seen as objective facts about the world," says Robert MacCoun, professor of public policy, law, and psychology. "But study outcomes that conflict with our views are more likely to be seen as expressions of an ideological bias by the researcher." This "attitude-attribution effect" turns upside down the notion of social scientists as impartial arbiters of the truth.

MacCoun's assertions are based on research he conducted for an article being published this month in the journal Political Psychology. An online version is now available.

While other researchers have examined whether people see what they want to see in policy reports, MacCoun says his study breaks new ground by extending the research to the views of the general public (rather than students, as done in most surveys) and by examining people's assumptions about the politics of the researcher.

"Our findings raise concerns about how social science researchers are seen by the public," wrote MacCoun in the journal article. "Because researchers' ideological views are supposed to be irrelevant to their empirical results, even partial support for the attitude-attribution effect is impressive and troubling."


MacCoun's experiment was embedded in a telephone survey conducted in 2003 with 1,050 randomly selected California adults. Respondents were tested for several psychological biases, including naïve realism (the belief that others who hold different views are out of touch with reality), biased assimilation (heightened skepticism toward findings that contradict one's prior beliefs), and attitude attribution (crediting study findings to the researcher's motives, traits, and politics rather than to the facts).

Survey participants were told about a hypothetical study on one of four politicized topics. Two of the topics — gun control and medical marijuana — involved policy generally more favored by liberals and Democrats. Two other topics — school vouchers and the death penalty — included study outcomes more likely to be supported by conservatives and Republicans. Capital punishment and medical marijuana are part of California's legal status quo, while gun control and school-voucher proposals have met with less law-making success.

A fifth hypothetical study involving nutrition advertising was used to establish a relatively apolitical baseline.

Participants were asked how surprised they were by the study findings presented to them, how believable they found the conclusions, and about the political views and motives of the study's author. They also were queried about their own attitudes about the study topics and about their personal politics and affiliations.

Besides being more skeptical about findings that contradicted prior beliefs, survey participants — especially those with conservative beliefs — tended to attribute studies with liberal findings to liberal researchers. They were less likely to conclude that conservative findings were due to a researcher's conservatism.

More than half (56 percent) of the participants in MacCoun's study speculated about the imaginary researcher's politics and were almost twice as likely (21 percent) to assume the author to be liberal as they were to infer conservatism (12 percent). Social scientists are, in fact, more likely to report being liberal than conservative, but MacCoun notes that study participants invoked this fact selectively and that conservatives were most likely to cite the author's liberalism when they didn't like a finding.

Although a majority of participants found the study's purported research results completely or somewhat believable, and the biases they suspected were fairly modest, policy researchers writing up their findings need to be sensitive to potential suspicions and work harder to develop trust, says MacCoun. Researchers are "not philosopher kings," he points out, and so should accept that disagreeable research findings are often held to higher standards, particularly in a 24-hour-news-cycle world where the general public — and even politicians and policymakers — rely on quick headlines, snapshots, and commentators' "punch lines."

When scholars wink

Whether research is indeed biased is a valid question, MacCoun says, acknowledging that he has encountered the raised eyebrow and knowing winks even from colleagues within academia.

When he published research about decriminalizing marijuana and co-authored the book Drug War Heresies: Learning From Other Vices, Times and Places, he interpreted chuckles and knowing looks as signals that others thought he smoked pot. Later, when his investigations showed there would be negligible adverse impact from accepting openly gay members into the U.S. military, he was met with not-that-subtle comments and questions that failed to disguise assumptions about his sexuality.

And when he joined the Berkeley faculty in 1993 after working as a behavioral scientist and policy analyst at the non-profit, non-partisan RAND think tank, many of his new associates automatically assumed he was a conservative because RAND is active in national-security research.

"If we really want to inform citizens and affect public policy, American social scientists need to learn more about how conservatives view our research," MacCoun says, "in order to root out hidden assumptions and communicate our research more effectively."