03/04/2012
The results in a new sociology article have been making waves in the blogosphere. Gordon Gauchat found that conservatives in the US have become less trusting of science. The study tested a thesis from Chris Mooney. Some quotes:
Using data from the 1974 to 2010 General Social Survey, I examine group differences in trust in science and group-specific change in these attitudes over time. Results show that group differences in trust in science are largely stable over the period, except for respondents identifying as conservative. Conservatives began the period with the highest trust in science, relative to liberals and moderates, and ended the period with the lowest.
Mooney (2005) (from the blurb):
On a broad array of issues – stem cell research, climate change, missile defense, abstinence education, product safety, environmental regulation, and many others – the Bush administration’s positions fly in the face of overwhelming scientific consensus…. This is not unique to the Bush administration, but it is largely a Republican phenomenon, born of a conservative dislike of environmental, health, and safety regulation, and at the extremes, of evolution and legalized abortion.
I have seen something similar in my research, something broader than Gauchat’s findings. I started from the position that Slovic laid out, I think in the intro or first chapter of The Perception of Risk (2000). He called it the ‘white male’ bias. In research on risk, he found that there were ethnic and gender differences, but they interacted. He found that white males had different risk assessments than everybody else.
In my work on technology, I found somewhat similar results, but they were subtler than that. Middle- to high-income males, particularly white males, seemed to have more positive views of new technology than everyone else. I played around with cross-tabs for a while — it was a minor issue in my research — and never quite resolved where the splits were. Other people’s work suggested the same effect.
Because facts aren’t facts. We interpret them, we manipulate them, we try to make sense of them — and, chiefly, we insert them into narratives as best we can. With each piece of information, we ask, how does this particular fact fit with my identity and worldview? As a result, belief or trust in science is a badge we wear. We signal to the world who we are by the stance we take on science issues.
Where is this headed? The ‘white male’ bias is inherently trusting of official science. The opposite position, which tends to be correlated with females and non-whites, is sceptical. Now, US conservatives are becoming less trusting of science. It looks like support for science — the official, lab-coated, expert-centred variety — is eroding. The ‘white male’ bias was always a minority position (although a privileged one). But now it is losing a key bloc — the conservative white male and his allies.
I can see potential good coming from this splintering. Slovic and others have shown that the experts aren’t necessarily right. They do make mistakes, they engage in groupthink, they are subject to cognitive biases. Broadening the discussion about all kinds of science issues could make it more inclusive. That’s the optimistic position. The pessimistic position is that people with the strongest views and loudest voices (and deepest pockets) will determine things for the rest of us, regardless.