Understanding Climate Risk

Science, policy and decision-making

Science in a democratic society


This post was precipitated by several stoushes held at Larvatus Prodeo over climate science but reflects more widely on the state of climate science and its public perception in the English-speaking democracies. It’s an issue I’ve been interested in for a number of years because of the attacks on climate science and the need to build better links between science and the risk management of global change. The post title is also the title of Philip Kitcher’s new book, released earlier this month. Kitcher is an English philosopher based at Columbia University; in 2006 he won the Prometheus Prize of the American Philosophical Association, and this book is the result. He has made his case really well.

The main points of this piece are that:

  • Society needs to draw from a body of public knowledge in order to be successful. Psychological and cognitive limitations lead to the sum of individual decisions producing suboptimal outcomes.
  • Attacks on public knowledge driven by self-interest and opaque values are being made under the cover of free speech and individual freedoms. The evidence used by these attacks is generally untrue, distorted or selective or fails basic tests for scientific proof.
  • Science is a values-driven enterprise. Those values need to be made explicit in what Kitcher refers to as well-ordered science.
  • Science is secular. Passing certain probative (proof) tests allows it to be shared as knowledge that has claims to objectivity.
  • Belief is personal and can also be shared but does not require the same tests (Belief also expresses a set of human needs not necessarily addressed by science).
  • Public knowledge in English-speaking democracies has become degraded. Science is vulnerable to vulgar democracy, where under the guise of free speech, any belief can masquerade as knowledge.
  • Science also needs to become better ordered, through measures that cover:
    • Education – for most students, teaching what science does and what its impacts are rather than how it works technically, by separating pedagogy into liberal education and technical specialisation. This works on the presumption that most people need to understand the role science plays in society, while fewer will become actual scientists.
    • Bringing people into the scientific workplace to familiarise them with knowledge goals and probative values and methods of certifying science.
    • Avoidance of universal punditry (experts speaking beyond their expertise) and overconfidence in findings in favour of communicating scientific evidence with the appropriate levels of confidence in theory and uncertainties in outcomes.
    • A process that steps through claims of consensus, consequences of those claims, ethical exploration of a potential policy framework and an exploration of how current actions can be balanced against future harm.

Science has been a hugely successful human project, yielding great benefits. The scale of that success also means that our woes are much larger in scale than they otherwise would be. Science reached its peak authority during 20th-century modernism, with scientists providing advice on matters affecting energy, agriculture, health and technology. It reached an excess in scientism, science practised as ideology, in which the logic of scientific discovery delivers progress with certainty. Scientism is still around but is no longer dominant, though some believe it to be. More relevant is the erosion of scientific authority. This has been well documented by Oreskes and Conway on climate change, Gould and others on evolution, and the Ehrlichs on ecology, to name a few. Postmodernism has also contributed to this erosion. Bruno Latour:

the danger would no longer be coming from an excessive confidence in ideological arguments posturing as matters of fact—as we have learned to combat so efficiently in the past—but from an excessive distrust of good matters of fact disguised as bad ideological biases! While we spent years trying to detect the real prejudices hidden behind the appearance of objective statements, do we now have to reveal the real objective and incontrovertible facts hidden behind the illusion of prejudices?

My answer to that question is yes.

Kitcher recognises that society has divisions in epistemic labour. That is, different specialised groups have different ways of organising knowledge (he distinguishes between epistemic and cognitive division – how those groups develop and certify knowledge, rather than what they know). We do not, however, cede power to specialised, non-elected groups, as happens under authoritarian rule. But we also recognise that even though in a democracy everyone has a right to free speech and to elect a representative, not all knowledge is equally correct. Somewhere between those extremes of authoritarian rule and anything goes lies a solution: We urgently need a theory of the place of Science in a democratic society—or if you like of the ways in which a system of public knowledge should be shaped to promote democratic ideals (Kitcher, p. 25).

Postmodernism recognised that science is a social project. While science has claims to objectivity, theories are arrived at via argument, testing and consensus. There is no single method; rather, there is a set of methods and alternative theories, and those theories are often incommensurable. Kitcher (p. 36): Reductive programs fail because they are insensitive to the—reasonable—value-judgements pervading scientific practice.

Therefore, science is not value free. Commitments to factual claims and to value judgements co-evolve (ibid). This means that science is embedded in an ethical framework. Kitcher proposes three levels:

  1. The broad scheme of values that society holds;
  2. The personal set of values that relates to an individual’s knowledge goals (a cognitive set of values);
  3. A probative set of values – which problems are most important and which rules best validate/invalidate scientific conclusions?

The latter set contains most of what we find familiar about science: peer review, publishing, plagiarism, data integrity, testing claims and the nature of research programs from individual to international scale. When scientists claim science to be value free, this is an argument of convenience, because the value system embedded in science is messy. Where there is broad agreement on probative values amongst scientists, their personal values become less relevant. For example, five scientists – a political liberal, a political conservative, a Christian, a Muslim and an atheist – may all agree on a given scientific conclusion because they largely agree on a common set of probative values. One may be in it for the money (though there are many better ways to cash in), another for the reputation, and yet another because she likes solving tough problems in beautiful ways. Thus, the probative set of values can be separated from the personal set of values in important respects.

The erosion of scientific authority has been in part a reaction to the over-confidence of modernism, science as progress and scientism. Science plays a role in the risk society both as a cause of risk and in recognising and managing risk. In the post-GFC environment, uncertainty surrounding the future of the economy, climate, the environment and technology has left democracies very unsettled about what lies ahead. Science contributes to that apprehension as both a cause and a potential solution. What then, is its role?

Kitcher proposes four levels of engagement that can be developed:

  • Level 1 requires well-ordered science to make claims of consensus around specific issues. This involves making open the process of investigation and of the value judgements used in exploring the science.
  • Level 2 involves a detailed and precise account of current courses of action.
  • Level 3 develops an ethical framework within which policies for apportioning measures would be addressed.
  • Level 4 discusses under terms of mutual engagement how we are prepared to balance current actions against the risk of future suffering.

He also nominates four negatives that need to be overcome:

  • Present competition amongst scientists and in fields of science is constrained by historical contingencies that no longer reflect human needs.
  • The flaws of vulgar democracy are inherited by existing systems of public input.
  • Privatisation of scientific research will probably make matters worse. Markets have insufficient vision to fully address human and environmental needs. This can be pursued through a wider investigation of economic welfare in which access to goods and services is seen as a means rather than an end.
  • Current scientific research neglects those who are not affluent. The value base behind science includes equity, fairness and access which all need to be addressed.

The declaration of values in the latter two points of each list, positive and negative, will immediately attract opposition, especially from ideological positions that see no values expressed in society beyond decisions made by individuals in the marketplace. It would be useful to have those specific claims declared and tested in the light of what Kitcher refers to as the ethical project: the scheme of values that humans have developed and applied over time.

Certification of claims can guard against fraud and misrepresentation, especially collusive fraud. Kitcher’s solution is to retain data and records, and to establish monitoring. This is accepted practice but not always enforced. However, the cost is borne by the scientific community, so the measures required to undertake that monitoring should be decided by the same community in wide-ranging discussions, including young scientists. It may involve spot checks and producing records on demand. Outsiders need only become involved if there is a credible threat of distortion of policy, and then the claims can be tested through extensive replication of findings.

He suggests this is optimistic and practically difficult. That is true – it also requires making clear what is inappropriate in ensuring probity. Fishing expeditions, charges from political figures, double jeopardy (reanimating the zombie of failed past attempts) and investigations of casual correspondence such as email without grounded evidence are all inappropriate. A person’s individual cognitive values are secondary to how they address probative values, yet this is often not how misrepresentation in science is viewed. A person doesn’t have to be likeable to produce good science. Although universities have codes of conduct, they are not usually enforced for misrepresentation of scientific findings, only for plagiarism and fraud. For example, the misrepresentations of scientific findings in Ian Plimer’s book Heaven and Earth, as documented by Ian Enting, have not been addressed by any institution. Nor do these rules necessarily apply to industry R&D and private individuals. To broaden this, codes of conduct and ethics could be established via Academies and through the International Council of Scientific Unions and/or the InterAcademy Council. However, these organisations are very conservative, perhaps prompting Kitcher’s advocacy of involving younger scientists in such enterprises.

Science needs to remain a broad church, sustaining dissent. The strength of science is in testing a wide range of hypotheses and methods, with only a few succeeding. Contrarian approaches to consensus need to be encouraged in order to test assumptions that may be lazily held within that consensus. Failure within well-ordered science is therefore part of the scientific effort.

The failure of science as a body of knowledge includes faulty certification leading to misleading claims, and over-confident or misconceived claims. For a claim to become widely accepted, a community of enquirers needs to determine that it is true enough and important enough (Kitcher, p. 148). A further step is to specify what serves as adequate evidence, which needs to be transparent and understandable. For scientific consensus that has wide-ranging social implications, a range of scenarios that encompass best-case and worst-case outcomes can be applied. Through discussion amongst the wider community of scholars, the ideal is a well-ordered system that is transparent and can be investigated by any individual, and against which they can reflect their own values.

My view is that this can be undertaken using something like the methodology of scientific research programmes proposed by Lakatos. It requires a strong epistemic process that develops levels of confidence in a progressing research program. Refutation and naive falsification cannot overturn theory, but methodological falsification that better explains core theory may do so, even if it means that related corollary theories are worse off.

This actually sets up very strong conditions for science, because it means that core theory held with high confidence is very unlikely to be overturned. Thus the barrier for the scientific pessimist is very high, and for the pessimistic inductionist, impossible (Roush 2009). Pessimistic arguments by induction include “the science has been wrong before”, “the mainstream persecuted Galileo and he was right”, “these scientists co-author, therefore they are colluding” and so on. None of these arguments survives logical or evidentiary scrutiny.

The defence of core theory is a scientific solution that works best for areas where science has claims to objectivity (e.g., level 1 and 2 science) but does not fare so well when working up through the levels of engagement to courses of action and ethical engagement. However, disagreements at those levels should not be held to disallow Level 1 findings.

Kitcher suggests two main avenues for the public rejection of science are denial in the face of more strongly held belief systems (e.g., religion) and rejection of the science because of its consequences. That is, denial because of competing beliefs and denial of risk.

Just as there is separation of church and state in democracies, public knowledge needs to be secular and set aside from belief-driven systems. Well-ordered science will articulate the social, value-laden aims of fostering better communication and providing a clearer view of how judgements are made. For the latter, this would emphasise the important distinction between judging the data are good enough, given the envisaged consequences for human welfare, and accepting a conclusion because it will make you money or advance your favourite political cause (Kitcher, p. 164). It also needs to be shown how political values and self-interest can play a role in distorting messages. The accommodation of religion and belief means having more widespread discussions about human needs, and how they may be served in both religious and secular ways. In addition to the social contract of the common good, democratic society needs to have a shared notion of public knowledge. At present, reason is not taught widely; schools and universities teach facts but not how they are obtained, or how we separate facts from belief, while needing both. How emotional and deductive thinking influence our reasoning is not usually discussed or taught, yet it is the centrepiece of understanding social construction and developing reflexive thinking.

The democratic ideal for science is where science is conducted within its epistemic framework but is shared democratically as public knowledge. Kitcher’s levels 2 to 4, which are essentially the various levels of a risk assessment framework, are explored as public knowledge. Research may be refereed, but the ideal of transparency in values, and in how knowledge is certified, remains in the public domain. One of the key junctures surrounds the question of urgency. His suggestion is that this be explored using scenarios. For example:

  • Application postponed – immediate application of partial knowledge may not succeed. Further enquiry required. This is consistent with the First and Second Assessment Reports of the IPCC 1990–1995.
  • Consensus on urgency – there is insufficient public knowledge on how to act but the knowledge exists that not acting would have severe consequences.
  • Debate about urgency – where there is a range of opinions about the state of urgency, but general agreement that if urgency is confirmed, then action should follow.
  • Lack of access to pure knowledge – well-ordered science addresses enquiry but the answer cannot be understood by the wide majority of people.
  • Lack of access to a problem ideally postponed – more research would benefit a response but this is not understood by a majority of people.

To Kitcher’s list I would add Rejection of pure knowledge, which constitutes the rejection of epistemic specialisation if it comes up with the “wrong” conclusions.

My work on wicked risks suggests that all of these can co-exist, and Kitcher alludes to the same. In a degraded environment, where truth and falsehood are not decided on matters of evidence, risks become particularly wicked. The ability to market factoids using human psychology has become far more sophisticated than has the effort to communicate the knowledge project that science is involved in. Kitcher suggests inviting key citizens behind the scientific curtain so that they better understand the scientific endeavour. He also advocates a strongly revised curriculum in education. There could be less emphasis on the technical aspects of science for those not so inclined but a fuller emphasis on how science is used, its high points and how it can be translated into policy. How society applies knowledge, greater development of skill in reasoning and insights into how human psychology can be manipulated are all valuable areas of learning.

With respect to dissent, there is no single mainstream trusted source of public knowledge in modern democracies. Individual scientists and institutions vie for media space as news, and press releases often oversell new discoveries and their importance, leading to claims of scientific overconfidence. The media and readers are interested in controversy and conflict, rather than nuance and explanation. The development of science as public knowledge needs to be carefully considered, as none of these avenues are ideal. Some trends, such as the reduced profitability of the print media, are reducing the opportunity for in-depth discussion and reporting of important issues.

Diversity is important within science – a range of approaches may need to be taken to some scientific problems. Improbable but plausible ideas need to be tested. Dissent is useful too, if the values in play are transparent and arguments are backed with evidence. However, when the preferences expressed by dissenting citizens diverge from their real interests, that gap is generated through deception. For example, most people still believe that they should leave the world better off than they found it, and would wish not to leave future generations at risk of catastrophe. Yet Australians are being told, in a campaign run by one side of politics, business and industry and a large section of the mainstream media, that a “big new tax”, referring to a price on carbon, will cost them significantly and do nothing for the future.

Latour cites the following editorial from the New York Times:

Most scientists believe that [global] warming is caused largely by manmade pollutants that require strict regulation. Mr. Luntz [a Republican strategist] seems to acknowledge as much when he says that “the scientific debate is closing against us.” His advice, however, is to emphasize that the evidence is not complete.

“Should the public come to believe that the scientific issues are settled,” he writes, “their views about global warming will change accordingly. Therefore, you need to continue to make the lack of scientific certainty a primary issue.” (NYT 15 March, 2003)

A Google Scholar search of the phrase “the science is settled” and the term “climate” suggests there have been 114 unique documents using this particular phrase since 1998: none before then, fewer than 10 per year before 2006 and an average of 22 per year over the past three years. 104 of these are from AGW opponents, most of them noted denialists, and the term has really only entered more mainstream publications over the past three years. This, then, seems similar to the many other claims documented by Oreskes and Conway, where a false claim is made in order to shoot it down. Certainly the IPCC has never said it; no established scientist would make that claim about climate science itself, and there is no evidence of any scientist saying so. More recently (since 2007) there have been several statements by scientists responding to that claim. Schneider et al. (2010) ask if the science is “settled” enough for policy, framing the question as one of risk and referring specifically to the scientific uncertainty and the values in making such an assessment, completely consistent with the framing of well-ordered science in Kitcher’s book.

This example shows that various groups who wish to preserve their own interests above the interests of broader society and of future generations have founded an alternative epistemic community, the agnotogenic community: a group dedicated to the spread of ignorance and confusion. Agnotogeny is the generation of ignorance and confusion, from agnotology, the study of ignorance. This community has constructed an alternative institutional structure of think tanks, journals and online blogs with a view to promoting an alternative scholarship to conventional science. It promulgates the story of scientific uncertainty, influencing “balance” as expressed by the general media in the guise of free and democratic speech.

Science after modernism has moved from the autocratic delivery of scientific knowledge to a democratic communication of its findings and their potential consequences. This transition has adopted many of the critiques of postmodernism (encapsulated in Latour’s quote above) but is still incomplete. In particular, science’s underlying values need to be made more transparent. However, in moving towards a more democratic mode of discourse, science becomes vulnerable to the discourse of power disguised as individual freedom and the right to free speech. Any perceived challenge to this power that arises out of scientific consensus is interpreted by vulgar democracy as an attack on these values. Vulgar democracy is characterised by a free-for-all of competing values, where knowledge is interchangeable with belief. Evidence supporting the scientific consensus is continually challenged in a compliant, even complicit, media, and these challenges are continually reinforced by the anarchy of the internet. Science needs to defend its role in contributing to public knowledge, but it cannot do so by the traditional recourse to value-free objectivism. This requires a vigorous defence of how scientific and democratic values align, and the development of a more well-ordered process for undertaking science.

Written by Roger Jones

September 28, 2011 at 11:55 pm

7 Responses


  1. Nice post Roger.

    I particularly like the use of Lakatos’ scientific framework – this has always seemed to me to be the closest philosophical description of how science really works. It frustrates me that educating people about the philosophy of science (in high school etc.) seems to have gotten stuck at a naive Popperian view as the gold standard, while simultaneously seeing a kind of Kuhnian idea of scientific paradigms as (a) the way science gets done in practice but (b) a decided second-best to Popper.

    Unfortunately this gives oxygen to many straw man arguments about uncertainty (e.g. ‘the science is settled’), since it allows people to believe that they intuitively understand science and how it works. Given that few scientists (let alone people in the public domain) will critically examine what values they actually employ when doing science, I agree that advertisement and exposition of these values via education seems to be a number one priority.

    That said, you identify some problems with how this is done at the moment:

    The media and readers are interested in controversy and conflict, rather than nuance and explanation. The development of science as public knowledge needs to be carefully considered, as none of these avenues are ideal. Some trends, such as the reduced profitability of the print media, are reducing the opportunity for in-depth discussion and reporting of important issues.

    I am particularly interested how you might go about effectively engaging with the general public. Do you think that scientists (and their minders, e.g. university PR departments) need to be more cautious in how they engage with the traditional media sources, or do you think there needs to be a complete rethink in how science is reported? Is it a problem with scientists, or a problem with the media?

    Perhaps Kitcher’s last point, highlighting the need for “A process that steps through claims of consensus, consequences of those claims [etc]…” is the way forward, but how do you see such a process being operated, without falling victim to the discourse of power you highlight in your piece?

    Jess

    September 29, 2011 at 1:06 pm

  2. Thanks Jess,

    I think there are a number of steps:
    1. Convince the public that public knowledge is an asset that society needs. It can be advanced or degraded and is currently at risk because of rapid changes in communication and the reduced quality of public discussion. (As an aside, when looking at knowledge as social capital, it’s hardly ever listed – education is, but public knowledge seems to be taken for granted. It has also been commodified to a degree, so is often not seen as a public good.)
    2. Develop a broad set of probative values that set the ethical framework for science and champion it at a higher level than university codes of ethics – at the international level through ICSU or IAC.
    3. Science education as a liberal art, in that it teaches how science contributes to knowledge, its basic epistemology, its benefits and fundamental reasoning skills that at the very least teaches inductive and deductive skills. Also some Gruen science – how the mind takes short-cuts and what that means for one’s perceptions.
    4. Work harder to get knowledge from unis, labs and private researchers into the public sphere, through outreach and embedding people in research environments.
    5. Build a defence for public knowledge in the broader media. Work out how the principles in point 2 might be translated for the press within a self-administered framework involving a set of probative standards (scientific values setting out proofs that separate science from pseudoscience, and real uncertainties from disproven assertions) devised by the scientific community but administered through a media council mechanism. Membership would be mandatory for the dead-tree press, the licensed broadcast press and their online presences, with an opt-in standard for independent online sources. A committee system of scientist and journalist representatives would sift through submissions from the public and treat only those with a substantive case, either case by case and/or via an annual review, undertaking a tracking function in how science is being developed as public knowledge via communication through the media.
    6. Continued work on developing a well-ordered science that integrates science and values as advocated by Kitcher. When I looked into Lakatos’ methods, there had not been many advances since. I then came across Kitcher’s and Roush’s work, to name two who I think have added a lot in the last decade. Postmodernism after Kuhn opened the door to values in science, but I think it has gone too far, although the most substantive warnings about scientism, overconfidence and ideology posturing as fact still exist. Science is no longer the worst example of this, however.
    7. Consider whether Kitcher’s Level 3 and 4 tasks should be carried out in public within the community, rather than in expert communities; cf. the IPCC, where expert assessments are handed back to the community and debated in the media. This suggests that when it comes to the development of ethical frameworks and the balance of current with future actions, discussions are held out in the open amongst peers. In my ideal world, society would decide this, then task politicians to enable it. That ain’t gonna happen, but it’s nice to think about.

    Roger Jones

    September 29, 2011 at 3:22 pm

    • Roger: I like your point number 5 – it would be nice to see both a punitive council for bad reporting of science, but also an acknowledgement when science is reported well.

      Regarding Lakatos vs Popper etc – Alan Musgrave (prof. em. at Otago Uni) might be someone to look up. He was co-editor with Lakatos in “Criticism and the Growth of Knowledge”, which is a fantastic collection of essays and well worth reading.

      Jess

      October 5, 2011 at 12:19 pm

      • Jess: The Lakatos-Musgrave volume is the one I draw from most because of the rich interplay between the debaters. I think Masterman’s contribution is especially important because she pointed out stuff in Kuhn’s use of paradigms that he hadn’t recognised. Feyerabend’s objection to brutish science is mainly aesthetic rather than philosophical! Because of complex systems, which weren’t that much in evidence at the time, his fears cannot be realised – there’s plenty of room for creativity for a long time to come, Wolfram notwithstanding.

        For some time I was unable to find a subsequent work to the 1970 volume “Criticism and the Growth of Knowledge” that moved the agenda forward, but I now think that Kitcher’s work does that. My thinking is that developing some formalism from Lakatos and maybe Roush and others, and combining that with Kitcher’s science-values framework is the way to go. It fits into my work on complex risks really well. I have a table of various degrees of falsification-validation that is as yet unpublished and am planning a paper on climate science that uses it (have long drafts exploring these issues that I’ve mined for other works, but not that yet).

        Roger Jones

        October 5, 2011 at 11:25 pm

      • Well I’d be really interested to read your paper when it’s ready to see the light of day!

        I agree that the agenda hasn’t really moved forward with regards to concrete applications of philosophy of science to scientific philosophy. Most of the discourse moved sideways into fairly abstract metaphysical considerations (Feyerabend being the prime example I think) without trying to tie things back to real science.

        Thanks for the links to Kitcher’s work though. I shall have to have more of a look once my thesis is finished!

        Jess

        October 6, 2011 at 8:13 am

  3. Brilliant post Roger
    Followed you here from LP link

    Science (or at least the technological by-products of it) now infuses all our lives enormously, and it has also provided amazing understanding and insight into how and why we have come to exist and survive. Though so much remains to be investigated.

    I’m a bit stunned at how ignorant many are of the literally centuries of conjecture and experiment that lies behind it all, from ipads to foodstuffs. Without even mentioning the resources required to manifest it all.

    In a way I think discussing the science behind very everyday things that almost everyone has some contact with is a way to go. I actually think it might make people realise how on many levels we understand only a fraction of the natural world on a scientific basis. If they started thinking and asking a few questions.

    Though for natural systems, as I think you allude, we are much more likely to know enough to know when things are looking really bad than ever providing some sort of precise long term ‘answer’. Natural systems and living things, including us, are for ever dynamic.

    Quoll

    September 29, 2011 at 7:26 pm

    • Quoll,
      sorry for not approving your post until today – didn’t show in the queue earlier for some reason and thanks for the comment

      Roger Jones

      October 5, 2011 at 7:18 am

