Sustainable science policy - in whom shall we trust?

© Bengt Arne Fagerström 2017

Cover: Arne Fagerström and Gary Cunningham

Published by: Atremi AB, Axstad Södergård, SE-595 94 Mjölby, Sweden
E-mail: info@atremi.se, www.atremi.se

Printed by AS Printon Trükikoda, Tallinn, Estonia 2014
ISBN 978-91-7527-174-3


A Good Life for All: essays on sustainability

Table of Contents

Sustainability has many faces – editors’ overview

Chapter 1. Sustainability in the University of Gävle for a Sustainable Future

Chapter 2. University sustainability identity – the role of identity, image and reputation

Chapter 3. Sustainable science policy – in whom shall we trust?

Chapter 4. A social work perspective on health, wellbeing and sustainability

Chapter 5. Eco-social work for sustainable development

Chapter 6. Building materials are important for sustainable development

Chapter 7. Online corporate social responsibility reporting

Chapter 8. Sustainable enterprise theory

Chapter 9. Sustainability of world heritage

Chapter 10. Accounting for sustainability indicators


Chapter Three

________________________________________________________________

Sustainable science policy – in whom shall we trust?

_________________________________________________________________

Mikael Björling1

All speak in favor of honest, independent science,

even those who seek to subvert it. (Greenberg 2008 cited in Widmalm 2013)

Abstract

The role of science as a provider of well-tested and trustworthy knowledge in a sustainable society is discussed with respect to other social institutions: government and state bureaucracy, the market, the media and the public. In particular, societal pressures that threaten scientific endeavour are problematised with a slight bias towards examples from sustainability science.

Global challenges of today transcend social institutions and therefore require novel modes of trans-institutional cooperation. Such cooperation may produce unwanted clashes of institutional norms that imperil scientific objectivity, and mutually accepted norms therefore need to be developed. The science policy of a democratic, sustainable society should strive for clear divisions among social institutions, while encouraging suitable modes of cooperation to address global challenges to the sustainability of human life.

Introduction

There is an inherent conflict in the notion that science provides guidance for fundamentally political decisions. “Thank you Professor, we are sort of grateful that you alerted us to the risks, but now tell us what we should do to avoid them” (paraphrased from Ziman 2007, p. 256). The discord is accentuated when politicians and others frame scientific questions, or when public servants claim that their compilation of contemporary knowledge represents the scientific viewpoint. The conflict raises questions about the scientific endeavour itself: how science transforms under societal pressures; how science is communicated; and the vulnerability of science to stakeholder interests and outright disinformation. Ultimately, trust in science is at stake.

The concept of sustainability begs the question: “Sustainability of what and for whom?” The contemporary plethora of answers to that question has become a liability for discussions on sustainability because the meaning of the word is muddled. The vision of the University of Gävle suggests that we aim for a “sustainable human living environment” (University of Gävle 2016). The university believes that science plays an important role in developing and maintaining a sustainable society for human life. Whatever that role may be, it is of paramount importance that the results of science are trustworthy.

The following discussion circles around how science policy is formulated in a sustainable society. It starts by problematising the idea of public trust in social institutions, particularly science institutions. Then measurements of public trust in science and other social institutions, and reasons for loss of public trust, are discussed. After that, the nature of science is discussed from a normative perspective in which the aim of the norms to increase trustworthiness is emphasised. An overview, divided into the social institutions of government and state bureaucracy, the market, and the public, is then presented from the perspective of the societal roles of science and the corresponding societal pressures on it. Illustrative examples are given with a bias towards issues that arise in sustainability science. Media as a source of public access to scientific results, and how science can help in the future, are then discussed.

1 Mikael Björling is associate professor of chemistry at the University of Gävle.

The conclusion suggests that a sustainable science policy ensures that science continues to produce new and well-tested knowledge that people have reasons to trust, but also that this knowledge is usefully applied. This goal entails that a sustainable society should strive for clearer divisions among science, state bureaucracy, politics and the market to avoid counterproductive clashes of norms. On the other hand, science policy should also encourage suitable forms of cooperation among these social institutions to efficiently address global challenges that imperil human life.

Public trust in social institutions

Public trust in the institutions of society greases the machinery of a democracy, but the public should also show a healthy skepticism towards them (Nahapiet and Ghoshal 1998; Ziman 2007). If people believe that taxes are properly spent, they are presumably more willing to contribute. If a risk assessment is credible and understood, a greater proportion of the public follows regulations imposed to avoid that risk. While trust is a complex concept, it is sufficient to note that the feeling of trust implies some level of interdependence between people and an institution, and a willingness to take the risk of relying on it. This risk is typically reduced by the emergence of norms and laws (Rousseau et al. 1998; Kramer 1999; Ziman 2007).

Science has coevolved with democracy and consequently shares many of its values, but has diverged into a social institution with its own set of rules and norms aiming to produce well-tested and reliable knowledge. The word science (“Wissenschaft” in German) is used here in its more inclusive meaning, in which natural science is devoted to investigations of the natural world and human science studies human society. The belief in the existence of universal scientific ‘truths’, particularly in natural science, prompted modernists to put science as the norm for society (Ziman 2007). Knowledge itself, however, is amoral and has had unforeseen outcomes. The resulting ‘technocratic’ societies turned out to be susceptible to non-democratic forces and often ended in totalitarian nightmares, to the dismay of the public (Ziman 2007). The systematic extermination of supposedly inferior breeds of humans in the name of genetic hygiene, and the equally unethical medical experiments in the concentration camps, are particularly gruesome examples that gave food for thought.

The post-modern reaction was to advocate a pluralistic, multimodal society in which several public viewpoints can coexist and to challenge the idea of universal scientific ‘truths’ (Ziman 2007). Post-modern critique has its roots in disputing the reliability of scientific ‘truths’ in the self-referential human science. How is production of knowledge affected when human beings study other human beings in a social context of interaction? One obvious problem is that the outcomes of investigations and experiments depend on the fabric of the social context itself.

For example, studying reasons for public trust in social institutions in a western society or in an isolated tribe of Papua New Guinea would yield very different results. The post-modern expression for this dependence is that knowledge is “socially constructed” (Berger and Luckmann 1967; Searle 1995; Ziman 2007). It became increasingly evident that all human activities, including science, contain an element of social construction. How natural scientists formulate their scientific ‘truths’ is a ‘social construct’ and can therefore be challenged from this viewpoint (Ziman 2007). A healthy skepticism towards social institutions, including science, is therefore sanctioned in post-modern society. On the other hand, to be equally skeptical of the natural world appears farfetched (Herron 2008; Soper 1995). Social construction is intimately connected to human activity. There is ample evidence that the natural world precedes human beings and will remain even if the human race goes extinct.

Measuring public trust

Public trust is an indicator of the performance of democratic societies and is often monitored annually. In Sweden, academic scientists top the list, with 84 percent of those polled expressing moderately high or very high confidence in their results. The corresponding figures for scientists in research institutes and in industry are 81 percent and 53 percent, respectively. Health is the single issue that engages the public most (47 percent), trailed by other issues such as energy (19 percent), climate (17 percent) and human science including culture (17 percent) (Vetenskap and Allmänhet 2016). Another study demonstrates that confidence is higher in natural science (approximately 66 percent) than in human science (approximately 41 percent), but during the last decade there has been a marginal decrease in trust in science, particularly natural science (Bergström and Oscarsson 2014). Public confidence in universities has been stable, with small oscillations in the range of 50 to 60 percent (Bergström and Oscarsson 2014). These figures can be compared with confidence levels in Sweden for national and local government, 53 percent and 57 percent respectively (Fitzgerald and Wolak 2016). In the USA, confidence in scientists is much lower, approximately 40 percent (National Science Board 2016; Gauchat 2011). This is not much higher than trust in the U.S. federal government, 20 to 30 percent, and lower than trust in the military (Chanley, Rudolph and Rahn 2000; National Science Board 2016).

Proximity, e.g. understanding the issues and public engagement, is among the factors commonly cited as positively correlated with trust. The level of education shows a strong positive correlation with confidence in science, and approximately 80 percent of the population in Sweden believes science and technology have improved conditions of life (Vetenskap and Allmänhet 2016). Trust, however, is very complex and these factors are only part of the multimodal story (Kramer 1999; Bergström and Oscarsson 2014; Fitzgerald and Wolak 2016).

Trust can also be seen as a commodity. Agents seek association with highly trusted institutions, e.g. science, because such association implicitly increases trust in them. A tell-tale example is that confidence is higher in science reporters, 61 percent, than in news reporters, 39 percent (Vetenskap and Allmänhet 2016).

Scandals decrease public trust

Trust is painstakingly earned by building social capital, but it is brittle and easily shattered. Media coverage of misbehaving scientists decreases public trust in science, at least temporarily. Media reports generally focus on issues that engage the public, e.g. health; other issues get less attention (Andersson 2014). Breaking explicit or implicit ethical guidelines of ‘scientific conduct’ is normally handled by bureaucratic institutions within, or close to, the science community.

Sanctions vary from suspended employment and shattered scientific careers to ordered supervision or warnings. Fabricating research data rarely enters the public judicial system, but there are some notable exceptions. In 2006, a scientist was sentenced to prison for two years and ordered to pay back grant money for lying in a grant application (Dalton 2006; Interlandi 2006). In 2015, another scientist was sentenced to nearly five years in prison and ordered to pay back US$7 million of grant money on the same charge (Retraction Watch 2016).


Exposing the public to risks is a serious ethical breach, both in science and in other democratic institutions, that is likely to engage people and lead to trust-damaging media coverage. A recent example is the water crisis in Flint, Michigan, USA, in which municipal and state institutions failed to appreciate the risks involved when the community water supply was changed. As a result, unsuspecting citizens were exposed to unhealthy levels of lead (Scully 2016). Public trust also decreases when risks are seemingly exaggerated, e.g. Mad Cow Disease (Ziman 2007).

Whistleblowers occasionally give access to otherwise secret information on the actions of individuals in social institutions. The Panama Papers, a huge leak, have already revealed shady financial dealings of officials and criminals that decrease public trust in social institutions (ICIJ 2016). The amount of damage to public trust caused by norm breaking is intimately connected to the public verdict on the perceived reaction of the concerned institution in response to the breach.

Norms of science

Scientists who adhere to the norms of science are implicitly more trustworthy. The norms of science are mostly tacit and have developed into a code of scientific conduct with the aim of providing well-tested and reliable new knowledge. Starting with Robert Merton, several researchers have attempted to capture the normative structure of science (Merton 1942; Anderson et al. 2010). Scientific norms, however, continuously evolve in response to societal forces, e.g. post-modern criticism, and any list of norms cannot be considered exhaustive (Anderson et al. 2010). Nevertheless, listing norms provides a starting point for further discussion and studies (Macfarlane and Cheng 2008). Norms do not necessarily reflect the actual state of affairs; they are more like desirable goals. The importance scientists attribute to a particular norm can be assessed by their reactions to breaches of that norm (Anderson et al. 2010).

Science is pluralistic

Science has adapted well to the post-modern idea of a pluralistic society because pluralism captures the very soul of scientific activity. In the words of Max Planck (1936): “New scientific ideas never spring from a communal body, however organized, but rather from the head of an individually inspired researcher…”. All scientists are expected to present their own accounts of the subject, their own interpretations, and their own proposed solutions (Ziman 2007). To give room for new ideas, intellectual pursuits should ideally be uninhibited. However, the amorality of this quest for knowledge requires ethical guidelines for it to be socially acceptable. In pluralistic and democratic societies, ideas are not suppressed and are permitted to coexist. Science ideally takes all viewpoints into account and agglomerates them into well-tested knowledge in processes of organized skepticism (Ziman 2007). The battle of ideas is performed with arguments and not by force. Science is not adversarial in the sense that a battle results in a decisive outcome, YES or NO, but scientific ideas must be supported by empirical evidence. Debates may be heated, but consensus is not enforced. Bystanders draw their own conclusions, and opponents typically go back to the drawing board to refine their arguments in preparation for the next debate.

A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather its opponents eventually die, and a new generation grows up that is familiar with it (Planck 1950).

Indeed, a research field where there is consensus is the one more likely to be moribund, or in the grip of an unhealthy intellectual fashion! (Ziman, 2007 p. 328)

Transparency and the open availability of new ideas to enable scrutiny are crucial to the reliability of the agglomeration process. Ideas that never reach the community due to some type of suppression could be the most important ones that would survive the steel-bath of time and lead to new paradigms. Freedom of thought is therefore considered one of the most important norms of science (Hasselberg 2007). Autonomy is a word that is often used for this norm. The counter-norm, heteronomy, is restraint of thought by external pressures. Tell-tale signs of heteronomy are when external pressure: i) forces scientists to change political opinions; ii) imposes issues, methods and theories; iii) biases research to avoid politically sensitive fields; iv) imposes consensus and rejects conflicting ideas (Blomquist 1992 cited by Hasselberg 2007).

In good universities there is a place for ‘hopeful heretics, persistent provocateurs, dreamy dissidents and stubborn skeptics’. They harbor the seeds of new paradigms, albeit with a low probability. Diligent conformists only produce new knowledge within the contemporary paradigm (Ziman 2007 p. 328).

Science is uncertain

The requirement of empirical verification introduces uncertainty in two ways: one arises from the natural variation of data and is easily handled by good statistical analysis. The other is fundamentally coupled to induction. The latter implies that we can only be 100 percent certain of the outcomes of past experiments, but the outcomes of future experiments are more or less unknown. By structuring the knowledge of past observations into general models, scientists can make fairly good predictions of future outcomes in a particular case. However, no certainty can be obtained until a prediction is tested empirically. General models also involve approximations of reality and thus have a limited range of validity. Predictions are contingent on whether the model is applicable or not. Furthermore, how general models are formulated is not incontestable because they are ‘social constructs’ within the paradigm.

Well-tested knowledge comes in a hierarchy of credibility. Credence increases with the amount of scrutiny. When scientific knowledge is assessed, it is important to be critical of the source and distinguish among: (1) An established scientific ‘truth’; (2) the near consensus of a research community; (3) a finding published in a peer-reviewed journal; (4) the considered opinion of a recognised expert; and (5) the say-so of just any professional ‘scientist’ (Ziman 2007 p. 298).

Conflict of interest

Scientists are expected to give true and honest accounts of their research observations. Anything else is considered a nuisance, leading to wasted resources in the community. Even if false and biased information is eventually weeded out, science can be temporarily misled due to its pluralistic nature. The time to correct errors can be relatively long because someone must discover them.

Many of the norms and checklists in science aim to reduce the effect of flawed observation skills. Scientists are human beings and biased by default. It is not a question of whether to be biased or not; it is a question of openly declaring conflicting interests that may lead to biased accounts. This is an important parameter in source criticism. Industrial researchers may not disclose data that harm their employer; the public is clearly aware of this and has lower trust in them than in academic researchers. If expert witnesses in trials have conflicting interests, they may subvert the judicial system. If such experts are called in by the state bureaucracy, they may subvert democratic decisions, and if they get media coverage they may subvert public opinion.

Falsifying data is a cardinal sin, but one particularly nasty kind is ‘doubt-mongering’, i.e. deliberately planting disinformation with the aim to subvert (Oreskes and Conway 2010; Oreskes 2015). It is now clear that the tobacco industry for over fifty years deliberately created an impression of a scientific debate about the harm of tobacco by setting up “alternative journals and encouraging or paying scientists to publish in them” (Oreskes 2015). This disinformation, coupled with persuasive pressure on journalists and politicians, enabled the industry to delay efforts to control tobacco use and protect public health (Oreskes 2015). Evidence of doubt-mongering has been demonstrated in several instances, both before and after tobacco, and it seems to be a common and effective strategy when stakeholders aim to challenge scientific research (Oreskes and Conway 2010; Oreskes 2015). At present, environmental issues that increasingly affect political and administrative decisions are targets of doubt-mongering. Scientists are learning more from contemporary research in the history of science, but it is difficult to avoid these ‘Trojan horses’. At the moment, the defense strategy appears to be more research. Eventually, a few perpetrators may face criminal charges and be convicted, like those in the tobacco industry (Oreskes 2015).

Science for the government and the state bureaucracy

In a democratic society, the role of the government and state bureaucracy is to serve and act in the best interest of the public (OECD 2003). In a similar fashion, science supposedly caters to the same interests. The resources for science have always been dependent on allocations from external actors, e.g. students, altruistic benefactors, government, or industry (Blomquist 1992).

Science grew out of a scholastic aim to produce skilled and educated citizens and was primarily financed by student fees. It was natural that resources increasingly should come from public funding to be available for all gifted students. While public spending for the scholastic mission of science can be justified, scientific research and the search for new knowledge have held an ambivalent position in civil society (Blomquist 1992; Ziman 2007). On one hand, allocating resources to research may be justified if it is directed to solve practical problems perceived by the state or the market. On the other hand, it may be justified if the outcome is new, reliable, well-tested knowledge that may be useful now or in the future. The balance between these two justifications of scientific research has oscillated over time (Blomquist 1992; Ziman 2007). In Sweden, the former was in vogue in the 18th and 19th centuries and is increasingly so at present (Blomquist 1992). In the 20th century, especially after the Second World War, the latter was preferred (Ziman 2007). The choice of justification obviously affects government science policy and it is important to understand that the two justifications push the development of science in different directions.

The latter justification is more true to the scientific norm of autonomy developed for efficient production of new and reliable knowledge over the long term as sketched above. Letting external stakeholders frame scientific issues produces conflicts of interests and is known to produce dead ends of no lasting value. One example is the ideologically framed genetic research performed by Lysenko (Ziman 2007). Another is the glorious historical past of Sweden commissioned in the formation of the Swedish nationalistic project (Jarrick 2013).

Interestingly, a similar need for glorious historical pasts now appears to arise in the newly reinstated nations of eastern Europe, e.g. Poland (The Economist 2016) and Lithuania (Ruin 2016). Nevertheless, there are also ‘success’ stories where concerted efforts achieved the desired political goal, e.g. The Manhattan Project and others (Mauser et al. 2013). Another type of stakeholder influence, lobbying, was raised by Watts and Zimmerman in their “Market for Excuses” paper in which they problematise the tight relationship between legislation and accounting theory, although they later found their approach “less productive” (Watts and Zimmerman 1978 and 1990).

Public resources are limited, and fiscal policies are important tools for allocating funds. The economic value of assets and investments becomes a central issue. Non-economic values are usually left out of the calculations because the uncertainties of any assigned economic values are large. In the fiscal calculus, investments in the ‘science market’ do not ‘add up’ unless economic returns are expected somewhere in society (Budtz Pedersen and Hendricks 2014). If new scientific knowledge leads to innovations that can be bought and sold, the palatable answer for politicians seems to be that the market itself is the likely candidate to yield the required revenue. In this story, funding of scientific research becomes a joint venture between state and market, and responsibility transgresses institutional boundaries. Making science more useful and developing innovations in cooperation with industry is viewed as “increas[ing] competitiveness and creat[ing] more jobs in a global knowledge economy”, as stated in the present innovation policy in Sweden (Regeringskansliet 2016). However, as Waluszewski (2013) points out, this story conflicts with two important norms in science: the free availability of scientific results and the absence of conflicting interests. National funding of science with an aim to develop nationally commercialised innovations implies some secrecy of scientific results, and scientists who cooperate with industry are not free of conflicting interests.

The fiscal calculus also misses many valued non-economic outcomes of science (Ziman 2007). In the above Swedish innovation strategy, science is considered useful to “meet global societal challenges” and “deliver public services with increased quality and efficiency” (Regeringskansliet 2016). Decisions in public service are expected to be informed and without conflicting interests, so science can play the role of a provider of disinterested and well-tested knowledge or expert witnesses (Ziman 2007). However, the pluralistic nature of science is not conducive to enforced decisions. Assessments made by scientists are seldom unanimous. Weighing all the evidence together and reaching a decision remains a task for public servants who are skilled in these matters. Nevertheless, a bit of scientific schooling and source criticism is helpful in the work of law officials, politicians, and other public servants.

Global challenges in the above innovation strategy allude to the environmental and demographic global risks that societies face. Risk handling has become one of the governmental duties in post-modern society (Ravetz 1977; Ziman 2007). The role of science is once again to provide well-tested knowledge of risks that is free from conflicting interests. The role of government is to weigh different aspects of risk, e.g. the consequences of a risk and the societal costs of diminishing it, and to decide on adequate regulations to reduce risk to acceptable levels (Ravetz 1977). Many global risks are complex issues that transgress the boundaries of the scientific disciplines. Cross-disciplinary studies are not new to science, but the scale is new. Science must find the means to cope with the challenges of large-scale cross-disciplinary studies. Note that the Intergovernmental Panel on Climate Change (IPCC) is an example of a task that transgresses institutional boundaries (IPCC 2016). Its task is the assessment of scientific results, like that of a state bureaucracy, and it seeks immediate consensus in its reports, as in the political arena.

The increasing importance of fiscal policies in the last decades has paved the way for New Public Management (NPM) in state bureaucracies. NPM was a reaction to the perceived inefficiency of state bureaucracy and proposed to introduce elements of governance from the market to improve efficiency and produce results. NPM reversed the doctrines of democratic accountability, i.e. separating the public sector from the private sector and upholding an elaborate set of procedural norms to prevent misuse of public resources, and replaced them with accountability for results and a ‘corporatisation’ of state bureaucracy (Hood 1995). Reversing procedural norms of democratic societies, some of which still remain in national laws, is not entirely straightforward and raises a number of moral issues concerning science (Rider et al. 2013), especially because NPM also influences the power structures of universities and affects research from within them (Rider and Jörnesten 2007; Rider et al. 2013). NPM is in fact not well adapted to governing the process of science, because fiscal bureaucracies favour routine, not risk. NPM stresses deliverables, not the development of new knowledge (Ziman 2007).

Lately, the theoretical basis for NPM has also been challenged from within fiscal and economic science. The central concept of rational, self-interested agents, as well as efficiency of contractual agreements, has been questioned (e.g. Kahneman 2012; Almqvist and Wällstedt 2013). NPM has also been accused of creating anxious organizations (Almqvist, Catasús and Wällstedt 2013).

Science for the market

With the emergence of NPM creating a transgression between state and market institutions, as well as complex global challenges requiring cross-disciplinary studies transgressing traditional scientific disciplines, an evolution of science into a new mode of knowledge production, ‘Mode 2’ science, as opposed to traditional ‘Mode 1’ science, was predicted (Gibbons et al. 1994; Gibbons 2000; Nowotny, Scott and Gibbons 2003). These researchers contended that ‘Mode 2’ was characterised by context-sensitive science, much like applied research performed in industry, responding to pressing global challenges identified by the whole society and thus legitimising the allocation of resources to scientific research. The traditional organization of scientific institutions into disciplines is deemed inadequate, and more loosely held project-based structures are proposed, in which cross-disciplinary teams from all institutions in society are selected for cooperation on a specific task. After a task is accomplished, teams reorganize into new teams for other specific tasks. These ideas sparked a lively debate concerning the benefits and drawbacks of ‘Mode 2’ knowledge production (e.g. Ziman 1996; Ziman 2003, 2007; Rider and Jörnesten 2007; Rider et al. 2013). One particular drawback of the ‘Mode 2’ accommodation of science to the market is a potential loss of trust in the knowledge produced (Ziman 1996). When scientists come to depend on projects for their livelihood, they probably fail to be unbiased. “Those who pay the pipers call the tunes” (Ziman 2007 p. 333). A subsequent deprofessionalisation of scientific research and an erosion of important scientific norms is predicted (Hasselberg 2007; Rider and Jörnesten 2007; Teelken 2012). Nevertheless, elements of ‘Mode 2’ research are currently proposed as necessary developments towards more ‘effective’ sustainability science. Most researchers focus on the challenges of trans-institutional cooperation in the production of ‘relevant’ knowledge (Brandt et al. 2013; Mauser et al. 2013; Harris and Lyon 2013). Others focus on the lack of transformative power leading to social changes in the direction of sustainability (e.g. Popa, Guillermin and Dedeuwaerdere 2015). While the latter is a serious political problem, it is highly questionable whether it should be the mission of sustainability science itself.

Addressing contemporary global challenges leads to business opportunities and provides incentives for market involvement in applied research funding. Public, state and market interests exert pressures on scientists to produce knowledge that is ‘relevant’ for dealing with the global challenges. Sharing challenges, as in ‘Mode 2’, is not enough for a successful joint venture. It has been shown that “trust is vital when crossing professional cultural boundaries as people are opening themselves to vulnerability and risk”. Important factors in building trust in these types of collaborations are “information on others, prior experience of working together, norms of cooperation, and sanctions exerted on those who might transgress norms of behavior” (Harris and Lyon 2013 p. 109). For cooperation within ‘Mode 1’ science alone, all these components are mostly in place. For successful trans-institutional cooperation, mutually accepted norms need to be developed.


Another lesson to be learnt from the perspective of the ‘knowledge economy’ is the risk of creating ‘science bubbles’. Bubbles not only form in the financial market, “they also inflate in other sectors where large-scale investments are taking place, including research funding.” (Budtz Pedersen and Hendricks 2014 p. 504). Knowledge in vogue in the scientific field and the trust in that knowledge are the assets in this analogy. The role of speculators is played both by researchers and by research policy-makers.

With the increasing use of monetary incentives and financial rewards (bonuses, points, rankings, and performance indicators), science may turn into an optimisation game driven by a credit-seeking motive rather than the truth-tracking mechanisms traditionally associated with scientific inquiry. Increasingly, phenomena like scientific hype, fashionable fields, biases against publishing negative results, and pressure to publish are recorded in the literature. … Assessing the toxic intellectual debt that builds up when too much liquidity is concentrated on too few assets is an important task if research funders want to avoid going short on overvalued research. (Budtz Pedersen and Hendricks 2014 p. 506).

When a ‘science bubble’ bursts, in analogy with a bubble in the financial sector, assets lose much of their value. This is certainly food for thought for contemporary science policy makers.

The present trend in Swedish innovation strategy is to concentrate larger parts of research funding on ‘world-class’ scientists or research institutions, i.e. to concentrate more liquidity on fewer assets (Regeringskansliet 2016; Widmalm 2013). The simple remedy to counteract ‘bubbles’ is the same as in the financial market: do not put all the eggs in the same basket!

Science for the public

Scientific knowledge enters the life-world of the public from the outside (Ziman 2007). It must be communicated or learned. More critically, to understand science one must use a rational mode of thinking, but it has often been demonstrated that human beings prefer intuitive rules to rational thinking (Kahneman 2012). Furthermore, human beings tend to be unwilling to change their world-view, even when it is challenged by new experiences (e.g. Vosniadou 2008; Björling 2012). These are barriers that must be overcome to empower the public with reliable scientific knowledge (Hagendijk 2004). People need to form their own opinions on issues that concern them, and this increasingly entails assessing scientific arguments. Paradoxically, scientific schooling seems to increase both healthy skepticism of and confidence in scientific results and arguments (Vetenskap och Allmänhet 2016).

It is in the public interest that scientific knowledge be trustworthy and openly accessible.

However, the power of the people is one step behind that of established institutions from a resource perspective. As described above, scientific knowledge is expressed as general principles that aid the formation of hypotheses on specific issues, but in the end every hypothesis needs to be scientifically tested to become a valid scientific argument in debate (Ziman 2007). This need is obviously a disadvantage when the public or non-governmental organizations (NGOs) want to challenge the state or the market on issues for which scientific arguments are deemed important.

The resource gap could be decreased either by NGO funding of scientific research or by changing the scientific agenda of the state, and perhaps also of the market, through public influence (Hagendijk 2004). In both cases, it is crucial that the level of public confidence in the knowledge produced is upheld.

In order to ‘comply’ with EU regulations for Particulate Matter (PM) levels, the governing body of the city of Florence temporarily suspended all measurements of PM2.5 (PM with size < 2.5 µm) levels in the city, with the exception of two green parks (Tallacchini 2016). Public confidence in officially reported PM2.5 levels declined, and this sparked an NGO-funded citizen science project to independently measure PM2.5 levels in the city (PM2.5 Firenze 2016; Tallacchini 2016).


Empowerment of people through citizen science is intimately linked to their trust in democratic institutions. They must believe that the effort is worthwhile and that their results will matter; otherwise they may turn to non-democratic means to change society, in which case scientific arguments become irrelevant. The risk of violent public discontent increases with decreasing trust. There are some warning signs: concentration of capital in a few individuals; increasing evidence of misbehaving public officials; increased risk of ‘science bubbles’; and decreasing trust in the market (e.g. Piketty 2014; ICIJ 2016; Budtz Pedersen and Hendricks 2014; Trope and Ressler 2016). Public trust in democratic societies and in science falls dramatically if people feel that they have been ‘short-changed’ (Ziman 2007). While sustainability is theoretically possible in totalitarian states, their present track records point in the opposite direction.

Media reporting

Science is not consensus on Yes or No. It is neither, and sometimes both. Pluralistic and uncertain scientific knowledge is not well suited to the contemporary adversarial style of news media (Ziman 2007). With a few notable exceptions, neither are scientists. On the other hand, news media play an important role in introducing scientific topics into the daily agenda of the public. Media specifically about science also play an important role, but cater to an already interested audience.

Scientists often lament the factual inaccuracy of science reporting in the news and fear that it may cause a drop in public trust in science, but there is no dramatic drop, just a marginal decrease in public confidence in science (Vetenskap och Allmänhet 2016). Perhaps the damage is already done. It could be that news media errors in science reporting contribute more to confusion and distrust among people who are less trained in science.

There are two common types of errors in news media: reporting a correlation between variables as a causal effect, and oversimplification of complex issues. A lot of good science, especially in exceedingly complex fields, is done seeking correlations. Scientists are usually well aware that these correlations may be the result of a multitude of causes; in particular, there may be an unrevealed common cause behind the observed variation of the variables. Correlations represent a weaker form of scientific knowledge than scientifically established causal relationships, but this distinction is often lost in news media reporting. The first scientific reports to be picked up by the media are usually smaller studies whose results may be challenged by larger and more carefully performed studies. The typical reporting sequence goes as follows: scientists find that eating X is good for your health; eating X does not have any effect on your health; eating X is bad for some people’s health, but may be good for others because … . Obviously, this type of media reporting does not foster trust in scientific results, especially not if readers lack the scientific knowledge to see the pattern. While some simplifications are necessary in science communication, oversimplification leads to loss of essence and coherence.

The changing media landscape, with its ever-increasing information flow, probably leads to more media errors in science. In the economically strained media market at present, scrutiny of scientific news before publication appears to have become too costly. It is easier for the public and others to obtain information, but more important to assess its quality. Perhaps a market will form for trusted sources of information, in which openly accessible scientific knowledge could be a haven for the knowledgeable public. On the other hand, easy access to information makes it simpler for stakeholders to use disinformation and doubt-mongering to influence public debate (Fetzer 2004; Oreskes and Conway 2010; Oreskes 2015). Anderson (2008) predicts that the information deluge will pave the way for ‘Big Data’, but differing data quality will be difficult to handle (Boyd and Crawford 2012). Even though there is progress in this field, it is still reminiscent of the old dictum, ‘garbage in, garbage out’.

Science communication is another trans-institutional field in which objectives diverge: the state wants to inform citizens and to motivate state decisions; the media want news that engages the public; the market wants to keep business secrets and promote products; scientists want publications with new scientific knowledge. These conflicts of interest may lead to problems that compromise the generation of new and reliable knowledge. In sustainability science, for example, there is a growing concern about the ‘messages’ that scientists are conveying (Somerville and Hassol 2011). This concern is clearly an important issue when scientists act as advisors for state decisions. Because decisions, as well as public participation, involve both facts and values, Dietz (2013) suggests that science communication should competently address both in order to inform decisions. Others propose that scientists employ strategies to invoke trust in public debate (Goodwin and Dahlstrom 2013). The IPCC should “use all best endeavours to reach consensus” (IPCC 2016). These ideas are not problematic in relation to the objectives of the state, but they conflict with several scientific norms. They are certainly problematic if reports of scientific results, i.e. within the scientific community, are influenced by them. There are in fact some signs that this may be the case (Bray and von Storch 2014). Some researchers also suggest that climate change predictions are “erring on the side of least drama” (Brysse et al. 2013).

Science for the future

Humanity must adapt to, or mitigate, future threats to the sustainability of human life. Paleontological records show that the ability to adapt to new living conditions is very important for the survival of a species. On the other hand, human beings understand the concept of the future and should be able to respond to alternative predictions of future scenarios. One important evolutionary advantage of human beings develops at around the age of five years: the ability to wait for a future and greater reward (Gopnik and Seiver 2009). Animals and younger children cannot resist the quick reward of eating a cookie now, even if they are promised more cookies if they wait a while. Long-term thinking is our single most important ability in the quest to attain sustainability.

While fiscal policies are important for sustainability in the short term, they appear myopic in the long-term perspective of sustainability. They certainly hinder the use of this human evolutionary advantage. In a sustainable science policy, corresponding fiscal policies are tempered by a long-term perspective and by the non-economic values of science.

Long-term thinking implies making rational decisions based on forecasts of the future. The main problem is that forecasting is difficult because it must be based on what we know today. Accepting that the validity of a forecast rests on its underlying model facilitates a critical appraisal of future predictions: ‘good’ models come close to the actual outcome, whereas ‘bad’ models fail. With increasing complexity of models, e.g. in the number of parameters, the uncertainty of forecasts increases (Ormerod 2005). In complex systems, the actual future is seldom exactly what a forecast predicted. Nevertheless, some systems have redeeming properties that make forecasts fairly good (Transtrum et al. 2015). Long-term thinking makes use of the best forecast models, built on reliable knowledge, while keeping the uncertainty in mind.

Forecasts are particularly important in facing global challenges to sustainable living environments for human life. The earth is now perceived as rather small, and it is becoming increasingly obvious that human life transforms global living environments. This is not a new phenomenon in the history of the earth. The emergence of plant life, and later animal life, transformed the earth’s atmosphere, consuming CO2 and supplying O2 and vice versa, eventually leading to its present composition. Clearly, the earth is a complex system that continues to evolve in response to the life on it. The best bet is to ensure that changes in global living environments are as slow as our ability to adapt to them. Rapid changes, ‘tipping points’, are great threats to human life (Folke et al. 2011). The human challenge is to respond to forecasts of future living environments in an adequate and rational manner.

Human beings, however, do not act rationally at all times; they consciously have to force themselves to make rational decisions (Kahneman 2012). This ability can be trained, and learning science is good practice for rational thought. In functioning democracies, the public understands the system and acts rationally within it. Public understanding of science may likewise empower people to assess issues where scientific knowledge is relevant.

Applied science has proved to be of practical use in solving contemporary practical problems, but in most cases it uses already existing knowledge to address the issues. If new interesting phenomena arise in applied research, they are usually not pursued because they fall outside the project goals. The process is biased towards the application of science that is already well known. Because it cannot be known what problems will arise in the future, applied science cannot be relied upon to generate the new knowledge required. A sustainable science policy therefore encourages curiosity-driven, basic scientific research, which is more effective in finding new knowledge. Furthermore, sustainable science policies urge that applied science be performed in an environment with norms that promote reliable knowledge.

Conclusion

It may be embarrassing to hear scientists in naïve self-confidence expressing ideas on matters outside their field of competence, and science policy is way out on a limb for many (Ziman 2007). However, in the life-long process of trying to understand science itself and its role in society, it is natural to consult the available scientific literature. The thoughts presented here are not original; they are shared by a number of scholars whose works have been referenced to point the reader towards more thorough analyses of the issues.

Science, natural and human, plays a number of roles in society. Most importantly, it painstakingly and inevitably constructs reliable knowledge about every aspect of human life and its natural environment. Scientists may temporarily be ‘barking up the wrong tree’, but eventually turn to more productive hunting grounds. Science is pluralistic and does not provide a single answer. It is the process of time that turns the scattered grains of sand into shining pearls of reliable knowledge. Sometimes this process is annoyingly slow, but over time norms have developed that lead to a fairly efficient production of new, reliable knowledge. Science may be too slow to respond to rapid and complex processes, but rushing it compromises the reliability of knowledge generated.

Democracy is the best bet to achieve sustainability. In times of acute need, democratic society acts and bases decisions on present values and presently available, often incomplete, knowledge. Scientists, or bureaucrats with scientific schooling, are very useful in documenting and presenting the latter as a basis for decisions, but this activity should not be confused with doing science. Science and decision-making are different processes with different practices, practices that have been developed to achieve different purposes. Modes of cooperation across institutional boundaries that do not compromise the basic functions of the institutions themselves are needed.


Between science, on the one hand, and the spheres of politics, bureaucracy, and for that matter economy, on the other hand, there are such essential differences in roles, norms, and obligations that an institutional, intellectual, and normative division appears to be indispensably necessary. (Nybom, 2013, p. 35)

There are risks of ‘throwing out the baby with the bathwater’, i.e. risks of producing less reliable knowledge when scientific norms are modified. On the other hand, science needs to prove its worth to democratic institutions to justify public funding of research. A sustainable science policy caters both to applied research and to the production of new trustworthy knowledge in basic research. To address global challenges transcending institutional boundaries, mutually accepted norms of cooperation between science and other democratic institutions are needed.

The importance of trust in cooperation among the social institutions of democratic societies, as well as with the public, has been underlined. Maintaining high public trust in science is essential in a functioning democracy. This trust may decrease if science is blended into other institutions and subjected to their norms. Trust in science and in democratic institutions is also important for the empowerment of the public, in combination with scientific training and open access to scientific results.

Sustainability science, in particular, is under siege due to many of the societal pressures discussed above. Its increasing political importance intensifies societal pressures from stakeholder interests, including sustainability scientists themselves. Scientific core values, e.g. pluralism, the slow process of organized skepticism, and the ever-present uncertainty, are seen as ‘problems’ to be mitigated. These mitigation efforts, in turn, lead to worrying hints of heteronomy within sustainability science. Governments walk a thin line between efforts to provide ample resources for the generation of reliable knowledge on the pressing global challenges to sustainability and the danger of creating ‘sustainability science bubbles’. Sustainable science policies contain strategies to thwart these threats to trustworthy knowledge.

Discussion questions

1. Discuss the different situations in which science can play a role in societies. What roles has science played historically, and what roles does it play today? What do you think the roles of science should be in the future in a sustainable society?

2. What is your opinion on the advantages and disadvantages of autonomy and heteronomy in a science policy perspective?

3. In what circumstances do you think “citizen science” can be a social driving force?

4. Science communication is vulnerable to “doubt mongering” by the planting of falsified facts in the information flow. What countermeasures do you propose?

References

Almqvist, R. and Wällstedt, N. (2013). Managing public sector organizations: Strategic choices within changing paradigms, in L. Strannegård and A. Styhre (eds.), Management: an advanced introduction. Lund, Sweden: Studentlitteratur.

Almqvist, R., Catasús, B. and Wällstedt, N. (2013, Nov. 27). New Public Management har skapat rädda organisationer. Dagens Nyheter. http://www.dn.se/debatt/new-public-management-har-skapat-radda-organisationer/.

Anderson, C. (2008). The End of Theory: The Data Deluge Makes the Scientific Method Obsolete. Wired Magazine 16(07).


Anderson, M. S. et al. (2010) Extending the Mertonian Norms: Scientists’ Subscription to Norms of Research. The Journal of Higher Education 81(3) 366-393.

Andersson, U. (2014). Fusk och förtroende: Om mediers forskningsrapportering och förtroendet för forskning, SOM-rapport 2014 27. Gothenburg, Sweden: SOM-institutet.

Berger, P. L. and Luckmann, T. (1967). The social construction of reality: A treatise in the sociology of knowledge. Harmondsworth, England, UK: Penguin.

Bergström, A. and Oscarsson, H. (2015) Vetenskapen i Samhället – resultat från SOM- undersökningen 2014, VA-rapport 2015(2). Stockholm, Sweden: Vetenskap och Allmänhet.

Björling, M. (2012). Att ändra sin förståelse – exemplet faser och fasövergångar. In G. Fransson and H. Hammarström (eds.) Mötet mellan vetenskap och lärande 13 högskolepedagogiska utmaningar. Gävle, Sweden: Gävle University Press.

Blomquist, G. (1992) Elfenbenstorn eller statsskepp? Stat, universitet och akademisk frihet i vardag och vision från Agardh till Schück. Doctoral thesis, Lund University.

Boyd, D. and Crawford, K. (2012) Critical Questions for Big Data. Information, Communication and Society 15(5) 662-679. DOI: 10.1080/1369118X.2012.678878.

Brandt, P. et al. (2013). A review of transdisciplinary research in sustainability science. Ecological Economics 92 1–15. http://dx.doi.org/10.1016/j.ecolecon.2013.04.008.

Bray, D. and von Storch, H. (2014). The Normative Orientations of Climate Scientists. Science and Engineering Ethics. DOI 10.1007/s11948-014-9605-1. https://www.researchgate.net/publication/268039102.

Brysse, K. et al. (2013). Climate change prediction: Erring on the side of least drama? Global Environmental Change, 23, 327–337. http://dx.doi.org/10.1016/j.gloenvcha.2012.10.008.

Budtz Pedersen, D. and Hendricks, V. F. (2014). Science Bubbles. Philosophy and Technology 27 503–518. DOI 10.1007/s13347-013-0142-7

Chanley, V. A., Rudolph, T. J. and Rahn, W. M. (2000). The Origins and Consequences of Public Trust in Government: A Time Series Analysis. Public Opinion Quarterly 64 239- 256.

Dalton, R. (2005). Obesity expert owns up to million-dollar crime. Nature 434 424. DOI:10.1038/434424a.

Dietz, T. (2013). Bringing values and deliberation to science communication. Proceedings of the National Academy of Sciences 110(3), 14081-14087. DOI 10.1073/pnas.1212740110.

Fetzer, J. H. (2004). Disinformation: The Use of False Information. Minds and Machines 14 231–240.

Fitzgerald, J. and Wolak, J. (2016). The roots of trust in local government in western Europe. International Political Science Review 37(1) 130-146.

Folke, C., et al. (2011). Reconnecting to the Biosphere. Ambio 40, 719–738. DOI 10.1007/s13280-011-0184-y.

Gauchat, G. (2011). The cultural authority of science: Public trust and acceptance of organized science. Public Understanding of Science 20(6) 751–770.

Gibbons, M. et al. (1994) The New Production of Knowledge: The Dynamics of Science and Research in Contemporary Societies. London, UK: Sage.

Gibbons, M. (2000). Mode 2 society and the emergence of context-sensitive science. Science and Public Policy 27(3) 159-163.

Goodwin, J. and Dahlstrom, M. F. (2013). Communication strategies for earning trust in climate change debates. Wiley Interdisciplinary Reviews: Climate Change. DOI: 10.1002/wcc.262.


Gopnik, A. and Seiver, E. (2009) Reading Minds – How Infants Come to Understand Others. Zero to Three, 30(2) 28-32.

Greenberg, D. S. (2008). Science for sale: The perils, rewards, and delusions of campus capitalism. Chicago, IL, USA: The University of Chicago Press.

Hagendijk, R. P. (2004) The Public Understanding of Science and Public Participation in Regulated Worlds. Minerva, 42 41-59.

Harris, F. and Lyon, F. (2013). Transdisciplinary environmental research: Building trust across professional cultures. Environmental Science and Policy, 31 109-119. http://dx.doi.org/10.1016/j.envsci.2013.02.006.

Hasselberg, Y. (2007). Ytlandet. In S. Rider. and A. Jörnesten (eds.) Reclaim the Science! Om vetenskapens avakademisering. Stockholm, Sweden: Gidlunds Förlag.

Herron, J. D. (2008). Advice to my intellectual grandchildren. Journal of Chemical Education 85(1) 24-32.

Hood, C. (1995) The “New Public Management” in the 1980s: Variations on a Theme. Accounting, Organizations and Society, 20(2/3) 93-109.

Interlandi, J. (2006, Oct. 22). An Unwelcome Discovery. New York Times. http://www.nytimes.com/2006/10/22/magazine/22sciencefraud.html

ICIJ, The International Consortium of Investigative Journalists (2016). The Panama Papers. https://panamapapers.icij.org/

IPCC, Intergovernmental Panel on Climate Change (2016). Principles Governing IPCC Work. https://www.ipcc.ch/pdf/ipcc-principles/ipcc-principles.pdf.

Jarrick, A. (2013). The Scientific Mission and the Freedom of Research. In S. Rider et al. (eds.) Transformations in Research, Higher Education and the Academic Market, Higher Education Dynamics 39. Dordrecht, Netherlands: Springer Science+Business Media. DOI 10.1007/978-94-007-5249-8_1.

Kahneman, D. (2012). Thinking, Fast and Slow. London, UK: Penguin.

Kramer R. M. (1999). Trust and Distrust in Organisations: Emerging Perspectives, Enduring Questions. Annual Review of Psychology 50 569-98.

Macfarlane, B. and Cheng, M. (2008). Communism, Universalism and Disinterestedness: Re-examining Contemporary Support among Academics for Merton’s Scientific Norms. Journal of Academic Ethics 6 67–78. DOI 10.1007/s10805-008-9055-y.

Mauser, W. et al. (2013) Transdisciplinary global change research: the co-creation of knowledge for sustainability. Current Opinion in Environmental Sustainability 5 420–431. http://dx.doi.org/10.1016/j.cosust.2013.07.001.

Merton, R. K. (1942). The Normative Structure of Science. In N. Storer (ed.) The sociology of science: Theoretical and empirical investigations (267-278). Chicago, IL, USA: The University of Chicago Press.

Nahapiet, J. and Ghoshal, S. (1998). Social Capital, Intellectual Capital, and the Organizational Advantage. The Academy of Management Review 23(2), 242-266.

National Science Board (2016). Science and Engineering Indicators 2016. Arlington, VA, USA: National Science Foundation (NSB-2016-1).

Nowotny, H., Scott, P. and Gibbons, M. (2003). ‘Mode 2’ Revisited: The New Production of Knowledge. Minerva, 41, 179–194.

Nybom, T. (2013) Power, Knowledge, Morals: Society in the Age of Hybrid Research. In S. Rider et al. (eds.) Transformations in Research, Higher Education and the Academic Market, Higher Education Dynamics 39. Dordrecht, Netherlands: Springer Science+Business Media. DOI 10.1007/978-94-007-5249-8_1.

OECD (2003) OECD Guidelines for Managing Conflict of Interest in the Public Service. Paris, France: OECD Publishing. DOI:10.1787/9789264104938-2-en.


Oreskes, N. and Conway, E. M. (2010). Merchants of doubt: how a handful of scientists obscured the truth on issues from tobacco smoke to global warming. New York, NY, USA: Bloomsbury Press.

Oreskes, N. (2015). The fact of uncertainty, the uncertainty of facts and the cultural resonance of doubt. Philosophical Transactions of the Royal Society A, 373 20140455.

Ormerod, P. (2005) Complexity and the limits to knowledge. Futures 37 721–728.

Piketty, T. (2014). Capital in the Twenty-First Century. Cambridge, MA, USA: Harvard University Press.

Planck, M. (1936). Address on the 25th anniversary of the Kaiser-Wilhelm Gesellschaft (Jan 1936). Quoted in Makrakis, K. (1993) Surviving the Swastika: Scientific Research in Nazi Germany. Oxford, UK: Oxford University Press, 97.

Planck, M. (1950). Scientific Autobiography and Other Papers, New York, NY, USA: Open Road Integrated Media 33.

PM2.5 Firenze (2016). http://www.pm2.5firenze.it.

Popa, F., Guillermin, M. and Dedeurwaerdere, T. (2015). A pragmatist approach to transdisciplinarity in sustainability research: From complex systems theory to reflexive science. Futures, 65, 45–56. http://dx.doi.org/10.1016/j.futures.2014.02.002.

Ravetz, J. R. (1977). The Acceptability of Risks. London, UK: Council for Science and Society.

Regeringskansliet (2016). The Swedish Innovation Strategy. http://www.government.se/contentassets/cbc9485d5a344672963225858118273b/the-swedish-innovation-strategy

Retraction Watch (2016). Court denies appeal of HIV fraudster’s 57-month prison sentence. http://retractionwatch.com/2016/01/13/court-denies-appeal-of-hiv-fraudsters-57-month-prison-sentence/

Rider, S. and Jörnesten, A. (eds.) (2007). Reclaim the Science! Om vetenskapens avakademisering. Stockholm, Sweden: Gidlunds Förlag.

Rider, S., Hasselberg, Y. and Waluszewski, A. (eds.) (2013). Transformations in Research, Higher Education and the Academic Market, Higher Education Dynamics 39. Dordrecht, Netherlands: Springer Science+Business Media. DOI 10.1007/978-94-007-5249-8_1.

Rousseau, D. M. et al. (1998). Not so Different after All: A Cross-Discipline View of Trust. The Academy of Management Review, 23(3) 393-404.

Ruin, P. (2016 June 7). Litauisk historiker kritiserad av staten för publicering, Universitetsläraren, SULF, Stockholm, Sweden.

Scully, J. R. (2016). Open Access to Key Papers Related to the Water Crisis in Flint, Michigan. Corrosion, 72(4), 451-453.

Searle, J. (1995). The construction of social reality. London, UK: Penguin.

Somerville, R. C. J. and Hassol, S. J. (2011). Communicating the science of climate change. Physics Today 64(10) 48-53. DOI: 10.1063/PT.3.1296.

Tallacchini, M. (2016). Old and new currents in research ethics and epistemology: the case for IAQ. Indoor Air 2016, July 3 to 8, Ghent, Belgium.

Teelken, C. (2012) Compliance or pragmatism: how do academics deal with managerialism in higher education? A comparative study in three countries. Studies in Higher Education, 37(3) 271-290. DOI: 10.1080/03075079.2010.511171.

The Economist (2016, April 9). The politics of memory.

Transtrum, M. K. et al. (2015). The Journal of Chemical Physics 143 010901. DOI: 10.1063/1.4923066.

Trope, R. L. and Ressler, E. K. (2016). Mettle Fatigue: VW's Single-Point-of-Failure Ethics. IEEE Security and Privacy 14(1) 12-30. DOI: 10.1109/MSP.2016.6.


University of Gävle (2016). Mission statement and vision. http://www.hig.se/Ext/En/University-of-Gavle/About-the-University/Mission-statement-and-vision.html

Vetenskap och Allmänhet (2016). VA-barometern 2015/16 – VA-rapport 2015:6. Stockholm, Sweden: Vetenskap och Allmänhet.

Vosniadou, S. (ed.). (2008). International handbook of research on conceptual change. New York, NY, USA: Routledge.

Waluszewski, A. (2013) Contemporary Research and Innovation Policy: A Double Disservice. In S. Rider et al. (eds.) Transformations in Research, Higher Education and the Academic Market, Higher Education Dynamics 39. Dordrecht, Netherlands: Springer Science+Business Media. DOI 10.1007/978-94-007-5249-8_1.

Watts, R. L. and Zimmerman, J. L. (1978). The Demand for and Supply of Accounting Theories: The Market for Excuses, The Accounting Review, 54(2) 273-305.

Watts, R. L. and Zimmerman, J. L. (1990). Positive Accounting Theory: A Ten Year Perspective, The Accounting Review 65(1) 131-156.

Widmalm, S. (2013) Innovation and Control: Performative Research Policy in Sweden. In S. Rider et al. (eds.) Transformations in Research, Higher Education and the Academic Market, Higher Education Dynamics 39. Dordrecht, Netherlands: Springer Science+Business Media. DOI 10.1007/978-94-007-5249-8_1.

Ziman, J. (1996). Is science losing its objectivity? Nature 382 751-754.

Ziman, J. (2003). In whom can we trust? European Review 11 67-76. DOI: 10.1017/S1062798703000085.

Ziman, J. (2007). Science in Civil Society. Exeter, UK: Imprint Academic.

Biographical sketch

Mikael Björling graduated from Duke University with a B.S. in chemistry and computer science in 1984. He earned an M.Sc.Eng. in chemistry from Lund Institute of Technology in 1987 and a Ph.D. in physical chemistry from Lund University in 1992. After a post-doctoral fellowship at the Royal Institute of Technology (KTH) and a period as a C.N.R.S. researcher at the University of Paris, he was appointed docent in 1999 following a research internship at the Royal Institute of Technology. He came to the University of Gävle in 2000. His present research interests are the indoor environment and pedagogical content knowledge in chemistry. E-mail: mbg@hig.se
