The Ecomodernists: A new way of thinking about climate change and human progress

A futuristic vision of an ecomodernist Sydney, Australia.

Nov. 1, 2018

Scientists expect this year to be the fourth-hottest on record globally, exceeded only by the three previous years. Since 2001, we have lived on a planet that has experienced seventeen of the eighteen hottest years ever observed.

The alarming temperature records set over the past two decades are consistent with a century-long pattern, rigorously confirmed by multiple lines of scientific evidence: the burning of fossil fuels has driven up concentrations of heat-trapping greenhouse gases in the atmosphere, which has already raised global temperatures by nearly 1 degree Celsius (C).

The impacts of destabilizing Earth’s climate system are being felt by people in every country of the world. This summer, record heat in Japan and elsewhere caused dozens of deaths. Firefighters in California struggled to control the largest forest fire on record, one of about twenty that ravaged the state. Forest fires also raged across Canada and even in the Arctic. In Europe, where fires led to deaths in Greece, record-setting heat also severely damaged crops and caused other freakish events. In some places, rivers ran so warm that nuclear reactors had to shut down because the water was too hot to cool them.

“This summer of fire and swelter looks a lot like the future that scientists have been warning about in the era of climate change,” wrote Somini Sengupta (2018) in a front-page story in The New York Times. “It’s revealing in real time how unprepared much of the world remains for life on a hotter planet.”

Teasing out the role played by human-caused climate change in extreme weather events, as distinct from natural fluctuations, has long been a scientific challenge. But in recent years, research in the area of “attribution science” has matured into an established field. To date, scientists have published more than 170 reports covering 190 extreme weather events around the world, according to an analysis by the journal Nature. About two-thirds of the events studied were determined to have been made more likely, or more severe, by human-driven climate change. Heat extremes accounted for 43 percent of these events, followed by droughts (18 percent) and extreme rain or flooding (17 percent) (Schiermeier 2018).

Acknowledging the threats posed by human-caused climate change, in 2015 almost all of the world’s countries pledged, as part of the United Nations climate treaty, to keep global temperature rise this century to less than 2 degrees C above pre-industrial levels and to strive for 1.5 degrees C. But to achieve this goal, greenhouse gas emissions would need to be cut by at least 70 percent by 2050 (Tollefson 2018).

The shift away from fossil fuels to low-carbon energy is moving at a snail’s pace compared to what is needed: in 2017, worldwide emissions rose by nearly 2 percent, the first increase in four years. In an August 2018 lead editorial, The Economist, a typically optimistic magazine, put the state of progress in the bluntest of terms, running the headline “The World Is Losing the War on Climate Change” (“The World” 2018).

In countries around the world, replacing fossil fuels will likely require the massive deployment of solar and wind power to be supplemented by thousands of advanced nuclear power plants; natural gas plants that capture and bury their emissions; and a vastly larger, more powerful, and more complicated energy transmission and storage system. These are just the challenges in decarbonizing the electricity sector. Equally daunting obstacles exist in the agriculture and transportation sectors (Temple 2018).

As countries struggle to limit their greenhouse gas emissions and decarbonize their economies, there has emerged a space in public life for new ways of thinking about climate change, energy, and politics. In books, essays, and research, a group of intellectuals and scholars calling themselves “ecomodernists” or “ecopragmatists” have put forward a set of ideas that break from conventional thinking, challenging longstanding paradigms about nature, technology, and progress (Fahy and Nisbet 2017; Nisbet 2014).

The Decarbonization Challenge

Most of today’s rise in greenhouse gas emissions is driven by energy-hungry Asian nations seeking to rapidly grow their economies and improve the standard of living for billions of people. Between 2006 and 2016, energy consumption in Asia rose by 40 percent. India, where emissions are growing fastest, remains highly dependent on coal, which produces three-quarters of its electricity. In 2017, the country’s use of the world’s most polluting fossil fuel grew by 5 percent (“The Year” 2018).

In Germany, even as the country has made unprecedented gains in the deployment of solar and wind power, emissions over the past two years have slightly increased. In 2011, Germany made the rash political decision to phase out its seventeen emissions-free nuclear power plants, which at the time accounted for 25 percent of the country’s electricity generation. As a result, Germany remains strongly dependent on some of the dirtiest coal power plants in the world for more than 40 percent of its electricity. Efforts to cut emissions have also faltered because of unexpected growth in the economy and lower oil prices, which encouraged greater use of home oil heating and car transportation (“Germany” 2017).

In the United States, the good news is that emissions have declined since their historic peak in 2007, though they still remain above 1990 levels, according to official government estimates. The decline has been driven primarily by the revolution in shale gas drilling or “fracking,” which lowered the cost of generating electricity from cleaner burning natural gas power plants, putting many dirtier and more expensive coal power plants out of business (Barboza and Lange 2018).

Questions remain, however, about how much methane leaks into the atmosphere from natural gas production and transport. A recent study estimated that the leakage rate was 60 percent greater than the U.S. government had previously estimated. Such a discrepancy matters for evaluating the benefits of natural gas: during the first two decades after its release, methane is more than eighty times more potent than carbon dioxide at warming the atmosphere (Guglielmi 2018).

A glut of cheap natural gas also threatens the country’s 100 emissions-free nuclear power plants, which generate 20 percent of U.S. electricity. Because the United States does not have a national carbon tax or fee, the climate change benefits of nuclear power plants are not factored into their operating costs. Since 2013, five nuclear plants have closed and six more are scheduled to shut down by 2025, even though these older plants could still operate for decades. In most states, solar and wind power will not be able to take up the slack in electricity generation. Instead, nuclear power will be replaced by dirtier natural gas (Plumer 2017).

A bright spot may be California, the fifth largest economy in the world. Even as the state’s population has surged and its economy has grown by 40 percent over the past two decades, the carbon intensity of California’s economy (the amount of carbon pollution per million dollars of economic output) has declined by 38 percent and is now below 1990 levels. In 2016, the most recent year for which data is available, carbon intensity declined 6 percent even as the economy grew by 3 percent (Barboza and Lange 2018).
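The decoupling arithmetic behind those figures can be sketched in a few lines of Python (a minimal illustration of the carbon-intensity definition; the function and variable names are mine, and the rates are the approximate 2016 figures reported above):

```python
# Carbon intensity: emissions per unit of economic output.
def carbon_intensity(emissions_tonnes, gdp_millions):
    """Tonnes of CO2-equivalent emitted per million dollars of GDP."""
    return emissions_tonnes / gdp_millions

# In 2016, California's economy grew about 3 percent while its carbon
# intensity fell about 6 percent. Those two rates together imply the
# change in total emissions:
gdp_growth = 0.03         # economy grew ~3%
intensity_change = -0.06  # carbon intensity fell ~6%

emissions_change = (1 + gdp_growth) * (1 + intensity_change) - 1
print(f"Implied change in total emissions: {emissions_change:+.1%}")
# prints: Implied change in total emissions: -3.2%
```

In other words, because carbon intensity fell faster than the economy grew, total emissions declined even as economic output expanded.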

The shift is driven by a major decline in emissions from the electricity sector. Statewide improvements in energy efficiency have decreased demand for electricity even as the economy and population have grown, while a sharp drop in the price of solar panels, combined with state renewable-energy mandates, has accelerated the transition from natural gas plants to clean energy sources. Rain after five years of drought also boosted electric generation from hydropower (Barboza and Lange 2018).

Many challenges remain for California. The scheduled shuttering of the state’s last remaining nuclear power plant may shift some electricity generation back to natural gas. Emissions from cars and trucks, already the biggest source of carbon pollution in the state, continue to increase. Until recently, low gasoline prices have not helped, nor have consumer preferences for bigger, less efficient cars and the relatively slow adoption of electric vehicles (Barboza and Lange 2018).

Continued success in California and the United States also hinges on U.S. federal policy. But since taking office, the Donald J. Trump administration has installed at major regulatory and scientific agencies fossil fuel industry lobbyists and conservative operatives who have spent their careers casting doubt on climate science and opposing any policies to cut emissions. According to one recent study, the fossil fuel industry and other major-emitting sectors enjoy a ten-to-one lobbying advantage over environmental groups and the clean energy sector (Brulle 2018). At such a disadvantage, even if Democrats were to win back control of the White House and Congress, any successful climate change–related legislation would need not only some Republican support but also the backing of major players from the fossil fuel industry.

But such concessions are likely to be opposed by many environmentalists, who have gained considerable sway within the Democratic Party. To win party primaries, Democrats running in districts and states where liberal voters dominate have pledged to promote a “100% renewables” platform that opposes all new fossil fuel infrastructure, seeks a ban on natural gas “fracking,” and demands the closure of nuclear power plants (Nisbet 2015).

The Rise of Ecomodernism

The roots of ecomodernism can be traced to a handful of influential books, articles, and policy papers first published a decade ago. In 2009’s Whole Earth Discipline, ecologist and futurist Stewart Brand laid out a range of innovation-driven strategies for achieving a sustainable society. His ideas were captured effectively by the subtitle: “Why Dense Cities, Nuclear Power, Transgenic Crops, Restored Wildlands, and Geoengineering Are Necessary.”

Brand correctly warned that the “soft energy path” technologies favored by environmentalists, such as solar and wind, were unlikely to overcome the problems of intermittency, storage capacity, and cost, or to scale in time to alter the dynamics of fossil fuel dependency worldwide. He and other ecomodernists have pointed to the demand for growth in Asia, Africa, and Eastern Europe and the sunk costs that these regions are putting into coal power and other fossil fuels.

During the 1960s and 1970s, as North American and European countries achieved economic security and prosperity, their citizens began to put pressure on their governments to accelerate efforts to reduce pollution, slow rates of deforestation, and limit land use, thereby conserving nature rather than destroying it. A similar pattern is occurring in China, which through state-managed economic growth has achieved a rising, affluent middle class.

But for growth to continue in China, and for India and other developing countries to also gain access to abundant forms of energy, transformative innovations in “hard energy” path options such as nuclear energy and carbon capture and storage are required, along with similar advances in high-tech solar, energy transmission, and energy storage technologies. These advances would be needed to not only meet the demand for growth in these regions but also limit emissions from the thousands of coal plants already in place and scheduled to be built around the world.

In 2009’s Why We Disagree about Climate Change, University of Cambridge geographer Mike Hulme argued that climate change had been misdiagnosed as a conventional environmental problem. Instead, it was what policy scholars refer to as a uniquely “super-wicked” problem: not something society was going to end or solve but, like poverty or war, something we were going to do better or worse at managing over time. As a super-wicked problem, argue other ecomodernists, climate change is so complex in scale and has so many different drivers that a single omnibus solution such as a national carbon tax or an international emissions agreement is unlikely to be either politically enduring or effective. Instead, policies would need to be implemented at the state, regional, and bilateral levels and through the private and nonprofit sectors (Prins and Rayner 2007; Verweij et al. 2006).

At the international level, examples include focusing more narrowly on reducing especially powerful but easier-to-tackle greenhouse gases, such as black carbon (or soot) from diesel cars and dirty stoves and methane from leaky gas pipes. At the national and state levels, examples of smaller-scale policies include government technology procurement programs; major investments in climate change resilience to protect cities, people, and industries; subsidies for renewables, nuclear energy, and carbon capture; and funding for clean energy research. As these smaller successes are achieved, argue ecomodernists, we not only gain more time to deal with the bigger policy challenges but also start to rebuild networks of trust and cooperation across lines of political difference while experimenting with new solutions and technologies (Nordhaus et al. 2011; Prins and Rayner 2007).

These ideas and others have been researched, expanded on, and promoted by the Breakthrough Institute, a left-of-center think tank founded by Ted Nordhaus and Michael Shellenberger. In 2015, the two brought together sixteen other similarly minded thinkers to author An Ecomodernist Manifesto. They argued that climate change and other environmental crises are not a reason to call into question the economic policies and technological advances that have enabled human society to flourish over the past century. Indeed, halting the many societal gains we have achieved through technological innovation, they argue, would rule out the best tools we have for combating climate change, protecting nature, and helping people. The urgent environmental problems we face are evidence in favor of more modernization, not less (Asafu-Adjaye et al. 2015).

Hope for a better future, they contend, starts with advanced technologies that intensify rather than weaken our mastery of nature. High-tech crops, advanced nuclear power, carbon capture and storage, aquaculture, desalination, and high-efficiency solar panels all have the potential to not only reduce human demands on the environment but also spark the economic growth needed to lift people out of extreme poverty. These advances will enable more people to live in bigger cities that are powered and fed more efficiently. People in cities also tend to have fewer children, slowing population growth. From this perspective, technological advances and urbanization will free up more space on the planet for nature, “decoupling” human development from fossil fuel and resource consumption.

To achieve this future, ecomodernists warn that we have put too much faith in carbon pricing, social-impact investing, venture capital, Silicon Valley, and other market-based “neoliberal” mechanisms to spur technological innovation and social change. We need to instead focus more intensively on understanding how technological advances happen and the role of government planning and spending—rather than the market—as the main driver of innovation and societal change. Once there are technologies available that make meaningful action on climate change and other problems cost less, ecomodernists predict, much of the political argument over scientific uncertainty will diminish. The challenge is not to make fossil fuels more expensive but to make their technological alternatives cheaper and more powerful.

Under these conditions, it will be easier to gain political cooperation from across the ideological spectrum and from developing countries. National leaders and their constituents are far more likely to spare nature because it is no longer needed to meet their economic goals than for any ideological or moral reason. Over the past year, ecomodernist ideas have received a boost from Harvard University cognitive scientist Steven Pinker (2018), who in his best-selling book Enlightenment Now devotes his chapter on the environment to advocating on behalf of the philosophy and the need for technologies such as nuclear energy.

Pinker is part of a parallel genre of “New Optimist” authors who have been inspired by the work of Hans Rosling and affiliated data scientists. In TED talks, a recent book, and vividly illustrated graphs available at the website Our World in Data, Rosling and colleagues have shown the many ways in which human societies are flourishing in the age of climate change, countering a powerful cultural narrative that the world for decades has been in a state of escalating crisis, decline, and suffering (Rosling et al. 2018).

Valuing Dissent

For ecomodernists, technological and political progress also require respectful engagement with a diversity of voices and ideas. “Too often discussions about the environment have been dominated by the extremes, and plagued by dogmatism, which in turn fuels intolerance,” they write in the Manifesto.

At their core, ecomodernists believe in applying the Enlightenment principles of skepticism and dissent, which are essential to wise and effective decisions, especially on wickedly complex problems such as climate change. Numerous social science studies demonstrate that when a consensus view is closely guarded and defended to the exclusion of dissenting voices, groupthink sets in, and individuals and groups tend to make poorer decisions and think less productively.

In contrast, exposure to dissent, even when such arguments may prove to be wrong, tends to broaden thinking, leading individuals to think in more open ways, in multiple directions, and in consideration of a greater diversity of options, recognizing flaws and weaknesses in positions. “Learning and good intentions won’t save us from biased thinking and poor judgments,” notes UC-Berkeley psychologist Charlan Nemeth in 2018’s In Defense of Trouble Makers. “A better route is to have our beliefs and ways of thinking directly challenged by someone who authentically believes differently than we” (Nemeth 2018, 191).

Acting on these principles, the Breakthrough Institute has invested in twice yearly “Dialogues” in San Francisco and Washington, D.C., creating the rare forum where progressives, liberals, conservatives, environmentalists, and industrialists come together to debate ideas and to connect over civil, cross-cutting conversations. To elaborate on these ideas, the Institute also publishes the Breakthrough Journal and produces the podcast series Breakthrough Dialogues.

On the road to managing the many threats we face from climate change, grassroots activism and political reforms that hold the fossil fuel industry accountable are important, as is the quest for a more advanced arsenal of technological options and a reconsideration of our economic goals. But so too is investment in our capacity to learn, discuss, question, and disagree in ways that constructively engage with uncomfortable ideas (Nisbet 2014).

Unfortunately, most academics and journalists avoid challenging the powerful forms of groupthink that have derailed our efforts to combat climate change. In this regard, attacks on those who question cherished assumptions have had a powerful chilling effect. We therefore depend on risk-taking intellectuals such as the ecomodernists to lead the way, identifying the flaws in conventional wisdom and offering alternative ways of thinking and talking about our shared future.


Nisbet, M.C. 2018. The ecomodernists: A new way of thinking about climate change and human progress. Skeptical Inquirer 42(6): 20–24.


Asafu-Adjaye, J., L. Blomqvist, S. Brand, et al. 2015. An Ecomodernist Manifesto. Oakland, CA: The Breakthrough Institute.

Barboza, T., and J.H. Lange. 2018. California hit its climate goal early—but its biggest source of pollution keeps rising. The Los Angeles Times (July 23).

Brulle, R.J. 2018. The climate lobby: A sectoral analysis of lobbying spending on climate change in the USA, 2000 to 2016. Climatic Change: 1–15.

Fahy, D., and M.C. Nisbet. 2017. The ecomodernist: Journalists who are reimagining a sustainable future. In P. Berglez, U. Olausson, and M. Ots (Eds), What Is Sustainable Journalism? London: Peter Lang.

Germany is missing its emissions targets. 2017. The Economist (November 9).

Guglielmi, G. 2018. Methane leaks from US gas fields dwarf government estimates. Nature 558: 496–497.

Hulme, M. 2009. Why We Disagree about Climate Change: Understanding Controversy, Inaction and Opportunity. Cambridge, UK: Cambridge University Press.

Nemeth, C. 2018. In Defense of Troublemakers: The Power of Dissent in Life and Business. New York: Basic Books.

Nisbet, M.C. 2014. Disruptive ideas: Public intellectuals and their arguments for action on climate change. Wiley Interdisciplinary Reviews: Climate Change 5(6): 809–823.

———. 2015. Environmental advocacy in the Obama years: Assessing new strategies for political change. In N. Vig and M. Kraft (Eds), Environmental Policy: New Directions for the Twenty-First Century, 9th Edition. Washington, D.C.: Congressional Quarterly Press, 58–78.

Nordhaus, T., and M. Shellenberger. 2007. Break Through: From the Death of Environmentalism to the Politics of Possibility. Boston, MA: Houghton Mifflin Harcourt.

———. 2013. How the left came to reject cheap energy for the poor. The Breakthrough (July 10).

Nordhaus, T., M. Shellenberger, R. Pielke, et al. 2011. Climate Pragmatism: Innovation, Resilience, and No Regrets. Oakland, CA: The Breakthrough Institute. Available online at archive/climate_pragmatism_innovation.

Pinker, S. 2018. Enlightenment Now: The Case for Reason, Science, Humanism, and Progress. London, UK: Penguin Books.

Plumer, B. 2017. Glut of natural gas pressures nuclear power, and climate goals, too. The New York Times (June 14): A17.

Prins, G., and S. Rayner. 2007. Time to ditch Kyoto. Nature 449(7165): 973.

Rosling, H., A.R. Rönnlund, and O. Rosling. 2018. Factfulness: Ten Reasons We’re Wrong about the World—and Why Things Are Better Than You Think. New York: Flatiron Books.

Schiermeier, Q. 2018. Droughts, heatwaves and floods: How to tell when climate change is to blame. Nature 560(7716): 20.

Sengupta, S. 2018. The year global warming made its menace a reality. The New York Times (August 9): A1.

Temple, J. 2018. At this rate, it’s going to take nearly 400 years to transform the energy system. MIT Technology Review (March 14).

Tollefson, J. 2018. Can the world kick its fossil-fuel addiction fast enough? Nature 556(7702): 422–425.

Verweij, M., M. Douglas, R. Ellis, et al. 2006. Clumsy solutions for a complex world: The case of climate change. Public Administration 84: 817–843.

The world is losing the war against climate change. 2018. The Economist (August 2).

The year global warming made its menace a reality. 2018. The Economist (August 2).

The gene editing conversation: Public dialogue will require major investments

In 2014 biochemist Jennifer Doudna of the University of California at Berkeley awoke from a nightmare that would shift the focus of her world-class scientific career. Two years earlier, with her colleague Emmanuelle Charpentier, now director of the Max Planck Unit for the Science of Pathogens in Berlin, Doudna had achieved one of the most stunning breakthroughs in the history of biology, becoming the first to use a process called CRISPR-Cas9 to alter the genetic makeup of living organisms. Their “gene-editing” tool would allow scientists to efficiently insert or delete specific bits of DNA with unprecedented precision.

But as applications related to modifying human genes were soon reported in the scientific literature, Doudna began to worry. In the dream, a colleague asked if she would help teach someone how to use CRISPR (Clustered Regularly Interspaced Short Palindromic Repeats). She followed him into a room to be greeted by Adolf Hitler wearing a pig face. The nightmare reinforced her belief that public discussion of the technology was far behind the breakneck pace of its emerging applications. She feared a public backlash that would prevent beneficial forms of gene-editing research from moving forward.

A version of this article appeared in the Jan/Feb 2018 issue of American Scientist magazine

Doudna organized a workshop among scientists, ethicists, and other experts; they published a 2015 paper in Science urging an international summit on the ethics of gene-editing and a voluntary pause in scientific research that would alter the genetic makeup of humans. In a TED talk that year, she called for a global conversation about gene editing so scientists and the public could consider the full range of social and ethical implications. Her 2017 book, A Crack in Creation: Gene Editing and the Unthinkable Power to Control Evolution, coauthored with her former student Samuel Sternberg, follows up on these efforts.

In their book, Doudna and Sternberg systematically review the vast number of applications across the life sciences that CRISPR-Cas9 may enable. With livestock, gene editing can be used to produce leaner meat, to make livestock more resistant to infection, to remove allergens from eggs and milk, to reduce the use of antibiotics, and to achieve other outcomes that benefit human nutrition and animal welfare. In medicine, gene editing is being used to engineer mosquitoes so they no longer spread pathogens such as the malaria parasite or the Zika virus, and mice so they no longer transmit Lyme disease to ticks, thereby reducing infection rates among humans. In other applications, the gene editing of goats, chickens, and rabbits may allow pharmaceuticals to be manufactured more quickly, at higher yields, and at lower cost than by way of traditional laboratory methods. In the future, gene-edited pigs may even be a major source for lifesaving organ transplants, providing tissues that are less likely to be rejected by human patients.

In a process called somatic gene editing, scientists are exploring ways to treat diseases caused by a single mutated gene, such as cystic fibrosis, Huntington’s, and sickle cell disease. The patient’s cells in the affected tissues would be either edited within the body or edited outside and returned to the patient. In both cases, the corrections would not be passed on to offspring. But in terms of human applications, the most widely debated research involves so-called germline gene editing. This process would alter sperm, eggs, and early-stage embryos to protect a child against heritable diseases such as diabetes, Alzheimer’s, and forms of cancer. But such techniques could also potentially be used to select for specific physical traits or to boost human performance by way of denser bones and greater endurance, creating so-called designer babies. In each application, the altered DNA would be copied into every cell as the person developed and would be passed on to his or her progeny.

Not surprisingly, public opinion surveys reveal widespread public reservations about the technology and a firm belief that scientists should consult the public before applying gene-editing techniques to humans. Given the many important considerations that gene editing raises, in 2017 the U.S. National Academies of Sciences, Engineering, and Medicine recommended that scientists invest in ongoing input from the public regarding the benefits and risks of human genome editing, and that more research be conducted to better understand how to facilitate such a process.

But to lead a national and global conversation about gene editing, scientists will need help not only from their colleagues in the humanities, social sciences, and creative arts, but also from journalists and philanthropists. Informed public discussion about gene editing is not possible without high-quality, sustained reporting from journalists with deep knowledge of the subject. And new initiatives designed to understand public attitudes, to facilitate public dialogue, and to report on the complexities of gene editing will not be possible without financial support from philanthropists and their foundations.

A skeptical public

Given that discussion of human gene editing still remains primarily confined to scientific meetings and to elite gatherings such as TED conferences, it is not surprising that a 2016 Pew Research Center survey showed that 42 percent of Americans have heard “nothing at all” about the topic, compared with 48 percent “a little” and 9 percent “a lot.” But polls also show that Americans hold fairly consistent opinions and judgments about gene editing, even as they possess very little information about the complex subject. To form these judgments, individuals actively draw on their religious and cultural values, familiar narratives from popular culture, and similarities to past debates.

For example, in the same Pew survey, when asked about the moral acceptability of gene-editing techniques intended to give healthy babies a reduced risk of disease, only 28 percent of Americans consider the application acceptable, compared with 30 percent who say it is unacceptable and 40 percent who are not sure. Notably, among the one-third of Americans who can be classified as highly religious, only 15 percent consider such applications morally acceptable (see figure above). When asked separately if such an application meddled with nature and crossed a line that should not be crossed, 64 percent of highly religious Americans agreed with the statement.

For many religious Americans, gene editing is likely closely associated with past debates over embryonic stem cell research and fetal tissue research. In these controversies, Christian leaders mobilized opposition to government funding by framing research as a violation of religious teachings. From a traditional Christian perspective, human life begins at conception and is created in God’s image. Embryos are considered to be divinely created human beings. When scientists destroy or alter human embryos, they take on the role of God, violating divine will. Therefore, traditional Christians believe that embryo research is morally wrong and that if it is funded by the government using tax revenues, such funding makes all Americans complicit in destroying human life. In the Pew survey, for example, among those who said gene editing was morally unacceptable, more than one-third of responses made reference to changing God’s plan or violating his will.

But as various survey findings indicate, it is not just strongly religious Americans who have moral reservations about gene editing. Even among nonreligious Americans, 17 percent say that gene editing to give babies a much reduced risk of disease is morally unacceptable, and 37 percent say they are unsure. In a follow-up question, more than one-quarter of nonreligious respondents say they oppose gene editing to improve the health of a baby because it would be meddling with nature and cross a line that should not be crossed. When asked more specifically if saving a baby’s life required testing on human embryos or altering the genetic makeup of the whole population, about half of all Americans say that such scenarios would make the application less acceptable to them (see figure above). A 2016 survey conducted by Harvard University’s Chan School of Public Health finds even stronger reservations. In this case, when asked about changing the genes of unborn babies to reduce their risk of developing certain serious diseases, 65 percent of Americans said that such an application should be illegal. More than 80 percent said the same when asked about gene editing to improve intelligence or physical traits.

What explains the reservations voiced by both religious and nonreligious Americans? Bioethicists have used the term Yuck Factor to describe a “visceral repugnance” and “emotional opposition” felt by the public when they first hear about human genetic engineering. This repugnance, wrote University of Chicago ethicist Leon Kass in an oft-cited 1997 article in the New Republic, is an “emotional feeling of deep wisdom” that leads an individual to “intuit and feel, immediately without argument, the violation of things that we rightfully hold dear.” The Yuck Factor likely has its origins in Kantian and Christian philosophies of human dignity that permeate Western culture. These traditions, as political theorist Francis Fukuyama of Stanford University described in his 2002 book Our Posthuman Future, emphasize that human life has a higher moral place than the rest of the natural world. Therefore, according to these philosophies, even at its earliest stages of development, human life should always be treated with a sacred respect.

Such teachings have shaped Western culture to the extent that their principles are passed on even to those who have never set foot in a church. The Yuck Factor is therefore a relatively intuitive response, a reaction formed below the level of conscious deliberation on the part of an individual, often in the absence of substantive information. When asked about emerging gene-editing techniques that would involve altering human embryos or engineering desired traits, most individuals probably have difficulty articulating why they might believe it to be morally questionable; they just know it when they feel it.

Journalism matters

Although scientists hold a responsibility to engage the public about the social implications of gene editing, informed public dialogue ultimately depends heavily on journalists and their news organizations. Quality science reporting is essential to understanding how and why gene-editing research is being conducted, including the connections between new advances and ongoing debates over funding, governance, regulation, ethics, accessibility, uncertainty, and patent rights. Even in today’s dramatically altered media landscape, coverage in print and online, at both traditional and new media outlets, still drives discussion of complex issues such as gene editing. These news organizations provide the information, frames of reference, and narratives that scientists, journalists, funders, policy makers, and societal leaders frequently draw upon to set policy, make decisions, or communicate with various segments of the public who trust their advice.

Yet for the past two decades, the news media have faced crippling economic and technological disruptions that have forced cutbacks in the amount of reporting on complex science topics such as gene editing. As University of Wisconsin-Madison communication scholar Dietram Scheufele has documented, layoffs have also left far fewer veteran journalists on staff who can draw on decades of experience to provide their readers critical context. Industry practices within journalism have also changed. In a business model dependent on Facebook and Google to generate traffic and advertising revenue, former New Republic editor Franklin Foer warns, journalists are told by their editors to actively seek out trending topics likely to catch on or go viral, rather than relying on their news judgment to decide which stories are most important to tell readers. As a consequence, coverage of gene editing loses out to the latest sensational cultural event or breaking political scandal. When gene editing is covered, headlines and story angles may exaggerate the technology’s promise and peril in an effort to win scarce reader attention.

Now is the time, therefore, for scientists and philanthropists to help journalists and news organizations to correct for these pressures and biases. They can do so by sponsoring workshops where a diversity of experts and stakeholders gather to discuss with journalists and editors the scientific, ethical, and legal implications of gene editing, making it easier for journalists to cover gene editing accurately and on a regular basis. Philanthropists, universities, and research institutions can also provide fellowships and other sources of financial support that enable journalists to spend the weeks and months required to substantively report on the subject.

But journalists are not the only professionals who are needed to write compellingly about the scientific and social implications of gene editing. Scientists, ethicists, and social scientists can also contribute commentaries and articles to the popular press, offering independent insights and context. In one initiative to help facilitate such articles, the Kavli Foundation is partnering with the Alan Alda Center for Communicating Science and a number of science magazines and online publications (including American Scientist) to train scientists to apply the techniques and standards of journalism in writing about complex topics such as gene editing.

Investing in dialogue

Yet even as quality journalism provides the main architecture around which informed debate about gene editing will take place, the scientific community, along with universities, philanthropies, and research institutions, must also help create opportunities for direct public participation in dialogue and deliberation. Such an effort starts with the sponsorship of carefully conducted social-science research that assesses public discourse about gene editing, the sources of information and arguments that are shaping debate, and the factors that are influencing public attitudes. In turn, this research should inform the design and evaluation of a variety of dialogue-based communication initiatives organized by scientific organizations, government agencies, and universities.

Over recent decades, across Europe and North America, efforts to promote dialogue-based science communication have taken various forms, but as the University of Calgary’s Edna Einsiedel notes, each format shares a few common principles. First, in these initiatives, communication is defined as an iterative back-and-forth process between various segments of the public, experts, and decision-makers. Such approaches assume that there is no single “correct” way to talk about and understand the social implications of a complex subject such as gene editing. Second, rather than communication being top-down and controlled by scientists and their partners, societal leaders and the public are invited to be active participants in defining what is discussed, sharing their own knowledge and perspectives. Third, there is no single “public” with which to communicate or engage, but rather multiple “publics” exist. These include but are not limited to church leaders and congregations, racial or ethnic groups, parents and patient advocates, and political identity groups such as liberals or conservatives.

Among the most important types of organized dialogue initiatives are smaller, more intimate events that bring together scientists with other societal leaders to facilitate the sharing of perspectives, and the forging of relationships. In one leading example, the Dialogue on Science, Ethics, and Religion (DoSER) at the American Association for the Advancement of Science has organized workshops that convene scientists and clergy to discuss topics of mutual concern and possible disagreement such as embryonic stem cell research. To inform the discussion, focus groups were conducted in advance of the events, and the meetings were professionally facilitated. Scientists and clergy participating in the meetings indicated that the sessions helped break down stereotypes about each other, facilitating learning and mutual respect. In a related initiative, DoSER has worked with seminary schools and synagogues to develop curricula and resources that aid clergy in leading more constructive conversations about complex scientific topics with their congregations.

As these examples suggest, it is important to remember that religion is more than just a belief system that shapes how people understand gene editing. Churches are communication contexts where discussions can at times be framed in strongly moral terms by congregational leaders, reinforced by conversations that churchgoers have with others, and shaped by information provided directly at church. For these reasons, on a topic such as gene editing, churches often serve as powerful networks of civic recruitment where congregants receive requests to voice their opinion to elected officials. During the debate over embryonic stem cell research, for example, among the strongest predictors of whether individuals had become involved politically on the issue was whether they had discussed or received information about the topic at church. In sum, when it comes to public dialogue about gene editing, scientists can either cede communication at churches to religious leaders or become active partners in facilitating and enriching church-based discussions.

Yet to promote broader public engagement across both religious and nonreligious segments of the public, the scientific community can also benefit by partnering with experts specializing in the humanities, philosophy, and the creative arts. Scholars in the humanities and philosophy draw on literature, religious traditions, and ethical frameworks to help the public consider what is good, what is right, and what is of value about a complex topic such as gene editing. Writers, artists, filmmakers, and other creative professionals are among society’s most inspiring storytellers about complex issues, and they are able to communicate about gene editing in imaginative, compelling, and novel ways. Integrated into public dialogue initiatives, their work can motivate different forms of learning, sponsor critical reflection and deliberation, and produce thought-provoking visions of the future.

In a past example that serves as a prototype for such initiatives, faculty at the University of Alberta in Canada hosted workshops in 2008 that facilitated discussions about the social implications of human genetic engineering among visual artists, scientists, bioethicists, and social scientists. Informed by their conversations together, the artists were commissioned to produce visual works reflecting on the themes discussed, while the other participants were asked to write short essays. The project culminated in the exhibit “Perceptions of Promise: Biotechnology, Society, and Art,” which toured North America. As part of the exhibit tour, forums were held at museum venues, generating local news coverage of the themes expressed. The essays along with the artistic works were published as part of a book and catalog sold at art museums, bookstores, and online.

Sean Caulfield & Roy Mills, University of Alberta; End Point, from “Perceptions of Promise: Biotechnology, Society, and Art” exhibition at the Glenbow Museum, Calgary, Alberta, November 2008–January 2009.

Apart from artistic exhibits, classic works of literature and films can also serve a similar function in stimulating public dialogue. For example, the 1997 film Gattaca is often used in college classrooms to stimulate student discussion of the social implications of human engineering. Research suggests that rather than alarming audiences, science fiction TV and film portrayals may help familiarize viewers with the moral dimensions of human genetic engineering, thereby helping them overcome their intuitive Yuck Factor reservations. This year, in recognition of the 200th anniversary of the publication of Frankenstein, faculty at Arizona State University have published an annotated version of the novel that also features essays from scientists and scholars in the humanities and social sciences. With support from the National Science Foundation, the university is also coordinating nationwide events and activities at science museums and centers, which include exhibits, an online multimedia game, and at-home activities for use by parents. Each is carefully designed to foster discussion about the social and ethical dimensions of gene editing and other technological innovations.

Broad-based initiatives such as these may be beyond the capacity of many to organize or fund. Major investments in public dialogue and in supporting high-quality journalism about gene editing will take coordinated action from leaders of the scientific community and their peers across fields including the news media and philanthropy. But scientists and others should not overlook the contributions to public dialogue they can make starting right now. University scientists, by way of their classrooms and new degree programs, can partner with their peers in the social sciences and humanities to equip students with the knowledge and skills they need to think critically about the future of gene editing and similar advances. One model to emulate is the undergraduate major in Biology and Society at Cornell University. Among the most popular on campus, the major enables students to combine foundational training in the biological sciences with coursework in science communication, the social sciences, and the humanities.

Within their local communities, individual scientists can also actively encourage discussions about gene editing by way of informal conversations and by volunteering to give presentations to community groups, connecting with others by way of shared interests, values, and identities. Ultimately, for Jennifer Doudna, her goal is to motivate the next generation of scientists to engage much more actively and directly with the public, applying the principle of “discussion without dictation” on how gene editing should be used. All scientists, regardless of discipline, she argues in her recent book, must be prepared to participate in conversations with the public about the far-reaching consequences of gene editing and similarly powerful technologies.


Nisbet, M. (2018). The Gene-Editing Conversation. American Scientist, 106(1), 15-19.


Caulfield, S., C. Gillespie, and T. Caulfield (eds.). 2011. Perceptions of Promise: Biotechnology, Society and Art. Edmonton, Canada: University of Alberta Press.

Doudna, J., and S. H. Sternberg. 2017. A Crack in Creation: Gene Editing and the Unthinkable Power to Control Evolution. New York, NY: Houghton Mifflin.

Einsiedel, E. F. 2014. Publics and their participation in science and technology. In M. Bucchi and B. Trench (eds.). Routledge Handbook of Public Communication of Science and Technology. New York, NY: Routledge.

Foer, F. 2017. World Without Mind: The Existential Threat of Big Tech. New York, NY: Penguin.

Fukuyama, F. 2003. Our Posthuman Future: Consequences of the Biotechnology Revolution. New York, NY: Farrar, Straus, and Giroux.

Funk, C., B. Kennedy, and E. P. Sciupac. 2016. U.S. Public Wary of Biomedical Technologies to “Enhance” Human Abilities. Washington, DC: Pew Research Center. Published online June 26, updated November 2.

Kass, L. 1997. The wisdom of repugnance. The New Republic, June 2, pp. 17-26.

National Academies of Sciences, Engineering, and Medicine. 2017. Human Genome Editing: Science, Ethics, and Governance. Washington, DC: The National Academies Press.

Nisbet, M. C. 2005. The competition for worldviews: Values, information, and public support for stem cell research. International Journal of Public Opinion Research 17:90-112.

Nisbet, M., and E. M. Markowitz. 2014. Understanding public opinion in debates over biomedical research: Looking beyond political partisanship to focus on beliefs about science and society. PloS One 9(2):e88473.

Scheufele, D.A. 2013. Communicating science in social settings. Proceedings of the National Academy of Sciences, 110(Supplement 3):14040-14047.

Scheufele, D. A., et al. 2017. U.S. attitudes on human genome editing. Science 357:553-554.

Shapshay, S. (ed.). 2009. Bioethics at the Movies. Baltimore, MD: Johns Hopkins University Press.

Models of knowledge-based journalism: Brokering knowledge, dialogue, and policy ideas

April 1, 2017— In 2013’s Informing the News, the eminent journalism scholar Thomas Patterson comprehensively reviewed the evidence in support of the well-worn criticisms of our contemporary news system. Journalists too often give equal weight to accurate representations and to faulty facts and flawed opinions; focus on conflict and strategy over substance; and favor personalities, dramatic events, and infotainment over big-picture analysis and context. These trends are unlikely to change unless journalists more deeply understand the subjects they cover and how their stories can affect societal decisions, he concluded. Patterson called for a new “knowledge-based journalism” in which reporters excelled not only at interviewing, investigating, and storytelling but also in applying relevant specialized expertise. “If news is to be a means of getting people to think and talk sensibly about public affairs, it needs to contain the contextual information that enables citizens to make sense of events,” he urged.[1]

A version of this article appears in the Oxford Handbook of the Science of Science Communication.

The challenge for news organizations, argued Patterson, is not to cater to audience interests but to take important issues such as climate change and make them interesting. News organizations investing in knowledge-based journalism are more likely to produce content that audiences search for and recommend to others. Such high-quality content can help repair news organizations’ sagging reputations and boost their finances by giving an outlet enduring relevance and audience share in an ultra-competitive world of many online choices, he argued.

From across the Atlantic, the late German communication researcher Wolfgang Donsbach echoed Patterson’s call for journalism to stake out its role as society’s “new knowledge profession.” A specialized understanding of an expert field enables journalists to make “sound judgments on the newsworthiness of events,” he wrote. “Only then can they ask critical questions to the actors, find the right experts, and only then can they resist infiltration of non-professional factors into their decision-making.”[2]  Not only is “content” knowledge of a subject such as economics or environmental science needed, argued Patterson and Donsbach, but so too is “process” knowledge. This second dimension includes recognition of the factors that influence journalists’ news judgments, as well as the effects of news coverage decisions on audiences. Process knowledge can, for example, be applied by journalists to guard against personal biases and mistakes, to choose among different storytelling techniques that more effectively engage audiences, and to take advantage of various digital tools to enhance understanding and reach.

Building on these seminal ideas, in a 2015 essay we identified specific knowledge-based journalistic practices and media structures that might enable more constructive debate in science controversies. In doing so, we introduced three complementary models for doing knowledge-based journalism on which we elaborate in this chapter: the knowledge broker, dialogue broker, and policy broker. By combining these approaches in coverage of politicized debates, journalists and their news organizations can contextualize and critically evaluate expert knowledge and competing claims, facilitate discussion that bridges entrenched ideological divisions, and promote consideration of a broader menu of policy options and technologies.[3]

To further illuminate these models, in this chapter we draw on examples of veteran journalists whose work can serve as an inspiration for new generations of professionals. Because they are experts in their fields and have years of experience covering relevant topics or beats, these veteran journalists are able to fuse complex knowledge and on-the-ground reportage into a storyline that is clear, readable, and engaging to a broader audience. They connect the dots for readers, offering a wider lens, bigger picture, and evaluation of complex ideas and fast-moving trends. As knowledge-based journalists they often engage in deductive analysis across cases and issues, working from the top down, drawing connections, making inferences, theorizing about causes and solutions, and offering judgments. They combine the habits of mind of a scholar with the skills of a master storyteller, providing the context and explaining the ideas that enable citizens to make sense of complex science controversies and trends.

Knowledge Brokers

In the first model, journalists play an essential role as “knowledge brokers,” unpacking the process of expert knowledge production for their readers, examining how and why scientific research was done, sometimes positing alternative interpretations or drawing connections to ongoing debates about a complex problem such as mental health, climate change, or infectious disease. Knowledge brokers focus on the institutions, assumptions, ideologies, political factors, and personalities that influence the production and interpretation of scientific research. Through this perspective, readers learn not only about the basic facts of science, but also how scientific research is conducted, interpreted, communicated, and contested. These veteran journalists often apply “weight-of-evidence reporting,” a technique in which journalists seek out and convey where the preponderance of expert opinion lies on an issue.[4] Yet most journalists who apply this valuable idea strongly defer to expert judgment and do “not get into the weeds of the scientific evidence.”[5] Knowledge brokers go further, probing deeper into the specialized research they write about, examining how and why it was produced, synthesizing and comparing findings across disciplines, and evaluating its usefulness when applied to proposed solutions.

Somewhat paradoxically, only by way of this critically motivated reporting can public trust in science be maintained. Rather than portray science and scientists as truth’s ultimate custodians, knowledge brokers reveal for readers how science really works. When controversies related to fraud, bias, interpretation, scandal, hype, honest errors, or conflicts of interest emerge, those who are attentive to this form of journalism are more likely to be able to judge whether such behaviors are outliers or the norm. Just as peer review and other established norms within science serve as correctives to such failures, knowledge brokers, as outsiders, fulfill a similarly vital and complementary role.

Across several decades, as a prototypic knowledge broker, Scientific American staff writer John Horgan pioneered a valuable style of science criticism. Dissatisfied with the constraints of traditional reporting, he turned to more opinion-based, interpretative reporting while also looking for “exaggerated or erroneous scientific claims” to question and debunk. “I convinced myself that that was actually a good thing to do because science had become such an authority that there was a need for a scientific critic …,” he noted.  “It’s a paradox: it’s using subjectivity to ultimately get a more clear, objective picture of things.”[6]

In his award-winning reporting, Horgan not only skewered the exaggerated claims of scientists who promised world-changing discoveries, but also grappled with the ideas of philosophers of science. These themes coalesced in the 1996 best-seller The End of Science, in which Horgan argued that science had been so successful in its description of the natural world that it had reached the limits of its knowledge. No new scientific frameworks will surpass the explanatory power of Darwinian natural selection and genetics in biology or the standard model in physics, he argued.[7] The End of Science crystallized Horgan’s signature critical perspective, which offered readers a consistently skeptical evaluation of the limitations of scientific knowledge. In 1999, Horgan expanded on this perspective in The Undiscovered Mind, arguing that behavioral genetics, evolutionary psychology, cognitive science, and other fields had still not delivered a conclusive theory of consciousness and personality, or provided satisfying answers to other big questions.

The author of two subsequent books, Horgan also applies his critical approach in his long-running Scientific American “Cross-Check” blog, a format that benefits from his strong personal voice and trademark skepticism.  “I think that science is ill-served by its own public relations…,” he says. “I actually like to think that I’m doing good deeds for science itself and helping dispel some of these illusions that people have about science . . . I think science needs it.”[8] Inspired by the philosopher Karl Popper’s insights on the tentative, provisional nature of science, Horgan’s longstanding goal is to impart a form of hopeful skepticism which can “protect us from our own lust for answers while keeping us open-minded enough to recognize genuine truth if and when it arrives.”[9]

Veteran environmental journalist Andrew Revkin, who currently writes for ProPublica, is a second example of a knowledge broker. At his former New York Times “Dot Earth” blog, he frequently warned about the tendency for research institutions and journals to hype scientific findings about climate change and to overlook the inherent uncertainty in research. This hyping becomes amplified by advocates, journalists, and bloggers on either side of an environmental debate and by news organizations and reporters who have a strong incentive to always search for “the front page thought.” Consider the role that Revkin played as a knowledge broker in relation to a 2015 study published by the climate scientist James Hansen. Using evidence from complex computer modeling, Hansen and his sixteen co-authors warned that polar ice sheets are likely to melt at a far faster rate than previously estimated. Within a few decades, coastal cities from Boston to Shanghai could be under water, risking military conflict, mass migration, and economic collapse that “might make the planet ungovernable, threatening the fabric of civilization,” warned Hansen and his colleagues.[10]

Despite the alarming conclusions, Hansen’s study occupied an ambivalent, unsettled position within the tradition of peer-reviewed publication. It was submitted to the journal Atmospheric Chemistry and Physics where much of the peer review process occurs online in an open-access format. Over a period of months, experts are asked to read the paper and post substantive online comments. Only after reviewing the amassed expert comments do the editors decide whether the paper will be accepted for formal publication. But before the paper was posted online to undergo review, Hansen worked with a public relations firm to distribute the paper to journalists and to hold a telephone press conference at which they could ask questions. His goal, he told reporters, was to influence the outcome of international climate change negotiations to be held at the end of the year.

“Climate Seer James Hansen Issues His Direst Forecast Yet,” was the over-the-top headline of a Daily Beast article that followed the press conference. The implications of Hansen’s findings are “vast and profound,” wrote the reporter. The “blockbuster study” and its “apocalyptic scenario” presents a “huge headache for diplomats,” exploding the all too modest goals of climate diplomacy.[11]  “Earth’s Most Famous Climate Scientist Issues Bombshell Sea Level Warning,” was the same-day headline at Slate magazine. The implications of Hansen’s “breathtaking new study” are “mindboggling,” Slate told its readers. “New York City—and every other coastal city on the planet—may only have a few more decades of habitability left.”[12]

Journalists at The New York Times, Associated Press, the BBC, and The Guardian were among those who chose not to cover the paper, judging it premature to run a story before peer review had begun. Revkin at his Dot Earth blog chose an alternative strategy. In two lengthy posts, he did not merely report the specific findings of the study; instead he analyzed the authors’ apparent motivations, relating to readers Hansen’s career arc as “climatologist-turned-campaigner.” Revkin also identified key differences between arguments in the online discussion paper posted at the journal and the supporting materials supplied to journalists, which included claims that dramatic sea level rise was “likely to occur this century.” He also posted replies to emails he had sent requesting reactions to the paper from leading climatologists, many of them critical of the assumptions employed by Hansen and his colleagues.

Drawing on correspondence with two geologists, Revkin filed a review at the journal’s site arguing that Hansen’s paper contained geological evidence that could be considered too one-sided.  Other commenters at the journal subsequently questioned Revkin’s expertise. “Scientific review,” wrote one, “is for those who *know the topic* to comment, and it’s abundantly clear, that ain’t you.” Revkin in response asked the journal to clarify who was included in the “scientific community,” and who had authority to comment as part of the open review process. He then related this exchange back to the readers of his New York Times blog, including excerpts and links so that readers could follow up in more detail.[13]

As scholar Morgan Meyer writes, journalists as knowledge brokers can do more than just assess or critique science: they can also transform expert knowledge by offering new interpretations and conclusions that subsequently influence the thinking of scientists.[14]  Laurie Garrett is a leading example of this knowledge broker function. Her early-career reporting from the frontlines of global public health threats culminated in the 1994 book The Coming Plague: Newly Emerging Diseases in a World Out of Balance. That book told the story of the global spread of viruses such as HIV, tuberculosis, malaria, and Ebola, detailing how humans abetted the rise and resurgence of these infections through weak global public health systems, misuse of antibiotics and antivirals, local warfare, and refugee migration.

In The Coming Plague, Garrett integrated a diversity of disciplines into a new way of understanding infectious diseases, framing them as a unified problem manageable only by approaches that are informed by interdisciplinary research. Her work raised public awareness of infectious disease by showing readers the devastation wrought by these new plagues, boosting the profile, prestige, and funding of researchers and organizations combating diseases.[15] In 2000, Garrett followed with Betrayal of Trust: The Collapse of Global Public Health, in which she argued for a systemic solution to protect populations around the world from lethal epidemics.[16] The book’s critique of health policy moved her work into the political realm. In 2004, she became a senior fellow for global health at the Council on Foreign Relations, where she combines the roles of reporter, researcher, and expert commentator, authoring popular articles, policy reports, and even serving as a script consultant to the 2011 Hollywood thriller “Contagion.”

Dialogue Brokers

As news organizations invest in a range of innovative digital and online initiatives, a second complementary strategy for doing knowledge-based journalism is likely to prove particularly relevant. In this “dialogue broker” model, an expert journalist uses blogging, podcasts, video interviews, Twitter, Facebook, and other social media tools to convene discussions among a network of professionally and politically diverse contributors and readers.

This approach, which connects a range of contributors, is an example of networked journalism.[17] But the dialogue broker method is also driven by a view that dialogue can help readers understand the viewpoints of others and accept the fact that they disagree. New York University’s Jay Rosen argued that complex, polarized debates such as those over climate change or biotechnology are unlikely to reach political consensus. But he wrote: “what’s possible is a world where different stakeholders ‘get’ that the world looks different to people who hold different stakes.”[18]

What is needed in this scenario are knowledge-based journalists who convene discussions that force critical reflection and examination rather than playing to an ideologically like-minded audience. By way of blog posts and other digital tools, dialogue brokers feature multiple, contrasting perspectives while offering context on the scientific and policy arguments being made. Their original posts are often updated in light of new developments, reactions from other journalists and experts, and feedback from readers. This is “a journalism of linking rather than pinning things down, that is situated within a model of knowledge-as-process rather than knowledge-as-product,” writes new media scholar Donald Matheson.[19]

A dialogue-based form of networked journalism reflects many of the arguments of social theorists studying the politically contested terrain of issues such as climate change. As Rayner argues, progress lies not in staking out a hardline position on a contested terrain and then castigating those who disagree, but in recognizing and understanding multiple positions and finding ways to negotiate constructively among them. Dismissing alternative perspectives not only weakens our ability to understand the complexity of these issues but also risks the loss of legitimacy and trust among key constituencies, he warns.[20]

At his New York Times Dot Earth blog, Revkin not only functioned as an explainer and informed critic of science (knowledge broker) but also served as a skilled convener (dialogue broker), using his blog and a variety of other digital tools to facilitate discussions among experts, advocates, and readers while contextualizing specific claims. His role as convener and dialogue broker at Dot Earth was informed by his reading of research in the social sciences, which challenged his long-held assumption as a journalist that the “solution to global warming was, basically, clearer communication: …If we could just explain the problem more clearly, people would see it more clearly, and then they would change.”[21] To foster dialogue with readers at Dot Earth, he preferred posing questions and describing the answers he received from experts and others. Revkin viewed his role as “interrogatory – exploring questions, not giving you my answer…I think anyone who tells you they know the answer on some of these complex issues is not being particularly honest.”[22]

Nathanael Johnson’s 2013 series on genetically modified (GM) food is a second example of the dialogue broker approach. His goal was to go beyond the polarized thinking on the topic, and he ended up brokering a conversation between critics and proponents of the technology. Through that dialogue, he promoted a shared understanding of why people disagree so strongly on the subject. There is obvious value in journalists brokering such a conversation for their audiences, especially on an issue such as GM food, on which many readers doubt the technology’s safety and distrust the scientists who argue on its behalf. As Johnson wrote:

“If you try to cross-check the claims of people on either side of the GM debate, you run into problems, because these warring clans speak different dialects. Their foundational assumptions point them in opposite directions, facing different landscapes and talking past each other. This can leave outsiders feeling that someone is lying. But often the miscommunication comes down to a difference in perspectives.”[23]

Policy Brokers

Given the complexity of science controversies, and the difficulty involved in falsifying predictions about the future, it is possible for equally plausible narratives about effective policy options and solutions to exist. This ambiguity presents the opportunity for advocates to promote prescriptions that align with their vision of a “good society.” As environmental studies scholar Roger Pielke Jr. aptly notes, wickedly complex problems such as climate change become “a bit like a policy inkblot on which people map onto the issue their hopes and values associated with their vision for what a better world would look like.”[24] In the face of such ambiguity, journalists play a key role by helping to construct a common outlook and language among networks of experts, advocates, and political leaders that aids in the coordination of decisions and actions. Yet if one problem definition and set of solutions is prioritized in news coverage to the exclusion of others, such influence can lock in powerful forms of groupthink that dismiss valuable alternative interpretations and courses of action.[25]

What is needed then is a style of knowledge-based journalism that can counter groupthink and defuse polarization in science controversies by expanding the range of policy options and technologies under consideration by the public and political community. This policy broker model for journalists is informed by research by Pielke Jr., who demonstrates through a series of case studies that the broader the menu of policies and technologies available in science-related debates, the greater the opportunity for decision-makers to reach agreement on paths forward.[26] Writing about the climate change debate, he argued that much of the political argument over scientific uncertainty would fade once new technologies made meaningful, low-cost action on climate change possible, making it easier to gain support from across the political spectrum and from developed and developing countries alike. For example, he argued in a 2013 coauthored article that carbon capture technology that limits emissions from coal and natural gas power plants could “transform the political debate” because the technology “does not demand a radical alteration of national economies, global trade, or personal lifestyles” and therefore “enfranchises the very groups that have the most to lose from conventional climate policies.”[27]

These conclusions are similar to those of Dan Kahan and colleagues studying the process by which the public forms opinions about controversial science topics (see Kahan, chapter…). Their findings suggest that perceptions of culturally contested issues such as climate change are often policy and technology dependent, and that polarization is likely to be defused under conditions where the focus is on a diverse rather than a narrow set of options. “People with individualistic values resist scientific evidence that climate change is a serious threat because they have come to assume that industry-constraining carbon emission limits are the main solution,” argues Kahan. “They would probably look at the evidence more favourably, however, if made aware that the possible responses to climate change include nuclear power and geoengineering, enterprises that to them symbolize human resourcefulness.”[28]

Consider how these principles apply to the role of journalists as policy brokers in the debate over climate change. Between 2007 and 2010, among those lobbying for action to address the issue, the focus was on setting a global price on carbon that would catalyze a “soft energy path” revolution, shifting the economy from a reliance on fossil fuels to dependence on wind, solar, and energy efficiency technologies. In contrast, there was much more limited attention to advanced “hard energy path” technologies such as nuclear energy or carbon capture and storage that would help to reduce emissions in ways far less transformative to the global economy.[29] In the years since, several journalists serving in the role of policy broker have helped to diversify the range of technological options considered in the climate debate, calling greater attention to hard energy path technologies and government-led innovation strategies. These journalists challenged longstanding claims by many environmentalists and activists that solar, wind, and other renewables are the only energy technologies needed to combat climate change. In doing so, they shifted policy debate away from the narrow goal of making fossil fuels more costly to a broader focus on making a diverse portfolio of low carbon technologies less expensive.[30]

In a series of columns leading up to and during the 2015 United Nations summit on climate change, The New York Times’ “Economic Scene” columnist Eduardo Porter was among the more prominent journalists playing the role of policy broker by questioning the conventional assumptions of climate advocates. Porter brought a unique perspective and background to the topic. Holding two degrees in physics, the twenty-year veteran reporter had covered business, finance, and politics from Brazil, Tokyo, London, Mexico, and Los Angeles before joining The Times in 2004 as an editorial specialist on economics.

In his columns, Porter critically assessed arguments that narrowly focused on soft energy paths and energy efficiency strategies. He also strongly challenged journalists and academics on the left flank of the environmental movement who argued that solving climate change also necessitated a halt to economic growth and an end to the global capitalist system. These longstanding arguments had recently gained considerable attention by way of Naomi Klein’s 2014 international best-seller This Changes Everything: Capitalism vs. The Climate.[31]

Porter countered that proposals like Klein’s pushing a 100 percent renewables and efficiency strategy “too often lack strong analytical foundations, and are driven more by hope than science.” In this case, “the goal of bringing the world’s carbon emissions under control is put at the service of other agendas, ideological or economic, limiting the world’s options,” he concluded. As an alternative path, Porter wrote that success on climate change “requires experimenting intensely along many technological avenues, learning quickly from failures and moving on.”[32] Drawing on various studies and analyses, Porter argued for investment in carbon capture and storage technologies and for the expansion of nuclear power. These technologies, in combination with renewables, would be needed to rapidly decarbonize the world economy while meeting the demands for growth from India, China, Africa, and the rest of the developing world. They would also be required as backup power sources for intermittent solar and wind technologies.

Porter similarly warned that arguments promoting negative economic growth threatened to derail the UN climate negotiations. “Whatever the ethical merits of the case, the proposition of no growth has absolutely no chance to succeed,” he wrote. Interviewing historians and economists, he noted that by reducing the competition for scarce resources, economic growth over the past century had delivered enormous societal benefits, helping to reduce war and conflict, enabling consensual politics and democracy, and empowering women. Even if Klein and her allies were correct that climate change meant the upending of capitalism and globalization, Porter doubted “it would bring about the workers’ utopia” they imagined. Instead, in a world without economic growth, the competition for scarce resources would mean that the powerless and most vulnerable were the most likely to suffer, he warned. Rather than putting an end to capitalism, the world’s poor would be better served by the development of a broad menu of new energy technologies that shift the world away from fossil fuels.[33]

Journalism in Turbulent Times

The advent in recent years of several innovative digital news ventures focused on deeper forms of explanatory, analytical, and data-driven journalism suggests that at least some news industry leaders, investors, and philanthropists have recognized the need for new forms of knowledge-based journalism. In 2015, the billionaire owner of The Boston Globe launched STAT, a deep vertical digital news organization covering the health, medical, and life sciences. “Over the next 20 years, some of the most important stories in the world are going to emerge in the life-sciences arena,” said STAT founder John Henry. The goal of STAT is to be “the country’s go-to news source for the life-sciences.”[34] To report on and analyze the life sciences, STAT hired a roster of knowledge-based journalists with dozens of years of combined experience covering the beat. Examples include regular columnist Sharon Begley, who “goes behind the headlines to make sense of scientific claims,” and Ivan Oransky and Adam Marcus of the Retraction Watch blog, who focus on issues of misconduct, fraud, and scientific integrity.

In other examples, the startup news site Vox, co-founded in 2014 by former Washington Post “Wonkblog” writer Ezra Klein, focuses on explanatory journalism with a type of Wikipedia-like tagging of terms and concepts that gives readers in-depth background on an issue, delivered by way of the latest digital design techniques.[35] Launched in 2014, “The Upshot” at The New York Times is a blog-like section that aims to enhance reader understanding of the news through analysis and data visualization, with contributions from journalists and academics, enabling readers to “grasp big, complicated stories so well that they can explain the whys and hows of those stories to their friends, relatives and colleagues.”[36] The Washington Post soon followed, creating a series of science, technology, and environment-focused blogs in which journalists contribute daily reporting, analysis, and commentary. The online startups Buzzfeed and Mashable have hired veteran science journalists to contribute deeper reported news stories. Bloomberg, Politico, and Energy & Environment News have invested in deep vertical coverage of science, technology, and environmental policy, respectively, funded by way of subscriptions and advertising that target the business, advocacy, and lobbying communities. Philanthropists and foundations have also underwritten the launch of notable non-profit news ventures such as Inside Climate News, Climate Central, and The Conversation, while continuing to support coverage at outlets like Mother Jones, The Nation, and public radio.

These for-profit and non-profit news ventures are not without their limits and trade-offs, have yet to prove their sustainability, and deserve critical scholarly analysis. Among the relevant questions: how do audiences interpret the mix of news, analysis, and opinion found across these outlets, especially as content is accessed, shared, and commented on by way of social media? How do knowledge-based journalists gain and maintain their credibility and following in an era of partisan audiences? What influence does the advertising, subscription, and funding model of a news organization have on journalistic decisions and the interpretation of complex issues like climate change or food biotechnology?

For many university journalism programs, these new media ventures and questions are the latest evidence that they need to rethink their traditional trade school focus on interviewing and storytelling skills. Indeed, with journalism programs under pressure because of languishing enrollment, their future may depend on shifting to more effectively meet the needs of society and the profession: less on enrolling undergraduate majors and master’s students, and more on retraining students and professionals with backgrounds in specialized fields, offering them a variety of minors, certificates, badges, short courses, and fellowships. In this regard, philanthropists can play a vital role, underwriting specialized programs that meet the need for a new kind of knowledge-based journalist and communicator. At the University of Toronto, for example, a unique program recruits academics and professionals with existing subject matter expertise and trains them to pitch stories to news organizations as freelance journalists covering their own disciplines.[37] In all, the complementary models and examples of knowledge-based journalism that we describe in this chapter are a starting point to learn from and evaluate. Research, vision, and leadership will be needed to bring about the necessary shifts in how journalism covers science and its various controversies, but there are already many bright examples to build on.


Nisbet, M.C. & Fahy, D. (2017). Models of Knowledge-based Journalism. In Jamieson, K.H., Scheufele, D.A. & Kahan, D. (eds), The Oxford Handbook of the Science of Science Communication. New York: Oxford University Press, 273-282.


[1] Patterson, Thomas E. Informing the news. Vintage, 2013, p. 93.

[2] Donsbach, Wolfgang. “Journalism as the new knowledge profession and consequences for journalism education.” Journalism 15, no. 6 (2014): 661-677, 668.

[3] Nisbet, Matthew C., and Declan Fahy. “The Need for Knowledge-Based Journalism in Politicized Science Debates.” The ANNALS of the American Academy of Political and Social Science 658, no. 1 (2015): 223-234

[4] Dunwoody, Sharon. “Weight-of-evidence reporting: What is it? Why use it?” Nieman Reports 59, no. 4 (2005): 89-91.

[5] Kohl, Patrice Ann, Soo Yun Kim, Yilang Peng, Heather Akin, Eun Jeong Koh, Allison Howell, and Sharon Dunwoody. “The influence of weight-of-evidence strategies on audience perceptions of (un)certainty when media cover contested science.” Public Understanding of Science (2015). DOI: 10.1177/0963662515615087

[6] Fahy, Declan, and Matthew C. Nisbet. “The science journalist online: Shifting roles and emerging practices.” Journalism: Theory, Practice & Criticism 12, no. 7 (2011): 778-93, 787.

[7] Horgan, John. The end of science: Facing the limits of knowledge in the twilight of the scientific age. Basic Books, 2015.

[8] Personal interview with second author, January 2014.

[9] Horgan, J. (2000). The Undiscovered Mind: How the Human Brain Defies Replication, Medication, and Explanation. Simon and Schuster, 13.

[10] Hansen, James, Makiko Sato, Paul Hearty, Reto Ruedy, Maxwell Kelley, Valerie Masson-Delmotte, Gary Russell et al. “Ice melt, sea level rise and superstorms: evidence from paleoclimate data, climate modeling, and modern observations that 2 °C global warming is highly dangerous.” Atmospheric Chemistry and Physics Discussions 15, no. 14 (2015): 20059-20179.

[11] Hertsgaard, Mark. “Climate Seer James Hansen Issues His Direst Forecast Yet.” The Daily Beast, July 20, 2015. Accessed January 15, 2016.

[12] Holthaus, Eric. “Earth’s Most Famous Climate Scientist Issues Bombshell Sea Level Warning.” Slate, July 20, 2015. Accessed January 15, 2016.

[13] See Andrew C. Revkin, “Whiplash Warning When Climate Science is Publicized Before Peer Review and Publication,” The New York Times, July 23, 2015, accessed January 16, 2016; and Andrew C. Revkin, “A Rocky First Review for a Climate Paper Warning of a Stormy Coastal Crisis,” The New York Times, July 25, 2015, accessed January 16, 2016.

[14] Meyer, Morgan. “The rise of the knowledge broker.” Science Communication 32, no. 1 (2010): 118-127.

[15] Garrett, Laurie. The coming plague: newly emerging diseases in a world out of balance. New York: Farrar, Straus and Giroux, 1994.

[16] Garrett, Laurie. Betrayal of trust: the collapse of global public health. Oxford University Press, 2003.

[17] Adrienne Russell. (2011). Networked: A Contemporary History of News in Transition. London: Polity.

[18] Jay Rosen, “Covering Wicked Problems: Keynote address to the 2nd UK Conference of Science Journalists,” PressThink blog, June 25, 2012. Accessed January 15, 2016.

[19] Matheson, Donald. “Weblogs and the epistemology of the news: Some trends in online journalism.” New media & society 6, no. 4 (2004): 443-468, 458.

[20] Steve Rayner, Wicked problems: clumsy solutions—diagnoses and prescriptions for environmental ills. Jack Beale Memorial Lecture on Global Environment, University of New South Wales, Sydney, Australia, July, 2006. Accessed January 15, 2016

[21] Revkin, Andrew C. “My climate change.” Issues in Science and Technology, Winter (2016): 27-36, 32.

[22] Curtis Brainard, “Dot Earth Moves to Opinion Section,” Columbia Journalism Review Online, April 1, 2010. Accessed January 15, 2016.

[23] Nathanael Johnson, “The GM safety dance: What’s rule and what’s real,” Grist, July 10, 2013. Accessed January 15, 2016.

[24] Pielke Jr., Roger A. The climate fix: what scientists and politicians won’t tell you about global warming. Basic Books, 2010, 62.

[25] Nisbet, Matthew C. “Disruptive ideas: public intellectuals and their arguments for action on climate change.” Wiley Interdisciplinary Reviews: Climate Change 5, no. 6 (2014): 809-823.

[26] Pielke Jr., Roger A. The honest broker: making sense of science in policy and politics. Cambridge: Cambridge University Press, 2007.

[27] Sarewitz, Daniel, and Roger Pielke Jr. “Learning to live with fossil fuels.” The Atlantic, April 24, 2013. Accessed January 15, 2016.

[28] Kahan, Dan. “Fixing the communications failure.” Nature 463, no. 7279 (2010): 296-297, 297. For more on how Pielke’s and Kahan’s research can be applied to specific strategies in science policy debates, see Nisbet, Matthew C. “Engaging in science policy controversies.” Routledge Handbook of Public Communication of Science and Technology (2014): 173.

[29] Nisbet, Matthew C. “Climate shift: Clear vision for the next decade of public debate.” American University School of Communication (2011). Accessed January 15, 2015

[30] Nisbet, “Disruptive Ideas”.

[31] Klein, Naomi. This changes everything: capitalism vs. the climate. Simon and Schuster, 2014.

[32] Eduardo Porter, “Climate Change Calls for Science Not Hope,” The New York Times, June 23, 2015, Accessed January 15, 2015

[33] Eduardo Porter, “Imagining a World Without Growth,” The New York Times, December 1, 2015, Accessed January 15, 2016

[34] Healy, Beth. “Globe’s owner unveils site focused on health, life-sciences,” The Boston Globe, November 4, 2015, Accessed January 15, 2016

[35] Klein, Ezra, Melissa Bell, and Matt Yglesias, “Welcome to Vox: A work in progress,” Vox, April 6, 2014. Accessed January 15, 2016.

[36] Leonhardt, David, “Navigate news with The Upshot,” The New York Times, April 22, 2014.

[37] Rosenstiel, Tom, “Why we need a better conversation about the future of journalism education,” April 15, 2013.


Don’t fear a Franken public: The surprising reasons why we should label genetically modified foods

May 1, 2016—In January 2016, Campbell Soup generated headlines by announcing that it would voluntarily label its products containing genetically modified (GM) corn, soy, beets, and other crops. As is true for most food industry leaders, about three-quarters of Campbell Soup’s products contain such ingredients.

The company’s announcement came in advance of a summer deadline set by Vermont requiring the labeling of GM foods sold in the state. Legislatures in more than twenty states have considered similar requirements. Food industry groups have lobbied for congressional legislation that would preempt any state requirements and encourage only voluntary disclosure. But Campbell Soup is notable for breaking with this strategy, calling instead for mandatory labeling (Strom 2016).

Contrary to the claims of “Frankenfood” opponents, research shows that Americans have not turned against the promising technology. Most remain unaware of the debate. If asked directly, Americans voice support for labeling, but these opinions are neither deeply held nor top of mind.

In this context, Campbell Soup’s strategy is a shrewd gamble that could lead to several counterintuitive yet welcome outcomes. If Americans were to encounter GM labels on almost all processed foods, the ubiquity and apparent safety of such foods may actually bolster public trust and confidence, quelling controversy and opening the door to a next generation of GM food products that offer enormous benefits.

Science vs. Movement Politics

According to the U.S. Food and Drug Administration and other expert organizations, GM foods pose no greater risks to human health than other food products. Thus, federal regulators, experts, and industry members argue that there is no scientific or legal justification for special labeling.

Yet a few discredited studies provide just enough rhetorical fodder for activists to falsely claim that the technology poses a health threat. Pointing to this supposed uncertainty, they argue that precaution should be the rule and that consumers therefore have a “right to know” if they are consuming GM ingredients.

For these activists, the debate over the scientific justification for labeling is a smokescreen that clouds deeper-rooted grievances. In this sense, no amount of scientific evidence will soften their opposition. The origin of these grievances can be traced to the rise of America’s local food movement.

During the early 2000s, looking across survey findings, researchers concluded that most Americans were unaware of GM food products, lacked basic knowledge of the science or policy specifics involved, and had yet to form strong opinions about the issue (Shanahan et al. 2001).

But among a smaller segment of consumers, the issue was emerging as a chief concern, correlated with a cluster of other food-related attitudes. Those few Americans who said they actively looked to buy GM-free food also said that they preferred their food to be organic, vegetarian, natural, locally produced, not processed, and without artificial colors or flavors (Bellows et al. 2010).

These consumers were early adopters of many of the beliefs and preferences that constitute today’s local food movement. The origins of the movement date back to the 1980s and a series of food safety controversies. Since then, influential activists, food writers, and documentary filmmakers have drawn connections between industrial food production, agricultural policy, and problems such as obesity, income inequality, food-borne illness, and the decline of community life (Pollan 2010). In doing so, they have contributed to a new food politics, helping a diversity of groups unify behind a movement pushing for food system reforms.

From Portland, Maine, to Portland, Oregon, many regions have rebuilt their economies and identities around locally owned, mostly organic farms, restaurants, and artisanal foods. These local efforts are complemented by the popularity of well-known national organic brands such as Stonyfield Farms and Horizon. In 2014, U.S. consumption of organic fruits, vegetables, dairy, breads, meat, and other foods generated an estimated $35 billion in sales, more than triple the amount from a decade ago (USDA n.d.).

The growth of the organics industry and local food economies has created a formidable alliance of farmers, entrepreneurs, and activists who bring considerable money, influence, and voice to the debate over labeling. For this alliance, corporate-controlled, “unnaturally” produced GM food is perceived as a direct threat to their livelihood and preferred way of life.

Labeling: Not a Big Deal?

Simmering at the grassroots level for years, in 2012 the labeling of GM food exploded into prominence as a hotly debated political issue. In successive years, California, Washington, Oregon, and Colorado residents considered and eventually voted down proposals to label GM food products. In these battles, the food industry is estimated to have spent more than $100 million to block labeling efforts, while activists and organic industry members spent tens of millions promoting the measures.

These battles across Western states generated considerable national media coverage. Yet despite the attention, carefully designed survey research suggests that broader public awareness remains remarkably low. For at least a decade, the great majority of processed foods sold in grocery stores have contained ingredients from GM crops. But when asked in a 2013 Rutgers University survey about the matter, only 44 percent of Americans said they were aware of such foods, and only 26 percent believed that they had ever eaten any food with GM ingredients (Hallman et al. 2013).

A majority of Americans in 2013 said they knew very little or nothing at all about GM foods, and 25 percent said they had never heard of them. Even among those who said they were aware of the issue, a majority mistakenly believed that GM tomatoes, wheat, and chicken products were being sold in supermarkets (Hallman et al. 2013). Specific to labeling, when asked directly, 80 percent of the public said that it was either “very important” or “somewhat important” to know whether a product contains GM food. Yet these labeling preferences are weakly held. In the 2013 Rutgers survey, when respondents were asked in an unprompted way, “What information would you like to see on food labels that is not already on there?” only 7 percent mentioned GM food labeling. Moreover, only one in four Americans knew that federal regulations do not currently require such labels (Hallman et al. 2013).

Given the public’s ambivalence about labeling, economists have long questioned claims that labeling would deter the great majority of consumers from purchasing GM food products. For most Americans, cost and brand preference rather than labeling drives their food choices. To the extent that most organic foods today cost 50 to 100 percent more than their GM counterparts, this price difference is likely to override any impact of labels.

To test these assumptions, economists Marco Costanigro and Jayson Lusk designed a series of experiments that asked a sample of American adults to choose among apples and Cheerios that were either labeled as genetically modified or were unlabeled. To simulate the price differences for these products, those marked as genetically modified were priced at half the cost of their unlabeled counterparts. Across conditions, the economists did not observe any significant impact of labeling on risk perceptions or concern. Subjects rated GM apples and Cheerios just as safe as their non-modified counterparts. The economists, however, did find that a GM label made consumers somewhat more willing to pay a premium for unlabeled apples and Cheerios. In other words, though GM labels are unlikely to raise undue alarm among consumers, such labels may indirectly help boost sales of organic food products (Costanigro and Lusk 2014).

Citing this research and other evidence that labels are not likely to scare the public, some experts have argued that if the food industry were to follow the lead of Campbell Soup and support a mandatory labeling law, the strategy would help to restore public trust in the food industry while defusing controversy. “People are getting increasingly scared of [GM food] precisely because the industry is fighting a rearguard battle not to tell people which foodstuffs contain them,” argues author and writer Mark Lynas (2013). “This has to be the worst PR strategy ever: can you think of a single analogy where an industry uses every media tool, every electoral and legal avenue possible to stop people knowing where their own products are used?”

As David Ropeik (2013), a risk communication consultant, argued in an open letter to the food industry: “Even if you win the vote, you will lose the war … because the war isn’t about labeling. It’s about the public’s lack of trust in you, and therefore their opposition to the technology that is so important to your success. Your company’s opposition to labeling is hurting you far more than it’s helping. It is time for a new approach.”

Defusing Controversy

To be sure, if the food industry were to support mandatory GM labeling, the precise impact on consumers would remain unknown. But continuing to battle against labeling rules is also risky business, lending credibility to activists’ claims that the industry has an undue, corrupting influence on the political process. In contrast, the labeling of GM food may have only a limited impact on consumer buying habits, while doing little to alarm the public about the safety of the technology. Putting an end to the labeling controversy is also likely to benefit public debate over the next generation of genetically engineered foods, ensuring that scientists, universities, and companies have the freedom to pursue breakthrough technologies.

These innovations are aimed directly at helping the world meet a 70 percent increase in food demand by 2050. Some crops have been engineered to counter deficiencies in vitamin A and iron among populations in developing countries. Other GM crops are able to survive under conditions of drought, extreme heat, or unfavorable soil conditions (Wohlers 2013). After many years of evaluation, in 2015 a genetically engineered salmon became the first modified animal approved for human consumption by the U.S. government. The small company that pioneered the high-tech salmon says the fish can be grown in half the time while using 25 percent less wild fish as feed. The system recycles 95 percent of the water used and reduces harmful waste. The all-female, sterile fish are raised in landlocked tanks, making escape into the wild unlikely. The fish are currently produced in Panama, but the plan is to grow them close to large U.S. urban areas, reducing the energy costs associated with transportation (Saletan 2015).

Activists have moved quickly to oppose such “Frankenfish,” pressuring major grocery store chains and restaurants to refuse to sell the sustainability-friendly product. Apart from unsupported claims about environmental and health risks, their chief complaint is that the fish would not be labeled. As the case of engineered salmon suggests, as important high-tech crops and farming practices are brought to market in coming years, the chief strategy of GM food opponents, appealing to the public’s “right to know,” can be taken off the table by pushing for a smart, mandatory labeling policy.

–This article originally appeared in the May/June 2016 issue of Skeptical Inquirer magazine.


Nisbet, M.C. (2016, May/June). Don’t Fear a Franken Public: The surprising reasons why we should label genetically modified food. Skeptical Inquirer magazine, 18-21.


Bellows, A.C., G. Alcaraz, and W.K. Hallman. 2010. Gender and food, a study of attitudes in the USA towards organic, local, US grown, and GM-free foods. Appetite 55(3): 540–550.
Costanigro, M., and J.L. Lusk. 2014. The signaling effect of mandatory labels on genetically engineered food. Food Policy 49: 259–267.
Hallman, W.K., C.L. Cuite, and X.K. Morin. 2013. Public perceptions of labeling genetically modified foods. Working Paper 2013-01. Rutgers University.
Lynas, M. 2013. It’s time to label GMOs: Why we need to move biotech out of the shadows. (October 23).
Pollan, M. 2010. The food movement, rising. New York Review of Books (June).
Ropeik, D. 2013. GMO labeling: An open letter to BigAgTech CEOs. The Huffington Post (November 6).
Saletan, W. 2015. Don’t fear the Frankenfish. (November 20).
Shanahan, J., D. Scheufele, and E. Lee. 2001. Trends: Attitudes about agricultural biotechnology and genetically modified organisms. The Public Opinion Quarterly 65(2): 267–281.
Strom, S. 2016. Campbell labels will disclose G.M.O. ingredients. The New York Times (January 7).
USDA. n.d. Organic market overview. United States Department of Agriculture.
Wohlers, A.E. 2013. Labeling of genetically modified food: Closer to reality in the United States? Politics & Life Sciences 32(1): 73–84.

The science journalist online: Shifting roles and emerging practices

October 1, 2011 — Science journalists in the US and UK face unique pressures adapting to the social and participatory nature of online news, to economic conditions that force them to fill a diversity of roles in the newsroom, and to the many hats they must wear if they are to survive as freelancers.

This article summarizes a peer-reviewed study published in the journal Journalism on Sept. 8, 2011.

As a consequence, science journalists writing for online media have shifted away from their traditional role as privileged conveyors of scientific findings to a diversity of roles as curators, conveners, public intellectuals, and civic educators, roles that are underwritten by the essential skills of criticism, synthesis, and analysis.

These online science journalists have a more collaborative relationship with their audiences and sources and are generally adopting a more critical and interpretative stance towards the scientific community, industry, and policy-oriented organizations. Those are just a few of the key conclusions from a new peer-reviewed study that we published this month in Journalism: Theory, Criticism and Practice.  We based our analysis on a systematic review of recent studies and reports and on interviews that we conducted with nationally prominent science journalists and writers in the US and UK.

A Typology of Roles for Journalists

We began our analysis by systematically reviewing studies that describe the changing nature of science journalism and public affairs journalism more generally.  We also reviewed related recent discussions at news outlets and public forums.

Our goal was to identify the emerging practices for science reporters in this new digital era and the multiple roles that journalists are adopting.  Based on this review, we developed a typology of journalistic roles to guide our investigation.  Typologies are valuable tools, enabling researchers to validly categorize many observations based on multiple attributes.  A chief goal of this paper was to classify the roles adopted by science journalists so that these roles can be further examined, refined, and tracked across future studies. We identified the following roles for online science journalists:

  • The conduit explains or translates scientific information in their reporting from experts to non-specialist publics.
  • The public intellectual synthesizes a range of complex information about science and its social implications – in which the writer has a degree of specialization – presenting that information from a distinct, identifiable perspective.
  • The agenda-setter identifies and calls attention to important areas of research, trends and issues, coverage of which is then picked up and reflected in other science news outlets.
  • The watchdog holds scientists, scientific institutions, industry, and policy-oriented organizations to scrutiny.
  • The investigative reporter carries out in-depth journalistic investigations into scientific topics, especially where science meets public affairs.
  • The civic educator informs non-specialist audiences about the methods, aims, limits and risks of scientific work.
  • The curator gathers science-related news, opinion and commentary, presenting it in a structured format, with some evaluation, for audiences.
  • The convener connects and brings together scientists and various non-specialist publics to discuss science-related issues in public, either online or physically.
  • The advocate reports and writes driven by a specific worldview or on behalf of an issue or idea, such as sustainability or environmentalism.

Journalists and Commentators Interviewed for the Study

Having established this typology, we conducted interviews with a sample of journalists to determine whether these categories appeared to be valid descriptions of their activities and professional roles.  While recognizing some of the method’s limitations, we judged this to be the best means of gaining rich data about how journalists interpret the changes in their professional roles and routines over the past decade.

For our sample, we chose journalists whom, based on their organizational affiliation and status, we considered to be paradigmatic cases: professionals who highlight general characteristics of online science journalists and commentators.  These journalists serve as major reference points for others in the US and UK.

Four interviewees were chosen because they occupy prominent roles in elite, legacy media outlets.  Andrew Revkin, former environment correspondent for the New York Times, writes the Dot Earth blog for the newspaper and was recognized this year by the National Academies for his “pioneering social media” about climate and sustainability with “worldwide readership and impact.” James Randerson is environment and science news editor with the Guardian, and Alok Jha is science correspondent at the same paper. Curtis Brainard is editor of The Observatory column at Columbia Journalism Review.

Two interviewees were chosen because they write for the traditional popular science magazines Scientific American and Discover.   John Horgan is the author of several popular science books, including The End of Science (1996), and writes the Cross-check blog for Scientific American.   Ed Yong writes the Not Exactly Rocket Science blog at Discover and is past winner of the National Academies Online Science Journalism Award for “engaging and jargon-free multimedia storytelling about science in the digital age.”

Eli Kintisch was chosen because he works as a journalist for the journal Science, writes for the magazine’s Science Insider blog, and is author of Hack the Planet, an examination of geo-engineering.  Three others were chosen because they write for innovative online science media endeavors. Mike Lemonick is senior writer at ClimateCentral and previously with Time magazine where he contributed more than 50 cover stories over 20 years.  Charles Petit is lead writer at MIT’s Knight Science Journalism Tracker and covered science for the San Francisco Chronicle for more than 25 years before moving on to US News & World Report.  David Roberts is staff writer and blogger at Grist, a left-leaning news and commentary site about the environment.

Pulitzer Prize-winning reporter Deborah Blum was chosen as she combines the prominent roles of freelance science journalist, popular science book author, professor at the University of Wisconsin, and blogger with the non-profit organization PLoS, which aims to make the world’s scientific literature freely available.

Each of these interviewees was asked a common set of open-ended questions about how they defined their own roles, their relationships with readers and sources, how these relationships may have changed in the digital age, and their views on the state of contemporary science reporting. They were also asked whether they regarded their work as fitting into each of the proposed categories of science reporters and, if so, how.  Interviews were recorded with permission and lasted between 45 minutes and an hour.

Science Journalism that Is Participatory, Social, and Pluralistic

As science and society scholar Brian Trench notes, for several decades, science reporters have held a privileged status as “the principal arbiters of what scientific information enters the public domain and how it does it,” a gate-keeping role that simultaneously enhanced the status of reporters, the authority of scientists, and the prestige of their institutions. Moreover, science reporting has tended to conform to a transmission communication model in which information was relayed faithfully “from privileged sources to diverse publics.”

The current “digital age” of science reporting, however, is uniquely characterized by self-publishing online via blogs, social media and personal websites while also simultaneously filing traditional edited and vetted stories.

At the same time, individual scientists are using blogs and other social media to communicate their work and agendas directly with various publics, creating a challenge for science reporters to not only cover the publication of new scientific knowledge in journals, but also to analyze and interpret scientific findings as they are being discussed online.

As a partial consequence, there has been a dramatic expansion online in the availability of science-related information and a perceived diminished role for science reporters as chief disseminators of scientific content. As Eli Kintisch of Science magazine and Science Insider told us:

Today there are much lower barriers between my audience and information, especially information reporters used to have sort of privileged access to, that includes today digital copies of scientific papers and main sources of information such as podcasts of news conferences, transcripts of speeches, or hearings. In the past, reporters were the only ones, now there is much more broad access, including the fact that scientists themselves have blogged about the paper or event. So information goes straight to the Internet audience, versus before there was more of a privileged role of reporters as an intermediary.

In addition, scientific publishers and societies, universities, science centers and museums, and interest groups are communicating directly with wider audiences, unmediated by journalists, often using narrative and presentation formats that were once the exclusive domain of news organizations, many even employing veteran science journalists as communication staffers. Scholars of science policy and communication, as well as critics and writers, are also producing science-related content directly online.

According to science journalism scholar Trench, these trends have created an “overlapping information and communication space” in which scientists, journalists, advocates, and the people formerly known as audiences are all content contributors, each with varying knowledge, background and perspectives.

This shift in the science journalism field parallels broader trends towards employing new digital formats and practices in public affairs media that enable non-journalists to be active co-producers of news content, engaging in ‘pro-am’ [professional-amateur] reporting on issues and events and adding their lay expertise and knowledge.

As a result, online science news and content has the potential to be highly participatory, social, and collaborative. In the United States, according to the Pew Research Center, more than one third of internet users report that they have contributed to the creation of news generally, commented about it, or disseminated it via postings on social media sites like Facebook or Twitter.

However, even as the media system rapidly evolves, the traditional agenda-setting function of news media continues online, with national legacy media in the USA, such as the New York Times or the Washington Post, influencing the agenda of major public affairs-related blogs.  As other Pew studies show, more than 99 percent of links in blog posts reference original reporting or commentary that appeared first in traditional legacy media.  Just four outlets – the BBC, CNN, the New York Times, and the Washington Post – accounted for fully 80 percent of all links.

Deep Diving “Science Publics”

In this new media landscape, highly motivated users – who usually hold personal, professional, or strong political affinities for a field of science, an area of research, or a policy debate such as climate change, evolution, or stem cell research – can “deep dive” into specific science-related subjects.

These “science publics” consume, contribute, recommend, share, and comment on news and discussion of their preferred topics across media and platforms.  They expect high standards and quality for content, and they expect that content be interactive and responsive to their feedback, reposting, forwarding, or commenting. As Curtis Brainard, who covers the science beat for the Columbia Journalism Review, told us:

Rather than having a readership that remains dedicated to your publication or any single publication, you’ve got readers who will find you when you’ve got something good. There’s that ability for stories from even the smallest publications, whether that be the Columbia Journalism Review or any other small newspaper, to really go viral and get a lot of national and even international attention.

A diversity of deep content choices, however, also makes it very easy for these “science publics” to only follow and participate at an aligned network of sites or blogs that reflect their worldviews, whether their preferred viewpoint be liberalism, conservatism, libertarianism, environmentalism, scientism, atheism, or fundamentalism.

As a result, science-related bloggers on the left and right who target these highly motivated yet selective publics can attract communities of users that rival legacy media in size and depth of participation.  This ideological selectivity is magnified by audiences’ increasing reliance on recommendations from like-minded others at Facebook and Twitter.

For legacy media journalists, navigating and synthesizing the “echo chamber” nature of online science media can prove challenging.   As Andrew Revkin, who writes the Dot Earth blog at the New York Times, described his community of users:

They are sort of all over the map ideologically. The blog is very different than most in that most blogs are built to provide a comfort zone for a particular ideological camp, for liberals or conservatives or libertarians … what I do at Dot Earth is try to maintain an open forum where everyone can speak. I try – and sometimes fail – to maintain constructive discourse in the comments … And as a result it’s different. It’s a discomfort zone … I’m not here to provide you with a soft couch and free drinks if you’re an enviro or if you are a conservative. It’s a place to challenge yourself.

Mapping the New Science Media Ecosystem

With these trends in mind, we argue that a more suitable metaphor than the traditional transmission model of science journalism for describing this digital space is that of a “science media ecosystem,” drawing on respected technology journalist John Naughton’s description of a new media environment online. He wrote:

The new ecosystem will be richer, more diverse and immeasurably more complex because of the number of content producers, the density of the interactions between them and their products, the speed with which actors in this space can communicate with one another and the pace of development made possible by ubiquitous networking.

Applying this idea, the evolving science media ecosystem consists of legacy media in their print and online formats, including the Guardian and the New York Times; science blogging and aggregation sites; the news and blogging communities formed by journals such as Science, Nature, and PLoS; the news and blogging communities formed by legacy science magazines including Discover and Scientific American; ideologically driven advocacy blogs and sites such as Pharyngula, Climate Progress, and Climate Depot; and reflexive and meta-discussions of science journalism at MIT’s Knight Science Journalism Tracker and the Columbia Journalism Review.

Characteristic of this new science media ecosystem are innovative business models for producing science-related content, which include “quasi-journalistic ventures set up by the scientific community,” such as the communities at PLoS and Science; new ventures emanating from inside journalism, such as the blogs and content features at the New York Times and the Guardian; and “developments in social networking and on the web which are both changing the way journalism is done and the way the public get their information.” In addition, there is a fourth model consisting of foundation-funded, not-for-profit ventures such as the environment-focused sites Grist and Climate Central.

This rise in the number of actors and types of business models for producing science-related content has mirrored a decline in the number of science writers employed by legacy media in the US. The workloads of the science reporters who remain have increased, leaving time-pressed reporters ever more reliant on subsidies from scientific institutions, universities, and public relations agencies to find material.

The US-based National Association of Science Writers (NASW) noted that its membership in 2010 fell by approximately 200, or almost 10 percent, in a year.  A report on science journalism in the UK found science reporting had been largely “spared the ravages of the US,” although “numbers employed had stagnated.” The report highlighted in particular concerns about a lack of investigative science reporting.

Changing Roles in the New Media Ecosystem

Changing journalist roles within the science media ecosystem reflect economic trends in the international news industry.  As Indiana University’s Mark Deuze describes it, “[news] workers compete for (projectized, one-off, per-story) jobs rather than employers compete for (the best, brightest, most talented) employees.” Since freelancing relies on maintaining multiple streams of income-related activity, the trend has driven an increase in the diversity of roles that a science journalist might pursue.

Examples of journalists performing the roles typologized at the opening of our study have always existed, but the distribution of journalists across categories has grown more diverse in recent years. This trend is pronounced in US science journalism, with Deborah Blum of the University of Wisconsin noting to us that the industry-wide move to freelancing has:

… driven our changing perception of what a science journalist is. A science journalist wears a lot of hats, the way I do … I write books, I do magazine articles, I teach – [this] is much more the twenty-first century version of a journalist.

In this section, we describe how the journalists we interviewed reflected on the different role categories outlined in our typology.

Conduits and explainers. Despite the imperatives for role diversity driven by the increased number of freelancers and by new online content features such as the blogs featured at legacy media, a consistent theme among the journalists we interviewed was that the traditional role of reporting new scientific developments remained a cornerstone of their work.

Alok Jha of the Guardian noted that the main goal was reporting “what’s happening and what’s interesting. That’s the primary thing,” and he noted that other roles and functions flow from this primary reporting role. Charles Petit, a veteran science reporter and lead writer for MIT’s Knight Science Journalism Tracker, said science reporters “explain current events by asking scholars about them, and these tend to be scientists.”

Jha was careful to distinguish this reporting function from roles as “conduits” and “explainers.” Petit said the reporting role was previously “much more dominant among science writers” and “it remains important.”

Blum and Ed Yong, who writes the award-winning Not Exactly Rocket Science blog for Discover magazine, were among reporters who said a core feature of their writing was explaining science understandably to non-specialists. Yong said:

I think that area of science reporting often gets forgotten about in the mainstream. I’m not sure it’s as valued as strongly as – I don’t know – uncovering acts of fraud or misconduct or finding juicy human stories. I think the very simple act of making complex things simple is tremendously valuable. It’s essential for science journalism.

Curators of information. Interviewees generally agreed that sifting through and evaluating the vast amount of science-related content has become an increasingly prominent function for science reporters. The Guardian, for example, created Story Trackers, which trace the coverage and commentary on major science stories as they develop, with readers actively pointing out interesting coverage.  James Randerson of the Guardian said that, with so much science content available, curation is “about what it means to be a journalist in the digital age.” He said:

We made a very conscious decision to add value to stories by doing this kind of curation role, and basically admitting that we are not the fount of all knowledge, that we do have the ability to present information in a useful way and to hopefully decide which information is useful and which isn’t, and to try and bring in the information that’s good and present it in a way that’s meaningful, and to use our readers, our readership, and the people who are part of our community to help us in that task.

His colleague Jha said curation of stories where there are multiple angles and perspectives on the issue also allows for a more realistic portrayal of scientific work because “scientific papers when they are published are not the be all and end all. They are the start of a massive conversation.”

Curation is also an important function for producers of meta-discussions of science journalism, carried out, for example, by The Observatory column at the Columbia Journalism Review. Its editor, Curtis Brainard, noted that curation is more than the aggregation of content; adding value to stories is essential. He said:

It means informed or value-added aggregation. If you go to a museum, the curators don’t just put up a painting; they also put up a little sign next to it, explaining something about that work. That’s more what we do, that informed aggregation … We’re collecting headlines, but at the same time, we’re telling you why we’re recommending this story, or why we’re recommending you don’t read this other story.

David Roberts, a staff blogger with Grist, added that the volume of information has meant that “just about everyone online is being forced to play that role sometimes these days,” but for him, the curatorial role has moved to Twitter, which is “just a much handier tool for the job.”

Civic educators. While science journalists have traditionally been resistant to viewing their work as education, some interviewees noted that the limitless availability of space online allows reporters to fulfill more of an educational role. As Brainard told us:

Before digital media, the news was the news, and yesterday was ancient history. There was no efficient way to archive information for the public at your traditional news outlet. But now, the web has changed all that and so journalists need to be not only presenting the news, but they need to make pertinent background information readily accessible … the web allows us to do that. News outlets should almost develop these encyclopedias at their back end. The New York Times has done a great job on this.

Contextualized science reporting has an education function, according to Yong, not only promoting scientific achievements, but also showing “where scientists disagree, areas where controversies are going on, because that’s part of science, that’s an inescapable part of the scientific process … it shows people scientists are human and that science is a human process.”

Several journalists interviewed, however, were resistant to or ambivalent about this role. Jha noted that it’s “a role that if it happens, then great … but it’s not the primary intention.” Mike Lemonick, formerly of Time magazine, now with Climate Central, who also teaches at Princeton University, said that most journalists have a strong resistance to the educator role:

Educators identify areas where knowledge is necessary, and provide it. An educator provides a discrete body of knowledge; they try and tell you everything about a certain subject, within limits of time. [Journalists] put educational content in a story in order to make news understandable. Another thing we do not do is assess what was learned.

Public intellectuals. Reporters in this role are similar to traditional newspaper commentators or columnists, moving frequently between specialized topics that they present from their distinctive worldview.  Several interviewees were resistant to being classified in this role, but John Horgan, who writes the Cross-check blog for Scientific American, contributes to science magazines and writes popular science books, is the interviewee who illustrates this role most clearly.

While working as a staff reporter for Scientific American in the 1990s, he said, he “became dissatisfied” with the constraints of traditional reporting and wanted to undertake more opinion-based, interpretative reporting. He classified himself as a “critical debunker” and said he looks for “exaggerated or erroneous scientific claims” that he tries to question and debunk. Horgan said:

I convinced myself that that was actually a good thing to do because science had become such an authority that there was a need for a scientific critic … I just enjoy that form of journalism myself. It’s a paradox: it’s using subjectivity to ultimately get a more clear, objective picture of things.

Agenda-setters. Randerson said a distinct role for science reporters remained “being able to project the story … The readership and the influence of the Guardian are very important in terms of making a story acquire legs and really start moving and change what governments think.”

A form of agenda-setting is also happening through social media, with Revkin, for example, sending out his blog posts through Twitter “sort of to test the idea and get it propagating.”

Brainard noted: “One thing that hasn’t been lost in the media is that desire to be first … We love it when we can get out with an analysis before anybody else and become the foundation on which all the following coverage is built.”

Watchdogs. Interviewees agreed that they generally fulfilled the watchdog role – over scientific institutions and the scientific community, but also over individuals or groups making false scientific claims and over social actors intervening in science policy discussions. A quote from Jha is representative: “We are playing watchdog, but on all sides, really.”

Conveners. Science reporters connect scientists with various publics to discuss science. Revkin said this was a major part of his current work, either online or in person. He said:

A big subset of posts that I do are along those lines. When I go places to speak, quite often I’ll be in the role of moderator or kind of convener … where I am on stage with four or five scientists or technologies or engineers or academics and challenging them in the same way as I do on the blog.


We approached this article as laying the groundwork for additional research examining the rapidly evolving science media ecosystem and, as a result, we recognize the limitations of our analysis.  We focused on elite media in the US and UK; future research might explore the extent to which a similar ecosystem exists in other countries and cultures. We chose to base this first part of our longer-term study on elite media, rather than regional, local, or community media, which may not have the resources or organizational capacity for their reporters to undertake the variety of roles outlined here.

The new science media ecosystem in the US and UK that we have mapped in this article – a mostly online environment that is deeply pluralistic, participatory, and social – has presented challenges to the traditional professional role and working practices of the science reporter.  In this environment, journalists have moved from their dominant historical role as privileged conveyors of scientific findings to an increasing plurality of roles that involve diverse and interactive ways of telling science news.

The increasing plurality of roles has also been driven by the shifting economic and career conditions for science journalists, who are, in increasing numbers in the United States, working as freelancers.  The increase in role diversity is also a function of news organizations requiring their staff journalists not only to master various multimedia storytelling and newsgathering formats, but also to report, write, create, and communicate across multiple mediums and in different formats.

The roles that are becoming increasingly prevalent are curator, convener, public intellectual and civic educator, roles that are underwritten by the essential skills of criticism, synthesis and analysis.

There remains, however, as described by our interviewees, a strong continuation of the traditional journalistic role conceptions of conduit and agenda-setter.  The traditional reporter role emerged in interviews as being more fundamental to online science journalists than we had anticipated at the outset of our research.

Journalists also strongly identified with the watchdog role, stressing that this meant critically covering the scientific community itself, new scientific findings, challenges to scientific knowledge, science policy claims and, indeed, science journalism itself.

Yet, as several interviewees stressed, critical, interpretative and analytical reporting cut across several roles, suggesting that the structural, organizational and professional changes of the digital age have enabled science reporters to more fully realize the long hoped-for roles of science critic and civic interpreter.

Despite the rise of advocacy journalism, none of the interviewees self-identified with the advocate role, though this likely reflects the absence of a professional advocate from the sample we were able to interview. In addition, apart from some examples from established legacy media, the interviewed journalists did not self-identify strongly as investigative reporters.

Interviewees noted that legacy media had the resources and expertise to conduct investigative reporting, but, in the US at least, investigative work is now being carried out by, or in partnership with, non-profits, universities or philanthropically supported organizations, such as ProPublica or American University’s Investigative Reporting Workshop.

The non-profit models that have flourished among collaborative networks of investigative reporters have been comparatively slow to develop among science journalists.

Still, there are existing non-profit models in science journalism that future research should examine, including Climate Central, Yale Environment 360, and the Yale Forum on Climate Change and the Media. Yet these models stand as just a few among what investigative reporter Charles Lewis has identified as more than 60 non-profit public affairs journalism initiatives at the national and local levels in the US.

Given this growing population of ventures, future research should attempt to systematically account for the features and principles that can usefully inform the growth of non-profit science journalism.
