Here’s a link to a short article (and video) about the new study, “Hot fire, cool soil,” with a brief excerpt below. The American Geophysical Union demanded that we remove a copy of the actual study, which they provided me earlier in the day, from our website….so I’ve done that. Sorry folks.
When scientists torched an entire 22-acre watershed in Portugal in a recent experiment, their research yielded a counterintuitive result: Large, hot fires do not necessarily beget hot, scorched soil.
It’s well known that wildfires can leave surface soil burned and barren, which increases the risk of erosion and hinders a landscape’s ability to recover. But the scientists’ fiery test found that the hotter the fire—and the denser the vegetation feeding the flames—the less the underlying soil heated up, an inverse effect that runs contrary to previous studies and conventional wisdom.
Rather, the soil temperature was most affected by the fire’s speed, the direction of heat travel and the landscape’s initial moisture content.
And here’s the abstract:
Wildfires greatly increase a landscape’s vulnerability to flooding and erosion events by removing vegetation and changing soils. Fire damage to soil increases with increasing soil temperature and, for fires where smoldering combustion is absent, the current understanding is that soil temperatures increase as fuel load and fire intensity increase. Here, however, we show that this understanding that is based on experiments under homogeneous conditions does not necessarily apply at the more relevant larger scale where soils, vegetation and fire characteristics are heterogeneous. In a catchment-scale fire experiment, soils were surprisingly cool where fuel load was high and fire was hot and, conversely, soils were hot where expected to be cooler. This indicates that the greatest fire damage to soil can occur where fuel load and fire intensity are low rather than high, and has important implications for management of fire-prone areas prior to, during and after fire events.
This post is a followup to Matthew’s comment here:
Finally, the notion that judges should decide legal issues surrounding the Forest Service because they are not forestry experts is an interesting one. Fact is, judges aren’t necessarily experts in divorce, murder, theft, DUI’s, bribery, etc. either….but they rule on cases involving these issues all the time. What federal judges are experts in is the laws of this nation, and unfortunately, too many times, the Forest Service fails to comply with those laws.
It occurs to me that many of you might not be aware of the context around the broader issues of courtroom decisions and other kinds of expertise, including scientific expertise. Within the science and technology studies literature, you may find many folks who have studied scientific issues and how they are handled in the courts. Our world is not separable from that world. Dr. Sheila Jasanoff’s book
“Science at the Bar: Law, Science, and Technology in America” is considered by many to be the fundamental work in this area, and well worth a read.
Unfortunately, I couldn’t easily lay my hands on my copy of the book, but I did find a review here in the New England Journal of Medicine.
SCIENCE AT THE BAR: LAW, SCIENCE, AND TECHNOLOGY IN AMERICA
By Sheila Jasanoff. 285 pp. Cambridge, Mass., Harvard University
Press, 1995. $29.95. ISBN 0-674-79302-1.
To many physicians, science in the courtroom means trouble. Science, it is claimed, deals with objective facts and theories whose validity can be judged only by those with lengthy training in scientific method. Law, in contrast, involves rules and regulations applied by judges and juries almost always lacking competence — and often demonstrating striking incompetence — in evaluating scientific evidence. Putting science at the bar is therefore an invitation to outrageous malpractice awards, to the testimony of charlatans who are taken seriously by juries, to the awarding of irrationally large damages in “toxic tort” cases in the absence of scientific evidence of toxicity, to lay interference in the conduct of medical research, and to a familiar litany of judicial errors.
But Sheila Jasanoff, in this broad-ranging and authoritative survey of the relation between law, science, and technology, presents a far more nuanced and complicated picture. Jasanoff, trained as a lawyer and subsequently the creator of Cornell’s flagship department of science and technology studies, has devoted most of her professional life to studying science in the courtroom. Her conceptual framework draws on the emerging field of science studies. In recent decades, this field has come to redefine science not simply as the discovery of the truths of nature, but also as a complex, problematic, error-prone, controversy-ridden process of “constructing” a view of the natural world that may, with luck, rhetorical skill, and time, eventually come to be accepted as mainstream, or “textbook” science.
Thus redefined, science begins to look much more like law. And the notion that the courts should simply ascertain “the established scientific view” appears, at least in many cases, similar to the search for a chimera. For as Jasanoff shows in her many examples, most of them involving biomedical matters, the courts are almost always called on in areas where empirical research is inconclusive, scientific opinion is divided, decisive epidemiologic studies have not been completed, or legislatures have not been willing to provide a framework to govern the application of — for example — new reproductive
and life-prolonging forms of technology. As she puts it, “courts, like regulatory agencies, conduct the bulk of their scientific inquiries ‘at the frontiers of scientific knowledge’ where claims are uncertain, contested and fluid, rather than against the background of largely settled ‘mainstream’ knowledge.”
Jasanoff concludes with suggestions for a “more reflective” alliance between law and science. She is well aware of the many egregious errors made by the courts. But in the end, she sees the relation of science and law in America as generally positive, as granting the legal system a “limited and highly
contingent ability to interrogate the scientific community,” as encouraging scientific “reflection and self criticism,” and in general, as encouraging the advance of science and ratifying the positive American view of science and technology. For any serious student of science and law in America, this is an original and essential book.
Interestingly, in the search for the review, I also ran across this article by Dr. Jasanoff:
“Is science socially constructed—And can it still inform public policy?” Here is the link:
This paper addresses, and seeks to correct, some frequent misunderstandings concerning the claim that science is socially constructed. It describes several features of scientific inquiry that have been usefully illuminated by constructivist studies of science, including the mundane or tacit skills involved in research, the social relationships in scientific laboratories, the causes of scientific controversy, and the interconnection of science and culture. Social construction, the paper argues, should be seen not as an alternative to but an enhancement of scientists’ own professional understanding of how science is done. The richer, more finely textured accounts of scientific practice that the constructivist approach provides are potentially of great relevance to public policy.
which sounds interesting, and relevant to this blog. You can read the first two pages online but it costs $40 to read the whole thing. Published in 1996. Of course, Dr. Jasanoff does not work for a public university, but still..
A couple of interesting items from Roger Pielke, Jr.’s blog..
First he posted this guest post by Steven Rayner.
Planetary Boundaries as Millenarian Prophesies: A Guest Post by Steve Rayner
This is a guest post by Steve Rayner, Oxford University, and is distilled from a forthcoming book chapter that Steve has co-authored with Clare Heyward, also of Oxford University. The full citation is (and please see the original for the broader argument and references):
S. Rayner and C. Heyward, 2013 (in press). The Inevitability of Nature as a Rhetorical Resource, Chapter 14 in Kerstin Hastrup (editor), Anthropology and Nature (Routledge, London).
You gotta think that a chapter entitled “the inevitability of nature as a rhetorical resource” will be of interest to us.
Check out the guest post here.
Below is an excerpt.
The rhetoric employed in the plenary sessions was especially striking in its efforts to establish the present as a uniquely defining moment for the future of humanity requiring urgent action on a global scale which seems slow in coming. Nobel laureate Elinor Ostrom declared that, “We have never faced a challenge this big.” Johan Rockström drove home the point claiming that “We are the first generation to know we are truly putting the future of civilization at risk.” Apparently, those who lived through the Second World War or the prospect of mutual nuclear annihilation in the 1960s were deluded in their estimation of the challenge they faced or the consequences for civilization, to say nothing of Old Testament prophets who only had the authority of God that destruction was imminent if people did not mend their wicked ways. Lest there be any doubt that behavioural change was the goal, Dutch political scientist Frank Biermann spelled out the imperative that “The Anthropocene requires new thinking” and “The Anthropocene requires new lifestyles.”
At first sight, the contemporary resurgence in catastrophist thinking might be understood as a response to improvements in our understanding of critical earth systems resulting from research-led improvements in scientific understanding. However, I have not been able to identify any new empirical studies to justify the claim that, “Although Earth’s complex systems sometimes respond smoothly to changing pressures, it seems that this will prove to be the exception rather than the rule.” (Rockström et al 2009:472). Leading ecologists have long suggested that the general assertions of systems theorists that “everything is connected to everything else” and “you can’t change just one thing” are actually less robust than is often claimed. It seems that most species in many ecosystems are actually quite redundant and can be removed without any loss of overall ecosystems character or function (e.g., Lawton 1991, but for a contrasting view, see Gitay et al 1996). While it is doubtless the case that there are many non-linear relationships in natural systems, it is another matter as to whether non-linearity dominates and whether we should, as a matter of course, expect to find tipping points everywhere. Indeed, a recent review challenges Rockström et al.’s claims, arguing that out of the planetary boundaries posited, only three genuinely represent truly global biophysical thresholds, the passing of which could be expected to result in non-linear changes (Blomqvist et al, 2012).
The same report also challenges the idea that the planetary boundaries constitute “non-negotiable thresholds”. The identification of the planetary boundaries is dependent on the normative assumptions made, for example, concerning the value of biodiversity and the desirability of the Holocene. Rather than non-negotiables, humanity faces a system of trade-offs – not only economic, but moral and aesthetic as well. Deciding how to balance these trade-offs is a matter of political contestation (Blomqvist et al, 2012:37). What counts as “unacceptable environmental change” is not a matter of scientific fact, but involves judgments concerning the value of the things to be affected by the potential changes. The framing of planetary boundaries as being scientifically derived non-negotiable limits, obscures the inherent normativity of deciding how to react to environmental change. Presenting human values as facts of nature is an effective political strategy to shut down debate.
I particularly liked the last line…
Then there is a detailed discussion of Munich Re’s paper on thunderstorms.. here.
My favorite line is also the last..
Misleading public claims. An over-hyped press release. A paper which neglects to include materially relevant and contradictory information central to its core argument. All in all, just a normal day in climate science!
I’ll go post something on the blog that it’s not just climate science..I suspect it’s really all sciences. Something about a culture of competition where press releases run amok. But even broader than science, as Ron C. says in comment #10
The same behavior occurs in the world of advertising, where, as the saying goes:
“The large print giveth,
The small print taketh away.”
Brooks Hays, a reporter with GIMBY, recently wrote an article that should generate some interest here, especially in the context of some of the comments related to recent black-backed woodpecker articles. Below is a snip from the article’s lead, which features some quotes from forest ecologist Chad Hanson. The article also includes perspectives from Richard Hutto, forest ecologist and director of the Avian Science Center at the University of Montana, and from me. You can read the entire article here.
Last summer, talk of wildfires filled newspapers and dominated the headlines. Wildfires were “trending,” as they say.
Blazes were burning the western forests in record numbers, announced policy officials and reporters. Every news and science organization from USA Today to the National Oceanic and Atmospheric Administration (NOAA) was calling 2012’s fire season one of the worst on record.
“Records maintained by the National Interagency Fire Center (NIFC) and NASA both indicate that 2012 was an extraordinary year for wildfires in the United States,” NOAA wrote in a year-end review.
Weather Underground co-founder Jeff Masters blamed the growing threat of wildfire on “rising temperatures and earlier snow melt due to climate change” and added that “fire suppression policies which leave more timber to burn may also be a factor.”
In August, as fire season continued to rage in most of the West, National Public Radio ran a five-part series calling mega-fires the “new normal.” This new reality was attributed to excess forest growth — an overly abundant accumulation of combustible materials – all resulting from an overzealous Forest Service that put out too many fires. NPR dubbed it the “Smokey the Bear effect.”
But a growing body of empirical data suggests these superlatives might be more storytelling than science. “Those terms, ‘mega-fire’ and ‘catastrophic fire,’ are not scientific terms,” says forest ecologist Chad Hanson, executive director of the John Muir Project. “And such hyperbolic and extreme terms are not going to lead us to an objective view of the evidence.”
An objective view of the evidence, Hanson argues, reveals that the vast majority of wildlands and forests aren’t burning hotter and faster. They’re actually starved for high-intensity fires — fires Hanson says are more ecologically valuable than they’re given credit for.
As Hanson argues in his most recent study, The Myth of “Catastrophic” Wildfire, high-intensity fires are the exception in the U.S. today, not the norm. And he finds no correlation between increased fire-suppression activity and high-intensity fire. Hanson says the opposite is true: the longer a forest goes without fire, the more mature it becomes, the higher its canopy grows, and the less susceptible it is to fire damage.
Click here to read the entire article.
When we talk about how scientific information and scientists’ opinions should be used, we are part of a larger world of science and policy. And of course, climate change is popular for funding, so there are many more people participating in the discussions, which makes it interesting.
I thought this discussion on Roger Pielke, Jr.’s blog was appropriately current and relevant. If you haven’t been following the literature on planetary boundaries, you can find some links there. For me, as a person with experience with local land and people, it seems too abstruse to have any real-world validity. However, it has triggered some interesting dialogue.. here are a couple of quotes.
From Melissa Leach, Director of the STEPS Centre at Sussex University:
This meeting – and many others like it in the run-up to September – raises a significant question: Is there a contradiction between the world of the anthropocene and democracy? The anthropocene, with its associated concepts of planetary boundaries and ‘hard’ environmental threats and limits, encourages a focus on clear single goals and solutions. It is co-constructed with ideas of scientific authority and incontrovertible evidence; with the closing down of uncertainty or at least its reduction into clear, manageable risks and consensual messages.
This is a far cry – as a South African participant pointed out – from some other worlds: on the ground in the global south and north, where people and social movements debate and contest their interests, values and desired futures; and the world according to democratic theory, in which such politics are worth acknowledging and respecting. In this world, there is a need to open up, make uncertainty and ambiguity and dissensus explicit, and foster diversity to cope with it.
From Nico Stehr:
Consensus on facts, it is argued, should motivate a consensus on politics. The constitutive social, political and economic uncertainties are treated as minor obstacles that need to be delimited as soon as possible – of course by a top-down approach. . . the discourse of the impatient scientists privileges hegemonic players such as world powers, states, transnational organizations, and multinational corporations. Participatory strategies are only rarely in evidence. Likewise, global mitigation has precedence over local adaptation. “Global” knowledge triumphs over “local” knowledge. . . the sum of these considerations is the conclusion that democracy itself is inappropriate, that the slow procedures for implementation and management of specific, policy-relevant scientific knowledge leads to massive, unknown dangers. The democratic system designed to balance divergent interests has failed in the face of these threats.
And this comment by Melissa Leach:
“Thanks for all the comments on Roger’s excellent blog; this is a vital debate and it’s great that it’s happening. Since he quoted me to kick things off, I’d like to throw in a few clarifications and thoughts.
I should make it clear that my Huffpost blog didn’t actually claim that there was anything inherently undemocratic about the concept of planetary boundaries (or indeed of the anthropocene). Rather, the focus of that piece was on the ways that the dynamics of a particular UN Expert meeting ‘closed down’ discussion of uncertainties, contestation, values and politics so that the end result was an apparently scientifically-authoritative/authoritarian set of messages conveyed to the SDG process. Nor am I claiming that the scientists involved in developing the concept are personally authoritarian in outlook. On the contrary. I know and work with many of them too. Indeed, as the longer version of the piece described, some were there at the Expert meeting and there was plenty of discussion in its early stages about uncertainties, politics, values, and the need for debate and dialogue, and bottom-up as well as top-down approaches. At least in part, the anti-democratic moves came in the ways the UN meeting managed and communicated its messages to the Open Working Group process.
Yet I also don’t think communication alone is to blame. There is a tendency for the concept of planetary boundaries to align rather neatly with approaches that are top-down not bottom-up, set rather than deliberated, singular rather than respectful of diversity, privileging scientific over experiential expertise, global rather than local, control rather than response-oriented, and so on. It does so more than other candidate or related concepts – whether the ‘three pillars’, sustainability or sustainable development, or component dimensions such as climate change or biodiversity. There are plenty of reasons for this, and they relate to both scientific and political processes. But it does mean we have to keep a particular ‘watching brief’ on this concept-of-the-moment; one that is constructive and engaged, yet maintains the ability to contest and critique.
Ultimately, as our work in the STEPS Centre has often underlined, we need to be clear about means and ends; service and mastery. As long as planetary boundaries (like other technical concepts and frameworks) are seen as means to democratically-set ends; retained in service (rather than mastery) of political agency and used to open up (rather than close down) inclusive debate… then they’re part of the solution. And powerful parts at that. Otherwise, they risk confounding not only democracy, but the problems themselves.
But check out the post and the comments… there are some quotes by G. K. Chesterton and by Eisenhower.
I particularly liked Roger’s comment #12..
Thanks much … I guess that I see the proposals as more than think pieces. The scientists involved are actively working to secure a seat at the table and exercise influence based on their proposals — this from just yesterday:
My critique here has nothing to do with “relativism” related to truth — as you know, 350 ppm and 2 degrees are political rather than scientific boundaries. Climate change is real, but that fact has nothing to do with whether we choose authoritarian or democratic responses.
Going back to our world, we can want to have sustainable forests and communities, but claiming inappropriate legitimacy for scientific information and scientists’ opinions is, as Roger says for climate, a separate issue.
One more reflection on my week. I attended a seminar relating to a disease that a family member suffers from. The speaker said “no one supports research on (this chemical) for this disease because the pharmaceutical companies fund research and this is not their product.” Maybe it’s easier to see that what gets funded gets studied, and who controls the funding controls the ultimate information when you are in Health World. Hence the need for a People’s Research Agenda.
According to new research led by researchers at the University of Arizona, trees killed in the wake of mountain pine beetle infestations in Colorado have released less carbon into the atmosphere than expected. Read about the research and hear from the scientists in this article from the University of Arizona, excerpts of which are also highlighted below. And High Country News wins the award for best headline of the day, “Good news for people who love bad news,” which contains even more information about the new research. What does this new scientific research say about the validity of the oft-repeated claims from the timber industry and others that we have to cut down our forests so that we can “lock up” that carbon in 2x4s?
Massive tree die-offs release less carbon into the atmosphere than previously thought, new research led by the University of Arizona suggests. Across the world, trees are dying in increasing numbers, most likely in the wake of a climate changing toward drier and warmer conditions, scientists suspect. In western North America, outbreaks of mountain pine beetles (Dendroctonus ponderosae) have killed billions of trees from Mexico to Alaska over the last decade.
Given that large forested areas play crucial roles in taking carbon dioxide out of the atmosphere through photosynthesis and turning it into biomass, an important question is what happens to that stored carbon when large numbers of trees die.
“The general expectation we had was that when trees die on a large scale, it would lead to a big pulse of carbon into the atmosphere through microorganisms metabolizing all that dead wood,” said David Moore, an assistant professor in the School of Natural Resources and the Environment in the UA College of Agriculture and Life Sciences and one of the lead authors of the study, which is published online in the journal Ecology Letters.
“A question we are looking to answer is, ‘How does the carbon dioxide released from the forest into the atmosphere change as you have large scale tree mortality over time?”’ said second lead author Nicole Trahan, a postdoctoral researcher at the University of Colorado, Boulder.
According to co-author Russell Monson, who is the Louise Foucar Marshall Professor in the UA School of Natural Resources and the Environment, forests affect the carbon budget of the atmosphere through two dominant processes: photosynthesis, by which plants take carbon dioxide out of the atmosphere and lock it up in organic compounds, and respiration, by which plants and soil microbes release carbon dioxide back into the atmosphere. The balance of these processes determines whether a particular forest is a carbon source or a carbon sink.
After a massive tree die-off, conventional wisdom has it that a forest would go from carbon sink to carbon source: Since the soil microbes are still around, they are expected to release large amounts of the greenhouse gas carbon dioxide into the atmosphere, where it is thought to accelerate climate change.
“Surprisingly, we couldn’t find a big pulse,” said Moore, who is also a member of the UA Institute of the Environment.
Trahan added: “In the first few years after beetles have come in and killed trees, the carbon release from the surrounding soil actually goes down.”
Large amounts of dead trees, it turns out, hold on to their carbon for a long time and prevent it from quickly being released into the soil or the atmosphere. According to Moore, this might be due to several reasons: First, while trees take up carbon dioxide during the day during photosynthesis, they release some of it at night when they switch to respiration.
“Once the trees are dead, respiration by the trees goes away,” Moore said. “In addition, if you cut off the carbon that a tree put into the soil while it was alive, you reduce the ability of the soil microbes around the roots to respire.”
“After five or six years, there is a buildup of some dead plant material, leaf litter and so on, and that seems to drive the rate of respiration up again. But it never recovers to the point it was before the beetles killed the trees, at least over the span of a decade,” Moore said.
Finally, the trees studied in this project grow at higher elevations, where cooler temperatures slow the decomposition process and thereby carbon-releasing respiration.
“Overall, we discovered that after a tree die-off, the loss of carbon in the soil results less from increased respiration by microbes but more from the fact that trees are no longer sequestering photosynthesized carbon into the soil,” Moore said. “There seems to be a dampening of the carbon cycle rather than a big pulse of carbon release. So even if the forest now goes from a sink to a source of carbon dioxide, it’s not as dramatic of an effect as we thought it would be.”
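Monson’s source-versus-sink framing boils down to simple arithmetic: a forest’s net carbon flux is respiration (release) minus photosynthesis (uptake). Here is a minimal sketch of that balance; the numbers are hypothetical illustrations of mine, not data from the study:

```python
def net_carbon_flux(photosynthesis, respiration):
    """Net carbon flux to the atmosphere, in arbitrary units
    (e.g., tonnes C per hectare per year).
    Positive result: the forest is a carbon source.
    Negative result: the forest is a carbon sink."""
    return respiration - photosynthesis

# Hypothetical numbers for illustration only.
live_forest = net_carbon_flux(photosynthesis=10.0, respiration=8.0)
print(live_forest)   # -2.0: uptake exceeds release, so the stand is a sink

# After a die-off, photosynthesis collapses -- but, as the study found,
# respiration also drops (no tree respiration, less root-fed microbial
# activity), so the swing toward "source" is muted.
after_dieoff = net_carbon_flux(photosynthesis=1.0, respiration=2.5)
print(after_dieoff)  # 1.5: a weaker source than a "big pulse" would imply
```

In these terms, the study’s surprise is that both sides of the subtraction fall together after beetle kill, dampening the expected pulse rather than producing it.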
Previously, we’ve discussed and debated the Willamette National Forest’s proposed Goose timber sale, especially as it relates to the fact that many local residents in the McKenzie Bridge area of Oregon knew nothing of the Forest Service’s plans to remove 7,600 log-truck loads of trees from what amounts to their neighborhood.
According to the Eugene Register Guard, a federal judge has put the McKenzie Bridge timber sale on hold, ordering the Forest Service to prepare an environmental impact statement. At the end of the article you’ll notice that this logging project would reduce the Lookout Mountain Potential Wilderness Area by 13%, in a part of central Oregon that’s already heavily logged and roaded. Besides, logging to reduce potential Wilderness is, like, so late ’70s/early ’80s. Hey Forest Service, get with the 21st century already.
A federal judge has ruled that the U.S. Forest Service cannot go forward with a controversial logging project near McKenzie Bridge until an environmental impact statement has been prepared.
People living near the 2,100-acre Goose Project had strongly opposed the logging plans. They said there had been insufficient notice about the project and that they didn’t find out about it until it was too late for them to weigh in.
Cascadia Wildlands and Oregon Wild, represented by the Western Environmental Law Center, filed a lawsuit last May challenging the project.
In a ruling dated Thursday and made available late Tuesday, U.S. District Judge Ann Aiken said the timber sale could have a “potentially significant effect” on the environment. As a result, the Forest Service erred in choosing a less stringent environmental assessment, rather than a more demanding environmental impact statement, to assess the potential effects of harvesting an estimated 38 million board feet of lumber from federal land….
Aiken’s ruling will likely be embraced by McKenzie Bridge area residents who said they didn’t learn about the pending timber sale and harvest until last spring. Those residents had little recourse because they didn’t have legal standing to challenge the sale — unlike Cascadia Wildlands and Oregon Wild, who did have such standing because they were the only parties to have appealed the project back when it was approved in 2010.
Doug Heiken of Oregon Wild said Tuesday that the ruling is a victory for local residents who will now have a much greater opportunity to be heard on the matter. That’s because an environmental impact statement requires greater public participation. “The public gets to comment, so that the decision-maker has the benefit of that information and can make a fully informed decision,” Heiken said.
Heiken said the Forest Service “had the chance to get it right a couple of times and stumbled.” The agency could have limited the proposed sale to the noncontroversial thinning of dense young timber stands, but instead opted to include the proposed logging of mature forests and logging near riparian areas, he said.
The agency again made a misstep when it decided against inviting public comment after local residents learned of the proposed sale last year, he said. “The Forest Service still had the discretion to do that and avoid this lawsuit,” he said….
Aiken said that the project would reduce the 9,664-acre Lookout Mountain Potential Wilderness Area by 1,249 acres — resulting in the harvesting of 680 acres of timberland and fragmenting an additional 569 acres from the rest of the potential wilderness area. In addition to the number of acres logged, the project also would authorize the construction of eight miles of temporary roads and one mile of permanent road, the judge noted.
Here is the press release from the plaintiffs.
Recently, the discussion of collaboration and forest planning – at least on this blog – has focused on the processes at play with the Nez Perce-Clearwater National Forest forest plan revision. See here, here and here. The discussion and debate continues, as we can clearly see in these point/counter-point guest columns, which recently ran in the Moscow-Pullman Daily News. The first one is from Lee Rozen, who wrote his piece on behalf of the Moscow-Pullman Daily News editorial board. The second piece is from Gary Macfarlane, ecosystem defense director for Friends of the Clearwater.
We have this Friend, see, who’s stumped us
By Lee Rozen, for the Moscow-Pullman Daily News editorial board
Sometimes, you have to wonder whether the Friends of the Clearwater can see the national forest for the trees. Generally speaking, we agree with much of what the Friends, and groups like them, stand for. We are skeptical whenever industry or local government tells us to just trust them because they are acting in our best interest.
But when government in the form of the U.S. Forest Service comes forward and seeks the informed advice of a wide variety of groups – both industry and Friends included – we don’t see sinister conspiracies lurking behind the next stump. Challenges, perhaps, but not conspiracies.
The Forest Service is trying a “collaborative” process to develop a management plan for the newly combined administration of the Nez Perce-Clearwater National Forest. That’s instead of proposing a plan and offering it up for industry, governments, recreationists and environmentalists to take potshots at, and eventually go to court over.
It seems that’s what Friends of the Clearwater want them to do. The Friends seem to be afraid their principles will be co-opted if they sit down across the table from industry, government, hunters, motorcyclists and ski-mobilers and negotiate the best way to reach an acceptable compromise on the use of this forest. More practically, they argue that industry and government can pay to have their representatives at the discussions but groups of volunteers with day jobs can neither afford the time off nor the travel expenses. As a result, the Forest Service has offered to help the collaboration occur online.
A plan for running a national forest poses a complex problem because it is so unclear what national forests are and what they should be. It’s pretty clear what’s intended for national parks, wilderness areas and national recreational areas. But national forests are different. They are supposed to support a mix of goals, many of which can be contradictory – logging and recreation, for instance. The Friends of the Clearwater, and other interest groups, should be working to make the “collaborative process” work for all, rather than trying to shoot it down in hopes of “total victory” in the courts.
US Forest Service must follow the law
By Gary Macfarlane
Lee Rozen’s criticism of Friends of the Clearwater (Our View, written for the editorial board, March 13) is off base, misinformed and reflects a lack of understanding concerning our public land laws and the public involvement process. Had he contacted us, he would have learned why we believe the Forest Service is not following the law. It appears the agency has stumbled into a quagmire, under the guise of collaboration, with its new forest planning process.
The process the Forest Service is currently following on the Nez Perce-Clearwater National Forests plan revision circumvents existing law, creates a contradictory and confusing public involvement process and lacks accountability. For 40 years, the National Environmental Policy Act has governed public input and analysis of agency proposals. NEPA mandates that the first step of the public involvement process is to identify pertinent issues. However, this collaborative process is seeking to resolve issues before the genuine public involvement process even begins. How can the Forest Service resolve issues before they are properly identified?
Under NEPA, all citizens can participate equally. However, the new collaborative forest plan revision process – which has no statutory authority – creates two unequal classes of citizens. The E-collaborative invention funnels citizen comments from the second class through the first class citizen collaborative group. Why should a special working group have more input and be allowed to determine whether or how other citizen comments are used?
Furthermore, NEPA requires an objective analysis of alternatives before decisions are made. Thus, the integrity of NEPA is compromised when the agency reaches a deal or understanding with the collaborative forest planning group before the NEPA process even begins. NEPA must be more than a pro forma exercise. Can you imagine having a collaborative group decide the outcome of an election before the election begins in order to avoid the contentiousness of elections?
Another stated reason behind the new forest planning process is to save time and money. How is having two competing public involvement processes for national forest planning more efficient? Indeed, the Forest Service recently admitted the collaborative process would take longer than anticipated. We feel that such redundancy wastes time and money and also creates conflict and confusion. In fact, a member of the forest planning collaborative for the Nez Perce-Clearwater National Forests – Jonathan Oppenheimer of the Idaho Conservation League – recently described the process as “collective collaborative confusion” at a presentation given in Eugene, Ore. Even proponents of collaboration find the new process fatally flawed.
Retired Forest Service fishery biologist and Moscow resident Al Espinosa stated in a comment letter on the new process, “The intent here is to avoid accountability by eliminating the appeal process and providing a phony pathway around the regulations and laws.”
He also noted the new planning process would circumvent the national interest. Removing accountability and de-legitimizing NEPA’s public involvement and decision-making process is not in the public interest. The Forest Service could have prevented scrutiny, confusion and distrust had the agency followed citizen suggestions made in an October meeting on how to lawfully proceed with the forest plan revision process.
If national forest management is to be determined by local collaborative groups, then existing laws like NEPA need to be repealed first. If the goal is to remove the ability of citizens to have judicial redress and to challenge agency decisions in court, then the Constitution must be amended. The new process for national forest planning clashes with the law. Friends of the Clearwater simply believes the Forest Service should be accountable to U.S. citizens and the law. We think the majority of Americans would agree with us.
Missoula, MT – The Canada lynx was listed as threatened with extinction under the Endangered Species Act (ESA) in March 2000, but the U.S. Fish and Wildlife Service has yet to complete the required recovery plan to ensure the survival of the elusive cat.
Today, a coalition of wildlife advocacy groups dedicated to the long-term survival and recovery of lynx filed a lawsuit to compel the Agency to complete a recovery plan to bring the species back from the brink of extinction. Threats to the lynx include loss of habitat and connectivity from improper forest management, development, and climate change, and mortality from starvation, predation, poaching, and incidental trapping.
The goal of the ESA is to prevent the extinction of and to provide for the eventual de-listing of imperiled species. As such, the U.S. Fish and Wildlife Service is required to adopt and implement recovery plans for all listed species that describe the specific actions needed to achieve de-listing, include measurable criteria, and estimate the time and costs required to achieve recovery goals.
“Recovery plans are one of the most important tools to ensure a species does not go extinct,” said Matthew Bishop, an attorney with the Western Environmental Law Center in Helena who is representing the wildlife advocacy groups in the case. “The ESA-mandated plan provides a road map to eventual de-listing by laying out what needs to happen and how best to get there,” added Bishop.
“Lynx will never fully recover in Montana and throughout the rest of their range in the lower 48 states until state and federal agencies have coordinated, concrete conservation actions designed to promote their recovery,” said Arlene Montgomery, Program Director of Friends of the Wild Swan. “Recovery plans are vital to ensuring that lynx not only persist, but thrive. They address the threats and provide the strategy that will lead to recovering lynx that builds upon the Endangered Species Act listing and designation of critical habitat.”
“Offering the Canada lynx protection under the Endangered Species Act absent a Recovery Plan, the Service merely created a paper tiger,” explained Duane Short, Wild Species Program Director for Biodiversity Conservation Alliance. “Its legal obligation to develop and implement a Recovery Plan is intended to produce meaningful actions that will actually enhance long-term survival of the species. Listing the lynx as Threatened under the Act, absent a Recovery Plan, is a job left undone.”
“The lynx’s recovery continues to be hampered by a ‘business as usual’ mentality from the federal and state agencies,” added Bishop. “Recent data suggest the lynx population in Montana may be in decline, and yet we’re still seeing development, trapping and snaring, roads, and industrial logging projects – including clear cuts – in some of the last remaining areas still occupied by lynx, including protected critical habitat,” said Bishop. “Coordination among the various entities at the federal, state, and local level is needed to address the cumulative effects of these activities on lynx and their habitat. This is exactly what a federal recovery plan can do.”
The Western Environmental Law Center is representing Friends of the Wild Swan, Rocky Mountain Wild, San Juan Citizens Alliance, and the Biodiversity Conservation Alliance.
The Wild Nature Institute has produced a new video, “Forests Born of Fire.” Western US forests burned by high-intensity fire are important and rare wildlife habitat – but widespread policies of salvage logging, and of logging intended to reduce the likelihood of fire on private and public lands, harm this habitat.
The video was filmed in burned forests of the Lassen National Forest of California. The idea was conceived, the script written, the footage gathered, and the video narrated and edited entirely by biologists studying wildlife that use burned forests. Read more about WNI’s work to study and protect wildlife in burned forests.