Archive for the ‘Wetland mitigation’ Category

The metrics debate: habitat for middle-aged great blue herons who don’t like shrimp?

Sunday, April 22nd, 2012

Whenever discussions on biodiversity offsets get technical, they focus either on legal and cost issues (if you’re paying) or on their underlying ecological reality (if you’re the regulator). Concerning the latter, the question is how to actually assess equivalence between what is lost, on the one hand, and what the offset generates, on the other. It all comes down to what you measure, and how, to assess those gains and losses – hence the metrics debate.

On his blog, Morgan Robertson describes this issue as a “paradox”.

I’ve been thinking about this for a long time — in fact it seems like everything I’ve ever written boils down to “defining environmental commodities is HARD because ecology is complex and commodities need to be abstract”.

The paradox is that the metrics must strike a difficult balance between their ecological precision and their ability to foster exchanges on a market for offsets.

Too much precision (e.g. the “habitat for middle-aged great blue herons who don’t like shrimp” that Robertson has made famous since 2004) might better reflect the complexities, or rather the ecological uniqueness, of each location (and time) being assessed. It would, however, make any market completely useless… At the other extreme, a metric that hardly encompasses these complexities (try wetland area) would make credits highly fungible and trades easy, at the price of glossing over ecological reality.
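To see how precision kills fungibility, here is a toy simulation (entirely ours, with invented numbers, nothing from Robertson’s work): offset supply and demand are drawn at random, and a debit can only be traded against a credit of exactly the same type under the chosen metric.

import random

# Toy illustration of the metrics paradox. Credits and debits are typed
# by a metric with k classes: k = 1 is a coarse metric like "wetland
# area"; very large k approaches "habitat for middle-aged great blue
# herons who don't like shrimp". All numbers are invented.
random.seed(1)

def matchable_fraction(n_debits: int, n_credits: int, k: int) -> float:
    """Fraction of debits for which at least one credit of the same type exists."""
    debits = [random.randrange(k) for _ in range(n_debits)]
    credit_types = {random.randrange(k) for _ in range(n_credits)}
    return sum(d in credit_types for d in debits) / n_debits

for k in (1, 10, 100, 1000):
    print(f"{k:>4} classes -> {matchable_fraction(200, 200, k):.0%} of debits tradable")

With a single class, everything trades; with a thousand classes, most debits find no matching credit and the market thins out accordingly.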

This paradox should be on the mind of anyone developing metrics or methods for assessing ecological equivalence or credit-debit systems, or using them to actually design an offset scheme. The same applies, of course, to any type of ecosystem service market (PES or otherwise).

It is interesting to note that in their pilot schemes for testing habitat banking, France and the United Kingdom have made very different choices in terms of metrics. More on this later…

Key issues and solutions for designing and sizing biodiversity offsets

Friday, October 14th, 2011

Habitat loss through development is one of the major causes of biodiversity loss. The increasingly common legal requirement to first avoid, then reduce and, if necessary, offset the impacts of plans and projects on biodiversity has, however, not always been appropriately enforced. The blame lies mainly with poor governance, such as patchy monitoring or poorly defined liabilities. Biodiversity offsets also suffer from the lack of formal methods for designing and sizing offset requirements.

In a paper recently published in Biological Conservation, Fabien Quétier (who is involved in this blog) and Sandra Lavorel address this gap by reviewing the different tools, methods and guidelines that have been developed in different regulatory contexts to design and size biodiversity offsets.

They formulate a typology of approaches that variously combine the methods and guidelines reviewed, and then discuss how these relate to the objectives of offset policies, the components of biodiversity and ecosystems to which they apply, and the key issues for ecological equivalence.

One of the key messages from the paper might be that when gains are not realistic, e.g. because we do not know how to enhance or restore a habitat or ecosystem function (i.e. they are non-renewable), then protection of as-yet unprotected habitats or ecosystems is the only realistic offset option.

This has several consequences, the most notable being that, in effect, using protection as an offset means we assign a ratio of acceptable loss to the remaining unprotected habitat or ecosystem. For example, protecting 3 hectares for every unprotected hectare lost actually means that we accept losing a quarter of the unprotected area. This in turn means we must think strategically about what we want to do with that quarter… which is then a non-renewable resource too!
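To spell out the arithmetic: if every hectare of unprotected habitat lost must be offset by protecting r hectares of the same, as-yet unprotected, habitat, then each permitted impact consumes r + 1 hectares of the unprotected stock (1 lost, r protected). The implied acceptable loss is therefore

accepted loss fraction = 1 / (r + 1)

and with a 3:1 protection ratio (r = 3) we implicitly write off 1 / (3 + 1) = 25% of whatever unprotected habitat remains.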

New books on the shelves

Friday, September 23rd, 2011

Several new books on the topic of market-based instruments for nature conservation were recently published (or soon will be). Expect to see reviews here soon.

The first book is by Royal Gardner, a law specialist who has worked on wetland mitigation in the USA. Entitled Lawyers, Swamps, and Money: U.S. Wetland Law, Policy, and Politics, the book provides an in-depth look into the inner workings of the wetland mitigation “industry” and especially its governance. You can take a look on Amazon.

The second book is by Ece Ozdemiroglu of the British consultancy EFTEC. It will provide guidance on ecological equivalency methods that can be applied to biodiversity offsets and payment for ecosystem service schemes. Here is what her personal page on the EFTEC website says:

Her next book (with Josh Lipton and David Chapman, forthcoming in 2011 by Springer) will be on the use of resource equivalency (including economic valuation) methods for assessing environmental damage and liability and selecting the appropriate compensation measures. This will help implement European Directives of Habitats, Wild Birds and Environmental Liability as well as input to new policy instruments like biodiversity offsetting, payments for ecosystem services and habitat banking.

According to Open Trolley, the expected publication date is September 29th, with the title “Equivalency Methods for Environmental Liability in the European Union: Assessing Damage and Compensation Under the Environmental Liability Directive”. Most of the contents probably reflect EFTEC’s work as part of the EU-funded REMEDE project, which provides lots of interesting insights.

If you haven’t read it yet, you can still have a look at Carroll, Fox and Bayon’s book on conservation and biodiversity banking published by EarthScan.

Key issues in developing rapid assessment methods of ecosystem state

Friday, February 25th, 2011

David K. Rowe and his colleagues from New Zealand’s National Institute of Water and Atmospheric Research have developed a rapid method to score stream reaches. In presenting their method, they go through several of the key steps (and difficulties) in developing such “rapid assessment methods”. We summarize these below:

  • Scores are often given in reference to a desirable state (or “best of a kind”). This is helpful for ensuring that all assessors share the same upper bounds in their assessments. Selecting reference sites is, however, a tricky task, in particular if assessments do not focus on an ecosystem as a whole but on separate “functions”. Rowe et al. (2009) raise the issue of artificial streams performing certain functions better than reference sites. In this case, they argue that this “over-performance” should be ignored: the artificial stream should be given the maximum score – that of the reference – for the particular function.
  • The selection of variables is a key step in method development. It requires an understanding of the system being assessed and of the main drivers of its state (i.e. a conceptual model of the system). Such conceptual models can be tested using field data. As an example, Delaware’s DERAP method was built through multiple regression analysis of independent evaluations of wetland states against a set of stressor variables (on 250 different wetlands!).
  • Developing algorithms for combining several variables into single scores is where many methods fail to convince (see e.g. McCarthy et al. 2004). Algorithms can be tested against results from established methods or best professional judgement, using field sites or consensual reference sites for example. Alternatively, statistical models can be used to weight the variables (as in the development of DERAP). A minimal sketch of such a scoring algorithm is given after this list.
  • Redundancy is unavoidable because of the interdependence of the many processes being assessed. Moreover, redundancy contributes to robustness in the face of user/assessor subjectivity. As an example, Florida’s UMAM method relies on best professional judgement but gives detailed guidelines through a list of criteria that are very redundant. The robustness of a method to user bias can be assessed through sensitivity analysis.
  • Once a method has been proposed, it must be revised and improved through testing and feedback from widespread use.
  • The team who developed California’s rapid assessment method (CRAM) also made recommendations concerning model development (available here). They offer a more formalized step-by-step process that includes several of the points raised by Rowe and his co-authors.
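As promised above, here is a minimal sketch of such a scoring algorithm: each variable is normalized against its reference value, capped at the reference (so over-performance is ignored, as Rowe et al. argue), and the normalized scores are combined through weights. Variable names, reference values and weights are all hypothetical; this is not the actual DERAP, UMAM or CRAM formula.

# Hypothetical reference ("best of a kind") values for three assessed
# functions of a stream reach. Invented for illustration.
REFERENCE = {"shade": 0.80, "invertebrate_density": 450.0, "bank_stability": 0.90}

# Hypothetical weights; in practice these could be set by expert consensus
# or estimated statistically, e.g. by regressing independent expert
# evaluations against stressor variables, as in DERAP's development.
WEIGHTS = {"shade": 0.5, "invertebrate_density": 0.3, "bank_stability": 0.2}

def reach_score(observed: dict) -> float:
    """Score a stream reach on a 0-1 scale against reference conditions."""
    score = 0.0
    for var, ref in REFERENCE.items():
        # Normalize against the reference and cap at 1.0 so that
        # "over-performance" earns only the reference (maximum) score.
        score += WEIGHTS[var] * min(observed[var] / ref, 1.0)
    return score

# An artificial stream that over-performs on invertebrate density still
# only gets the maximum score for that function.
print(reach_score({"shade": 0.60, "invertebrate_density": 600.0, "bank_stability": 0.70}))

Testing such an algorithm then means comparing its output with scores from established methods or best professional judgement across field sites, and running a sensitivity analysis to check its robustness to assessor bias.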

Applying the mitigation hierarchy: where is the avoidance?

Wednesday, February 16th, 2011

In her 1996 paper, Barbara Bedford pointed out that wetland mitigation policies are, in effect, landscape-level policies for managing and distributing wetlands. In a paper* soon to be published in the journal Wetlands Ecology and Management, Shari Clare of the University of Alberta (Canada) and her co-authors take this point further by investigating whether and how the mitigation sequence of avoiding, reducing and finally offsetting or compensating is applied in the province of Alberta.

Through interviews with regulators, developers and other actors in wetland mitigation, they show that offsetting is systematically used to allow developers to get approval for their projects. They argue that the requirement to avoid impacts is not well enforced, in part because of:

  • the lack of guidelines on how to assess avoidance measures and alternatives (while, in contrast, there are established guidelines for designing and sizing offsets);
  • the lack of a province-wide vision of where development could occur and where avoidance should be sought (i.e. land-use planning does not play its role);
  • the lack of recognized economic value of wetlands (i.e. their “use-value” is not taken into consideration in assessing equivalence);
  • the belief that wetland functions are easy to (re)create or restore (i.e. “techno-arrogance”).
To address these issues, the authors suggest watershed-based planning, whereby wetlands are placed within a broader landscape context and alternative land-uses are prioritized. This is consistent with the conclusions of Bedford (1996), who argued that project-centred regulation (i.e. command-and-control) is insufficient to reach the goal of no-net-loss of wetland functions. Shari Clare and her co-authors mention systematic conservation planning as one methodology for developing such watershed-level approaches. More generally, a strategic vision for managing wetland resources at the provincial (or watershed) level is necessary for regulators to be proactive in the permitting process (rather than reactive to developer requirements) and to effectively take into account cumulative effects (i.e. the accumulation of many small impacts and wetland losses).

The authors also add that wetland functions need to be better “valued” and suggest that social and economic values be explicitly incorporated into the assessment process. They suggest using the concept of ecosystem services to this end, though not necessarily through a monetary valuation exercise. This raises complex assessment and accounting issues but is probably an effective avenue for getting both the public and developers to acknowledge the purpose of wetland mitigation policies and the option of avoiding impacts.

Beyond the question of avoidance measures, the paper also gives some interesting (frightening?) insight into the design and sizing of offsets:

In Alberta, all of the government regulators we interviewed indicated that the most common metric used for comparability or equivalency between impacted and compensatory wetlands is area, with very little consideration given to wetland functions or services.

Having shown that the mitigation policy suffers from a lack of post-approval monitoring of offsets, the authors also argue for a stronger involvement of civil society in the monitoring and control of offset actions: if public authorities are unable to follow up on their decisions, then the easy solution is to get volunteers to do the work. But perhaps that is too easy?

To conclude, the paper is a very interesting contribution to the argument that, beyond developing adequate methodologies for assessing the equivalence between losses and gains in the context of offsets, the proper implementation of the mitigation hierarchy requires public authorities to be proactive about their goals in terms of wetlands, biodiversity, ecosystems, etc. Being proactive means formulating a strategy for managing these “resources” beyond each individual project.

* Reference: Clare, S., Krogman, N., Foote, L. & Lemphers, N. (2011). Where is the avoidance in the implementation of wetland law and policy? Wetlands Ecology and Management, in press.

The Tolstoy effect

Sunday, December 12th, 2010

In Anna Karenina, Tolstoy reminds us that “happy families are all alike” while “every unhappy family is unhappy in its own way”.

Emilie Stander and Joan Ehrenfeld concluded that the same was true for wetlands. They studied the functioning of wetlands used as (supposedly “pristine”) reference wetlands for wetland mitigation in New Jersey (USA) and found that, in this heavily urbanized setting, even reference wetlands were “unhappy”… (pdf here)

This raises issues for using typologies of wetlands in the assessment of wetland states (as in the context of wetland mitigation in the USA).

Identifying reference wetlands on the basis of standard structural indicators is misleading when wetlands lie in heavily modified landscapes and watersheds. The authors suggest that, instead, multi-year data on functioning should be used to create appropriate typologies of wetland functioning.
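As a minimal sketch of what building such a functional typology could involve (entirely our illustration, with invented indicators and numbers, not Stander and Ehrenfeld’s method): cluster wetlands on multi-year averages of functional indicators rather than on structural classes.

import numpy as np
from sklearn.cluster import KMeans

# Hypothetical multi-year functional indicators for six wetlands (rows):
# e.g. mean growing-season water-table depth (m), denitrification rate,
# and litter decomposition rate, each averaged over several years.
# All values are invented for illustration.
X = np.array([
    [0.10, 4.2, 0.8],
    [0.12, 4.0, 0.9],
    [0.55, 1.1, 0.3],
    [0.60, 1.3, 0.2],
    [0.30, 2.5, 0.6],
    [0.35, 2.7, 0.5],
])

# Standardize so each indicator contributes comparably, then cluster:
# each cluster is a candidate "functional type" of wetland.
Xz = (X - X.mean(axis=0)) / X.std(axis=0)
functional_type = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(Xz)
print(functional_type)  # e.g. [0 0 1 1 2 2]: three functional types

Each cluster would then serve as a candidate functional type within which comparable reference wetlands can be sought.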

A further step would be to use “theoretical” references for assessing wetland state, but this would most likely make in-the-field assessment more difficult.