Posts Tagged ‘Ecological equivalence’

The metrics debate: habitat for middle-aged great blue herons who don’t like shrimp?

Sunday, April 22nd, 2012

Whenever discussions on biodiversity offsets get technical, they focus either on legal and cost issues (if you’re paying) or on their underlying ecological reality (if you’re the regulator). Concerning the latter, the question is how to actually assess equivalence between what is lost on the one hand and what the offset generates on the other. So it’s all about what and how you measure to assess those gains and losses – hence the metrics debate.

In his blog, Morgan Robertson presents this issue as a “paradox”:

I’ve been thinking about this for a long time — in fact it seems like everything I’ve ever written boils down to “defining environmental commodities is HARD because ecology is complex and commodities need to be abstract”.

The paradox is that the metrics must strike a difficult balance between their ecological precision and their ability to foster exchanges on a market for offsets.

Too much precision (i.e. the “habitat for middle-aged great blue herons who don’t like shrimp” of Robertson fame since 2004) might better reflect the complexities, or rather the ecological uniqueness, of each location (and time) being assessed. It would however make any market completely useless… At the other extreme, a metric that hardly encompasses these complexities (try wetland area) would make credits highly fungible, at the cost of ecological meaning.

This paradox should be on the mind of anyone developing metrics or methods for assessing ecological equivalence or credit-debit systems, or using them to actually design an offset scheme. The same applies to any type of ecosystem service market of course (PES or otherwise).
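The trade-off can be made concrete with a toy credit-debit sketch (all numbers and function scores below are hypothetical, not drawn from any real scheme): under an area-only metric any two wetlands of equal area are interchangeable, while even a single functional weight already rules out trades between dissimilar sites.

```python
# Toy illustration of the metrics paradox (all numbers hypothetical).
# A coarse metric (area only) makes every equal-area trade acceptable;
# a finer metric (area weighted by a functional score) rejects trades
# between ecologically dissimilar sites.

def credits_area(site):
    """Coarse metric: credits = wetland area (ha)."""
    return site["area_ha"]

def credits_weighted(site):
    """Finer metric: area weighted by a 0-1 functional score."""
    return site["area_ha"] * site["function_score"]

impacted = {"area_ha": 10.0, "function_score": 0.9}  # high-functioning wetland lost
offset   = {"area_ha": 10.0, "function_score": 0.3}  # degraded site offered as offset

def trade_ok(metric, loss, gain):
    """A trade is acceptable if generated credits cover the debit."""
    return metric(gain) >= metric(loss)

print(trade_ok(credits_area, impacted, offset))      # True: area metric allows the trade
print(trade_ok(credits_weighted, impacted, offset))  # False: weighted metric refuses it
```

Pushing the metric towards full ecological precision would mean that almost no offset site ever matches the impacted one, which is the “market completely useless” end of the paradox.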

It is interesting to note that in their pilot schemes for testing habitat banking, France and the United Kingdom have made very different choices in terms of metrics. More on this later…

Key issues in developing rapid assessment methods of ecosystem state

Friday, February 25th, 2011

David K. Rowe and his colleagues from the National Institute of Water and Atmospheric Research of New Zealand have developed a rapid method to score stream reaches. In presenting their method, they go through several of the key steps (and difficulties) in developing such “rapid assessment methods”. We summarize these below:

  • Scores are often given in reference to a desirable state (or “best of a kind”). This is helpful for ensuring that all assessors share the same upper bounds in their assessment. Selecting reference sites is however a tricky task, particularly if assessments do not focus on an ecosystem as a whole but on separate “functions”. Rowe et al. (2009) raise the issue of artificial streams performing certain functions better than reference sites. In this case they argue that this “over-performance” should be ignored: the artificial stream should be given the maximum score – that of the reference – for the particular function.
  • The selection of variables is a key step in method development. It requires an understanding of the system being assessed and of the main drivers of the system’s state (i.e. a conceptual model of the system). These conceptual models can be tested using field data. As an example, Delaware’s DERAP method was built through multiple regression analysis of independent evaluations of wetland states against a set of stressor variables (on 250 different wetlands!).
  • Developing algorithms for combining several variables into single scores is where many methods fail to convince (see e.g. McCarthy et al. 2004). Algorithms can be tested against results from established methods or best professional judgement, for example using field sites or consensual reference sites. Alternatively, statistical models can be used to weight the variables (as in the development of DERAP).
  • Redundancy is unavoidable because of the interdependence of the many processes being assessed. Moreover, redundancy contributes to robustness in the face of user/assessor subjectivity. As an example, Florida’s UMAM method relies on best professional judgement but gives detailed guidelines through a list of criteria that are very redundant. The robustness of a method to user bias can be assessed through sensitivity analysis.
  • Once a method has been proposed, it must be revised and improved through testing and feedback from widespread use.
  • The team who developed California’s rapid assessment method (CRAM) also made recommendations concerning model development (available here). They offer a more formalized step-by-step process that includes several of the points raised by Rowe and his co-authors.
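As a rough sketch of the DERAP-style approach described above (synthetic data; the stressor names and coefficients are hypothetical): independent condition evaluations are regressed against field-scored stressor variables, and the fitted coefficients then become the weights of the rapid method.

```python
import numpy as np

# Sketch of weighting variables by regression, as in the development of
# DERAP (synthetic data; stressor names and penalties are hypothetical).
rng = np.random.default_rng(0)
n = 250  # number of assessed wetlands, as in the DERAP dataset

# Stressor variables scored in the field (0 = absent, 1 = severe).
ditching  = rng.uniform(0, 1, n)
mowing    = rng.uniform(0, 1, n)
invasives = rng.uniform(0, 1, n)

# Independent expert evaluation of wetland condition (100 = reference state).
# Here the "true" penalties are -40, -25 and -15, plus assessment noise.
condition = 100 - 40 * ditching - 25 * mowing - 15 * invasives + rng.normal(0, 3, n)

# Fit condition ~ intercept + stressors by ordinary least squares.
X = np.column_stack([np.ones(n), ditching, mowing, invasives])
coefs, *_ = np.linalg.lstsq(X, condition, rcond=None)

# The fitted coefficients become the weights of the rapid method:
# a new site is scored by plugging its stressor values into the model.
def rapid_score(d, m, i):
    return coefs[0] + coefs[1] * d + coefs[2] * m + coefs[3] * i

print(np.round(coefs, 1))            # close to [100, -40, -25, -15], up to noise
print(rapid_score(1.0, 0.0, 0.0))    # heavily ditched but otherwise intact site
```

This also illustrates the testing step: the fitted weights can be checked against the known behaviour of reference and heavily stressed sites before the method is released for widespread use.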

Applying the mitigation hierarchy: where is the avoidance?

Wednesday, February 16th, 2011

In her 1996 paper, Barbara Bedford mentioned that wetland mitigation policies are in effect landscape-level policies for managing and distributing wetlands. In a paper* soon to be published in the journal Wetlands Ecology and Management, Shari Clare of the University of Alberta (Canada) and her co-authors take this point further by investigating whether and how the mitigation sequence of avoiding, reducing and finally offsetting (or compensating) impacts is applied in the province of Alberta.

Through interviews with regulators, developers and other actors involved in wetland mitigation, they show that offsetting is systematically used to allow developers to get approval for their projects. They argue that the requirement to avoid impacts is not well enforced, in part because of:

  • the lack of guidelines on how to assess avoidance measures and alternatives (while, in contrast, there are established guidelines for designing and sizing offsets);
  • the lack of a province-wide vision of where development could occur and where avoidance should be sought (i.e. land-use planning does not play its role);
  • the lack of recognized economic value of wetlands (i.e. their “use-value” is not taken into consideration in assessing equivalence);
  • the belief that wetland functions are easy to (re)create or restore (i.e. “techno-arrogance”).
To address these issues, the authors suggest watershed-based planning in which wetlands are placed within a broader landscape context and alternative land-uses are prioritized. This is consistent with the conclusions of Bedford (1996), who argued that project-centred regulation (i.e. command-and-control) is insufficient to reach the goal of no-net-loss of wetland functions. Shari Clare and her co-authors mention systematic conservation planning as one methodology for developing such watershed-level approaches. More generally, having a strategic vision for managing wetland resources at the provincial (or watershed) level is necessary for regulators to be proactive in the permitting process (rather than reactive to developer requirements) and to effectively take into account cumulative effects (the many small impacts and wetland losses).

The authors also add that wetland functions need to be better “valued” and suggest that social and economic values be explicitly incorporated into the assessment process. They suggest using the concept of ecosystem services to this end, but not necessarily through a monetary valuation exercise. This raises complex assessment and accounting issues but is probably an effective avenue for both the public and developers to acknowledge the purpose of wetland mitigation policies and the option of avoiding impacts.

Beyond the question of avoidance measures, the paper also gives some interesting (frightening?) insight into the design and sizing of offsets:

    In Alberta, all of the government regulators we interviewed indicated that the most common metric used for comparability or equivalency between impacted and compensatory wetlands is area, with very little consideration given to wetland functions or services.

Having shown that the mitigation policy suffers from a lack of post-approval monitoring of offsets, the authors also argue for a stronger involvement of civil society in the monitoring and control of offset actions: if public authorities are unable to follow up on their decisions, the easy solution is to have volunteers do the work. But perhaps that is too easy?

To conclude, the paper is a very interesting contribution to the argument that, beyond developing adequate methodologies for assessing the equivalence between losses and gains in the context of offsets, the proper implementation of the mitigation hierarchy requires public authorities to be proactive about their goals for wetlands, biodiversity, ecosystems etc. Being proactive means formulating a strategy for managing these “resources” beyond each individual project.

* Reference of the paper: Clare, S., Krogman, N., Foote, L. & Lemphers, N. (2011). Where is the avoidance in the implementation of wetland law and policy? Wetlands Ecology and Management, in press.