Archive for the ‘Ecosystem assessment’ Category

Ecometrica’s Normative Biodiversity Metric: is it really a good idea?

Sunday, February 12th, 2012

Ecometrica, a Scottish consultancy, has just published guidelines for a new biodiversity metric. The Normative Biodiversity Metric (NBM) uses an interesting shortcut between “pristine” land and biodiversity to assess the overall land-holdings of the organization being assessed.

Because the metric uses widely applicable classes of “pristineness”, it can itself be applied widely and at various spatial scales. In fact, the NBM relies on existing mapped data on land-use and land-cover. This wide applicability is the metric’s main strength.
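To make the shortcut concrete, here is a minimal, hypothetical sketch of an area-weighted “pristineness” score computed from mapped land-cover classes. The class names and weights are illustrative assumptions, not values from Ecometrica’s guidelines.

```python
# A minimal sketch (not Ecometrica's actual formula) of an area-weighted
# "pristineness" score over an organisation's land-holdings.
# The class weights below are illustrative assumptions, not NBM values.

PRISTINENESS_WEIGHTS = {
    "pristine": 1.0,      # assumed weight for near-natural land cover
    "semi-natural": 0.5,  # assumed intermediate weight
    "intensive": 0.1,     # assumed weight for farmed / managed land
    "artificial": 0.0,    # built-up or sealed surfaces
}

def nbm_like_score(holdings):
    """holdings: list of (land_cover_class, area_in_hectares) tuples,
    typically derived from existing land-use / land-cover maps."""
    total_area = sum(area for _, area in holdings)
    weighted = sum(PRISTINENESS_WEIGHTS[cls] * area for cls, area in holdings)
    return weighted / total_area if total_area else 0.0

# Example: a holding that is mostly intensively used scores low.
print(nbm_like_score([("pristine", 20), ("intensive", 180)]))  # -> 0.19
```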

In trying to apply concepts and ideas developed for greenhouse gas (GHG) emissions to biodiversity, Ecometrica has chosen to simplify the latter into a single, easy-to-use metric. Why not? That choice does, however, raise the issue of over-simplification. When does “pristine” actually equate to biodiversity, and is that particular biodiversity the most relevant one to consider when assessing a corporation’s impact?

The NBM is designed to provide an equivalent of corporate GHG assessment for biodiversity impact.

The documentation shows that the metric can incorporate additional field information, e.g. from surveys of the species or habitats that are actually present on-site. Yet, it is clear that the metric was developed to avoid field surveys as much as possible:

the biodiversity assessment methodology cannot be wholly dependent on the use of ecological surveys carried out by experts

Is that really a good idea? As usual, it depends on what you use the metric for…

Happy New Year, with new environmental regulations in France

Monday, January 2nd, 2012

At the close of 2011, the French government finally published its new regulations concerning environmental impact assessment and public consultations. It’s a nice Christmas present… and these changes will play a defining role in the new year.

  • Décret n° 2011-2018 du 29 décembre 2011 portant réforme de l’enquête publique relative aux opérations susceptibles d’affecter l’environnement (reforming the public inquiry procedure for operations likely to affect the environment)
  • Décret n° 2011-2019 du 29 décembre 2011 portant réforme des études d’impact des projets de travaux, d’ouvrages ou d’aménagements (reforming the environmental impact studies required for construction, infrastructure and land development projects)
These regulations will be applicable as of June 1st, 2012. They will bring about considerable change in the way biodiversity and ecosystems are taken into account in development projects and land planning. We will discuss these changes here in the coming weeks.

    The UK national ecosystem assessment is out!

    Wednesday, June 8th, 2011

The UK National Ecosystem Assessment has been finalized and is being published online.

Started in mid-2009 and led by Robert Watson and Steve Albon, the assessment is the first analysis of the UK’s natural environment in terms of the benefits it provides to society and continuing economic prosperity.

    The key findings of the assessment were made available on June 2nd (pdf here) while specific technical chapters will be made available through June.

Until then, the 87 pages of the synthesis report should keep you busy! Below are some of the main points raised by the assessment:

    The authors mention the need to increase food production while at the same time decreasing its negative effects on ecosystem services. In fact, the idea is to harness ecosystem services to actually increase production. This “sustainable intensification” is what the French call “ecological intensification”.

    Reversing declines in ecosystem services will require the adoption of more resilient ways of managing ecosystems, and a better balance between production and other ecosystem services – one of the major challenges is to increase food production, but with a smaller environmental footprint through sustainable intensification.

Not surprisingly, the assessment also raises the issue of ecosystem services being undervalued in decision making, and the suggested solution is to take the monetary and non-monetary values of ecosystems into account in everyday decision making.

    Contemporary economic and participatory techniques allow us to take into account the monetary and non-monetary values of a wide range of ecosystem services.

The assessment uses six contrasting scenarios to explore alternative futures for ecosystem services in the UK.

    The six scenarios used in the UK national ecosystem assessment

    Choose yours!

It is also worth noting that the assessment’s conceptual framework seems to focus on the “goods” that depend (at least in part) on ecosystem services as the linkage between ecosystems and human well-being. A closer look at the figure below shows that, in fact, the authors have grouped under the label “goods” all use and non-use, material and non-material benefits from ecosystems that have value for people.

    The conceptual framework of the UK national ecosystem assessment

    Key issues in developing rapid assessment methods of ecosystem state

    Friday, February 25th, 2011

David K. Rowe and his colleagues from the National Institute of Water and Atmospheric Research of New Zealand have developed a rapid method to score stream reaches. In presenting their method, they go through several of the key steps (and difficulties) in developing such “rapid assessment methods”. We summarize these below:

  • Scores are often given in reference to a desirable state (or “best of a kind”). This is helpful for ensuring that all assessors share the same upper bounds in their assessment. Selecting reference sites is, however, a tricky task, particularly if assessments focus not on an ecosystem as a whole but on separate “functions”. Rowe et al. (2009) raise the issue of artificial streams performing certain functions better than reference sites. In this case, they argue that this “over-performance” should be ignored: the artificial stream should be given the maximum score – that of the reference – for the particular function.
  • The selection of variables is a key step in method development. It requires an understanding of the system being assessed or of the main drivers of the system’s state (i.e. a conceptual model of the system). These conceptual models can be tested using field data. As an example, Delaware’s DERAP method was built through multiple regression analysis of independent evaluations of wetland state against a set of stressor variables (on 250 different wetlands!).
  • Developing algorithms for combining several variables into single scores is where many methods fail to convince (see e.g. McCarthy et al. 2004). Algorithms can be tested against results from established methods or best professional judgement, using field sites or consensual reference sites for example. Alternatively, statistical models can be used to weight the variables (as in the development of DERAP). A minimal sketch of such a scoring algorithm is given after this list.
  • Redundancy is unavoidable because of the interdependence of the many processes being assessed. Moreover, redundancy contributes to robustness in the face of user/assessor subjectivity. As an example, Florida’s UMAM method relies on best professional judgement but gives detailed guidelines through a list of criteria that are very redundant. The robustness of a method to user bias can be assessed through sensitivity analysis.
  • Once a method has been proposed, it must be revised and improved through testing and feedback from widespread use.
  • The team who developed California’s rapid assessment method (CRAM) also made recommendations concerning model development (available here). They offer a more formalized step-by-step process that includes several of the points raised by Rowe and his co-authors.
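To make the reference-based scoring and the weighted combination of variables more concrete, here is a minimal, hypothetical sketch. It is not Rowe et al.’s method: the functions, reference values and weights are illustrative assumptions only.

```python
# A minimal, hypothetical sketch of a rapid-assessment scoring algorithm.
# Functions, reference values and weights are illustrative assumptions.

def function_score(observed, reference):
    """Score one function against its reference ("best of a kind") site.
    Over-performance relative to the reference is ignored (capped at 1)."""
    if reference <= 0:
        return 0.0
    return min(observed / reference, 1.0)

def combine(scores, weights):
    """Weighted combination of per-function scores into a single site score.
    Weights could come from expert judgement or, as in DERAP's development,
    from a regression of independent state evaluations on stressor variables."""
    total_weight = sum(weights.values())
    return sum(weights[f] * scores[f] for f in scores) / total_weight

site = {"shade": 0.8, "nutrient_retention": 1.3, "habitat_cover": 0.4}
reference = {"shade": 1.0, "nutrient_retention": 1.0, "habitat_cover": 1.0}
weights = {"shade": 1.0, "nutrient_retention": 2.0, "habitat_cover": 1.0}

scores = {f: function_score(site[f], reference[f]) for f in site}
print(combine(scores, weights))  # 0.8: nutrient retention is capped at the reference
```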

    The science of IPBES in Science…

    Friday, February 18th, 2011

    Science magazine published today a policy article about the challenges facing the recently launched Intergovernmental Platform on Biodiversity and Ecosystem Services (IPBES).

The authors, all of whom are key players in the international earth system science partnership, present the three key challenges for IPBES to reach its goals:

  • Strengthening the science
  • Strengthening assessments
  • Strengthening policy relevance
In discussing these challenges, they insist on the need to broaden partnerships and, in particular, to seek more input from the policy community, from less developed nations and from the social sciences.

    Greater involvement by social sciences is justified by the need to better incorporate “values” of biodiversity and ecosystem services in order to strengthen policy relevance. The authors argue that this valuation step was missing in the 2005 Millennium Ecosystem Assessment but put forward by the TEEB initiative in 2010. Let us hope that the IPBES will keep a critical outlook on this agenda.

    Applying the mitigation hierarchy: where is the avoidance?

    Wednesday, February 16th, 2011

In her 1996 paper, Barbara Bedford mentioned that wetland mitigation policies are in effect landscape-level policies for managing and distributing wetlands. In a paper* soon to be published in the journal Wetlands Ecology and Management, Shari Clare of the University of Alberta (Canada) and her co-authors take this point further by investigating if and how the mitigation sequence of avoiding, reducing and finally offsetting (or compensating) impacts is applied in the province of Alberta.

Through interviews with regulators, developers and other actors involved in wetland mitigation, they show that offsetting is systematically used to allow developers to get approval for their projects. They argue that the requirement to avoid impacts is not well enforced, in part because of:

  • the lack of guidelines on how to assess avoidance measures and alternatives (while, in contrast, there are established guidelines for designing and sizing offsets)
  • the lack of a province-wide vision of where development could occur and where avoidance should be sought (i.e. land-use planning does not play its role)
  • the lack of recognized economic value of wetlands (i.e. their “use-value” is not taken into consideration in assessing equivalence)
  • the belief that wetland functions are easy to (re)create or restore (i.e. “techno-arrogance”).
To address these issues, the authors suggest watershed-based planning in which wetlands are placed within a broader landscape context and alternative land-uses are prioritized. This is consistent with the conclusions of Bedford (1996), who argued that project-centred regulation (i.e. command-and-control) is insufficient to reach the goal of no net loss of wetland functions. Shari Clare and her co-authors mention systematic conservation planning as one methodology for developing such watershed-level approaches. More generally, having a strategic vision for managing wetland resources at the provincial (or watershed) level is necessary for regulators to be proactive in the permitting process (rather than reactive to developer requirements) and to effectively take into account cumulative effects (i.e. the accumulation of many small impacts and wetland losses).

    The authors also add that wetland functions need to be better “valued” and suggest that social and economic values be explicitly incorporated into the assessment process. They suggest using the concept of ecosystem services to this end but not necessarily through a monetary valuation exercise. This raises complex assessment and accounting issues but is probably an effective avenue for both the public and developers to acknowledge the purpose of wetland mitigation policies and the option of avoiding impacts.

    Beyond the question of avoidance measures, the paper also gives some interesting (frightening?) insight into the design and sizing of offsets:

    In Alberta, all of the government regulators we interviewed indicated that the most common metric used for comparability or equivalency between impacted and compensatory wetlands is area, with very little consideration given to wetland functions or services.

Having shown that the mitigation policy suffers from a lack of post-approval monitoring of offsets, the authors also argue for a stronger involvement of civil society in the monitoring and control of offset actions: if public authorities are unable to follow up on their decisions, the easy solution is to get volunteers to do the work. But perhaps that is too easy?

To conclude, the paper is a very interesting contribution to the argument that, beyond developing adequate methodologies for assessing the equivalence between losses and gains in the context of offsets, the proper implementation of the mitigation hierarchy requires public authorities to be proactive about their goals in terms of wetlands, biodiversity, ecosystems, etc. Being proactive means that a strategy must be formulated for managing these “resources” beyond each individual project.

* Reference: Clare, S., Krogman, N., Foote, L. & Lemphers, N. (2011). Where is the avoidance in the implementation of wetland law and policy? Wetlands Ecology and Management, in press.

    Biodiversity: the new carbon?

    Tuesday, February 8th, 2011

    The Guardian, a leading UK newspaper, recently published an interesting analysis of biodiversity as the new “carbon”. After discussing how biodiversity has emerged as a new issue for companies to incorporate in their business strategies, the article details the main motivations for this.

The first motivation mentioned is reputational risk, but the most interesting is the one concerning a company’s liability in case of impacts or damage to biodiversity. The Deepwater Horizon oil spill (BP) is used as an example. This underlines two things:

  • That the “business case” for incorporating biodiversity in business decisions and strategies is strongly dependent on an appropriate institutional context in which companies are liable for impacts on biodiversity. The requirement to avoid, reduce and offset impacts is one such context.
  • That anticipating possible impacts and the potential financial losses that could result from such impacts requires the development of impact assessment procedures and methods that can be parametrized in advance.
The USA has developed assessment methods to be applied in the context of Natural Resource Damage Assessment (NRDA) procedures, such as Habitat Equivalency Analysis and Resource Equivalency Analysis. In Europe, the 2004 Environmental Liability Directive will certainly make governments and environmental authorities push for the development of such methods.
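For readers unfamiliar with these methods, here is a minimal sketch of the discounted “service-acre-year” accounting that underlies Habitat Equivalency Analysis. The service trajectories and the 3% discount rate are illustrative assumptions, not prescribed NRDA values for any particular case.

```python
# A minimal sketch of Habitat Equivalency Analysis-style accounting.
# Service trajectories and the discount rate are illustrative assumptions.

def discounted_say(area, fractions, rate=0.03):
    """Discounted service-acre-years over `area` acres: fractions[t] is the
    fraction of baseline services lost (debit) or gained (credit) in year t."""
    return area * sum(f / (1 + rate) ** t for t, f in enumerate(fractions))

# Debit: 10 injured acres lose all services for 5 years, then half for 5 more.
debit = discounted_say(10, [1.0] * 5 + [0.5] * 5)

# Credit per restored acre: services ramp up from 10% to 100% over 10 years.
credit_per_acre = discounted_say(1, [t / 10 for t in range(1, 11)])

print(debit / credit_per_acre)  # acres of compensatory restoration required
```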

    Oil palm expansion in Indonesia: the case for trade-off analyses of ecosystem services

    Thursday, January 13th, 2011

    In a paper published in the Proceedings of the National Academy of Sciences of the USA (PNAS), Lian Pin Koh and Jaboury Ghazoul present a modelling framework for analysing trade-offs between palm oil production, biodiversity conservation and carbon sequestration.

    Informing policy-makers about these trade-offs is essential in the face of rapidly expanding plantations and the newly established REDD mechanisms (with a possible wildlife premium as discussed here).

Using a scenario-based approach, the authors assessed the consequences of alternative pathways of oil palm expansion on the area of primary and secondary forests, on forest biodiversity (modelled using species-area models), on carbon stocks (in biomass and peat soils) and on annual rice production capacity. They show that biodiversity and forest conservation are compatible with the expansion of oil palm production, through appropriate selection of planted areas.
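As an aside, here is a minimal sketch of the kind of species-area calculation that underlies such biodiversity projections; the exponent (z = 0.25) and the areas are illustrative assumptions, not values taken from the paper.

```python
# A minimal sketch of a species-area projection (the power law S = c * A**z).
# The exponent and areas are illustrative assumptions, not the paper's values.

def remaining_species_fraction(new_area, original_area, z=0.25):
    """Fraction of species expected to persist when habitat shrinks from
    original_area to new_area, under the species-area relationship."""
    return (new_area / original_area) ** z

# Converting half of a forested landscape to plantations:
print(remaining_species_fraction(50_000, 100_000))  # ~0.84, i.e. roughly 16% of forest species lost
```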

    Our results suggest that the environmental and land-use tradeoffs associated with oil-palm expansion can be largely avoided through the implementation of a properly planned and spatially explicit development strategy

    This rosy conclusion is tempered by the acknowledgement that striking the balance between the goals of biodiversity conservation, carbon sequestration and palm oil production will require the expansion of oil palm plantations to be capped. Are we really willing to make this “sacrifice”?

    The paper by Lian Pin Koh and Jaboury Ghazoul was critiqued by Sean Sloan and Nigel Stork (also in PNAS) for ignoring several spatial processes such as the aggregation of plantations. Lian Pin Koh and Jaboury Ghazoul downplayed the critique and argued for the usefulness of their tool for broad-based analyses of the issues in Indonesia.

    Towards no net loss, and beyond (in the UK)

    Friday, December 31st, 2010

    I had mentioned in a previous post how the UK was discussing policies for biodiversity offsets and habitat banking.

Conclusions from the Natural Capital Initiative’s third workshop, which took place in early December 2010, are not yet online, but they are discussed by Daniel Kandy of the Ecosystem Marketplace on their website.

    He argues that the workshop gave little hope for a national policy framework or strong government regulations on offsetting. A framework for voluntary offsets by developers is a more likely outcome of the current discussions, in particular under the new coalition government:

    Given the coalition government’s commitment to reducing regulation and meting out more power to local governments, a biodiversity offset program will more than likely be voluntary in nature and be regulated at the local level. After years of a Labour government opting for top-down regulatory approaches, the Conservative-Liberal Democrat coalition government has decided to move towards a less centralized form of government oversight.

    The government will state its position in the spring of 2011, in a white paper called the “Natural Environment Policy Paper”. Meanwhile, discussions continue. Stay tuned for the publication of the workshop’s conclusions by the Natural Capital Initiative themselves…

    The Tolstoy effect

    Sunday, December 12th, 2010

    In Anna Karenina, Tolstoy reminds us that “happy families are all alike” while “every unhappy family is unhappy in its own way”.

    Emilie Stander and Joan Ehrenfeld concluded that the same was true for wetlands. They studied the functioning of wetlands used as (supposedly “pristine”) reference wetlands for wetland mitigation in New Jersey (USA) and found that in this heavily urbanized setting, even reference wetlands were “unhappy”… (pdf here)

    This raises issues for using typologies of wetlands in the assessment of wetland states (as in the context of wetland mitigation in the USA).

Identifying reference wetlands on the basis of standard structural indicators is misleading when wetlands sit in heavily modified landscapes and watersheds. The authors suggest that, instead, multi-year data on functioning should be used to create appropriate typologies of wetland functioning.
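A minimal, hypothetical sketch of what such a functional typology could look like in practice: wetlands are grouped by multi-year functional measurements rather than by structural indicators. The variables, values and cluster count are illustrative assumptions only.

```python
# A minimal, hypothetical sketch of a functional typology of wetlands built
# from multi-year monitoring data. Variables and values are illustrative.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# One row per wetland: multi-year means of functional measurements, e.g.
# [denitrification rate, water-level fluctuation, sediment accretion].
functional_data = np.array([
    [4.2, 0.3, 1.1],
    [3.9, 0.4, 1.0],
    [1.1, 1.8, 0.2],
    [0.9, 2.1, 0.3],
    [2.5, 1.0, 0.6],
])

X = StandardScaler().fit_transform(functional_data)  # put variables on one scale
groups = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(groups)  # wetlands sharing a functional "type", regardless of structure
```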

    A further step would be to use “theoretical” references for assessing wetland state but this would most likely make in-the-field assessment more difficult.