Extinction risk scales better to generations than to years
Abstract
It is critical to search for, and to apply, robust generalizations in conservation biology, as species-specific data on endangered species are often limited. While generalizations are common in conservation genetics, where processes are treated on the scale of generations, the unique population dynamics of individual species are often stressed in ecology and conservation management. Is the apparent uniqueness of population attributes partly an artefact of measurement scale? One facet of this debate is the question of whether extinction risk scales better to years or to generations. To resolve this issue, the extinction risk of 100 well-studied vertebrate taxa was estimated using stochastic computer projections, and analyses were conducted to determine whether risk related more closely to years or to generations. The relative strengths of evidence for the alternative hypotheses were assessed using information theory. Extinction risk, assessed as the population size required for a 90% probability of persistence for 100 years, was strongly related to generation length. Conversely, when extinction risk was assessed over a fixed number of generations, there was no support for a relationship between risk and years. This finding has ramifications for assessing and reporting extinction risk because it shows that (1) crucial signals for the effective management of threatened species may not be detected when risk is measured on a scale of years alone; (2) correcting for generation length will allow data from a wider range of species to be used as defaults for species with limited data; and (3) generational-scale tests of factors affecting extinction risk are more powerful than year-based ones. We recommend that extinction risk be routinely reported on a generational scale, with results on a year scale added where warranted.
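The abstract refers to estimating, by stochastic projection, the initial population size required for a 90% probability of persistence over 100 years, and to repeating that assessment over a fixed number of generations. The sketch below is not the projection model used in the study: it is a minimal illustration, assuming a deliberately simple stochastic exponential-growth model with hypothetical parameters (mean_r, sd_r, a placeholder generation length of 5 years, a 40-generation horizon, and a bisection search on initial population size), intended only to show how such a threshold could be estimated on a year scale versus a generation scale.

```python
# Illustrative sketch only -- NOT the stochastic projections used in the paper.
# The model form, parameter values and generation length are hypothetical.
import numpy as np

rng = np.random.default_rng(1)

def persistence_prob(n0, years, mean_r=0.05, sd_r=0.3, reps=500, quasi_ext=1):
    """Fraction of replicate trajectories still above the quasi-extinction
    threshold after `years` of stochastic exponential growth."""
    survived = 0
    for _ in range(reps):
        n = float(n0)
        alive = True
        for _ in range(years):
            r = rng.normal(mean_r, sd_r)    # environmental stochasticity
            n = rng.poisson(n * np.exp(r))  # demographic stochasticity
            if n < quasi_ext:
                alive = False
                break
        survived += alive
    return survived / reps

def min_n_for_persistence(horizon_years, target=0.90, n_max=100_000):
    """Smallest initial population size with >= `target` persistence probability
    over `horizon_years`, found by bisection (assumes persistence rises with N0)."""
    lo, hi = 1, n_max
    while lo < hi:
        mid = (lo + hi) // 2
        if persistence_prob(mid, horizon_years) >= target:
            hi = mid
        else:
            lo = mid + 1
    return lo

generation_length = 5  # years; placeholder value, varies widely among vertebrates
print("N0 for 90% persistence over 100 years:     ", min_n_for_persistence(100))
print("N0 for 90% persistence over 40 generations:",
      min_n_for_persistence(40 * generation_length))
```

Because persistence probability is estimated by Monte Carlo, the bisection is only approximately valid here; a published population viability analysis would instead use detailed, species-specific demographic models and far more replicates.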