Lay Summaries Archive

Read Lay Summaries from previous volumes of Functional Ecology here:

Early View Lay Summaries

 

Linkage and trade-off in trophic morphology and behavioral performance of birds.

Clay Corbin, Lauren Lowenberger & Brandan Gray


For a long time we’ve thought that increasing force during muscular action comes at a cost to the speed at which that action is accomplished. Think about it: at the gym, you can lift heavy weights slowly or light weights quickly. Also, we think there is a pretty tight relationship between the shape of anatomy and its function. An example is Darwin’s finches: large conical bills can apply lots of force (slowly) to crush large seeds, while long, shallow bills tend to close quickly. We wondered if this could be seen in a bird community composed of species from diverse historical and ecological backgrounds. So, we collected three sets of measurements on birds from 18 North American species: bite force from birds we caught in mist nets, closing velocity from videos of feeding birds, and anatomical measurements from skull specimens housed at the Carnegie Museum of Natural History.

We ran some regression analyses and found that, indeed, bird species with larger beaks closed them forcefully, while birds with smaller bills closed them quickly. However, when regressing force and velocity directly, we were surprised to find only a weak negative relationship, not the graceful negatively sloping relationship seen in human exercise physiology journals. In our data, force is more tightly tied to overall size of the skull and beak, whereas velocity seems to be a product of shape: specifically, the ratio of in-lever length (distance from the articulation between lower jaw and skull to the attachment site of the jaw-closing musculature) to out-lever length (distance from the articulation to the distal tip of the lower jaw). We think the set of anatomical and physiological characteristics associated with high closing force may be decoupled from the set associated with quick closing velocity.

Once we corrected the force data for size, the expected trade-off was revealed. However, birds like the ones in your backyard, possibly feeding on hard seeds or snatching flies from the air, can’t suddenly correct for being small (or large!) – they’d just go hungry. Likewise, it is possible that their species’ evolutionary histories include corrections for weak or slow bites, but correcting for one doesn’t necessarily come at a cost in the other.
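
As a rough illustration of the lever mechanics and size correction described above (not the authors' analysis, and with invented numbers), the sketch below computes the in-lever to out-lever ratio and expresses bite force relative to skull size as residuals from a log-log regression:

import numpy as np

def mechanical_advantage(in_lever_mm, out_lever_mm):
    # Higher ratios favour forceful jaw closing; lower ratios favour fast closing.
    return in_lever_mm / out_lever_mm

# Hypothetical skull measurements for five species (illustrative values only)
skull_length = np.array([20.0, 25.0, 30.0, 35.0, 40.0])   # mm
bite_force = np.array([1.2, 2.1, 3.5, 5.2, 7.9])           # N

# Size correction: residuals from a log-log regression of force on skull length
slope, intercept = np.polyfit(np.log(skull_length), np.log(bite_force), 1)
relative_force = np.log(bite_force) - (intercept + slope * np.log(skull_length))

print(mechanical_advantage(8.0, 32.0))   # e.g. 0.25
print(relative_force)  # positive values = stronger bite than expected for that size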

Image caption: Image provided by authors.

 

Variation in root morphology of flowering plants is linked to ancestry, but root chemistry is comparable to aboveground tissues.

Oscar J. Valverde-Barrantes, Kurt A. Smemo, and Christopher B. Blackwood


Recent studies have shown that the morphology and chemical composition of fine roots are surprisingly diverse in woody plants. Nevertheless, relatively few studies have attempted to explain the mechanisms behind root trait variation or how those traits are integrated at the entire plant level. For instance, it is expected that root length and tissue nitrogen content should be positively correlated, reflecting a tradeoff between root metabolic activity and surface exposure, as observed in analogous foliar tissues. Moreover, leaf and root traits should be correlated at the entire plant level, guaranteeing a coupling in metabolic activity among organs. Another hypothesis suggests that root morphology evolved from thick and scarcely branched toward thinner and highly branched, reflecting the initial dependency of roots on a symbiotic association with mycorrhizal fungi and subsequent (relative) independence from this association. In this study, we contrasted these two hypotheses by examining the chemical and morphological traits in leaves and fine roots of 34 temperate tree species from three main branches of the flowering plant family tree.

We found a correlation between morphological traits and nitrogen concentration in leaves but not in roots. Unlike leaf traits, the root traits of closely related species were more similar than expected by chance. The oldest angiosperm group (magnoliids) possessed thicker and less branched roots than later, more derived, groups (rosids and asterids). Chemically, lignin levels were higher in rosids than in other groups, suggesting that trait combinations vary independently among plant groups. We found only weak correlations between root and leaf morphological traits, but a positive correlation between root and leaf nitrogen levels and other chemical traits. Our study suggests that the evolutionary forces that have shaped root morphology diverge from the tradeoffs commonly observed in leaves aboveground. However, the correspondence in nitrogen levels suggests some physiological integration. Overall, our study highlights the fact that root trait patterns do not correspond with the typical patterns described for leaves. Instead, we emphasize the need to incorporate evolutionary history as an important factor explaining root traits in woody plants.

Image caption: Representative fine root systems of Halesia tetraptera (Hal tet, asterid), Acer saccharum (Ace sac, rosid) and Magnolia virginiana (Mag vir, magnoliid), showing morphological differences among main angiosperm clades. Picture credits to Peter Blackwood (http://www.blackwoodphoto.com/).

 

Do female anole lizards retain the ability to respond to testosterone?

Christian L. Cox, Amanda F. Hanninen, Aaron M. Reedy, and Robert M. Cox


Although selection often favors dramatic differences between the sexes, the evolution of these differences can be constrained because males and females share a genome. In vertebrates, sex hormones can solve this problem by regulating gene expression and development in ways that are unique to each sex. Testosterone in particular is important in male development and is usually found in higher concentrations in males than in females. Nonetheless, testosterone also circulates and serves important biological functions in females. This raises the question of whether sex differences that are regulated by testosterone tend to evolve not only through the coupling of male development to this sex hormone, but also through reductions in the sensitivity of females to testosterone. By altering testosterone levels in juvenile male and female brown anole lizards, we show that females retain the ability to respond to testosterone for a variety of traits, including body size, skeletal growth, metabolism and energy storage, and colorful social signals. Our findings suggest that hormonally mediated differences between the sexes have evolved primarily by linking the development of these traits to higher levels of testosterone in males, and not by altering the way that females respond to testosterone.

Image caption: Female brown anole lizard, Anolis sagrei. Photo provided by author.

 

Water water everywhere: soil water content influences hatchling reptile characteristics.

Brooke L. Bodensteiner, Timothy S. Mitchell, Jeramie T. Strickland and Fredric J. Janzen


Developing reptile eggs are very sensitive to the environmental conditions experienced in the nest. For most reptiles, weather conditions are the most important factor determining the nest environment. Because current climate models predict increases in global temperature, biologists have carefully investigated the influence of incubation temperature on phenotypes of offspring in both the lab and field. However, these climate models also predict marked increases in extreme precipitation events. Many reptiles lay leathery, flexible-shelled eggs that are permeable to water. Thus not only soil temperature, but also soil moisture, can substantially influence phenotypes of developing offspring.

Nearly all the research on the influence of moisture on reptile eggs has occurred in the laboratory and most of this research suggests that wetter conditions produce larger offspring. These observations have not been examined in the field, where complex interactions between environmental variables exist, and moisture levels fluctuate. In this two-year field experiment, we located painted turtle nests and divided their eggs into two artificially constructed nests. One of these nests received supplemental watering throughout incubation while the other was exposed to only natural rainfall.

In 2012, our field site experienced a drought. Watered nests were cooler than control nests and produced larger hatchlings, which upholds the findings of laboratory studies. In 2013, our field site experienced more typical precipitation patterns. Watered nests did not differ from control nests in temperature, and produced smaller hatchlings. We believe that in 2013 our supplemental watering created a nest environment that was too wet and may have caused detrimental effects on the developing offspring. These differences between years suggest that our supplemental watering had context dependent effects on the nest environment and the hatchling phenotypes.

Our findings illustrate the complex interplay between environmental variables that occurs in the field, which is often unexplored in the laboratory, and confirm the importance of corroborating laboratory work with field studies.

Image caption: Figure 1: A) Hatchling painted turtle emerging from nest. (Photo credit: TSM) B) Experimental clutch of painted turtle eggs from this experiment. (Photo credit: BLB).

 

Bird song properties and auditory sensitivity.

Alejandro Vélez, Megan D Gall, Jianing Fu and Jeffrey R Lucas



Bird song has served as a model system for the study of the evolution of vocal communication. Several studies have uncovered factors that regulate the evolution of species-specific song types. For instance, closely related species have similar anatomical features (such as the size of the bill in songbirds) that can constrain the types of song a bird can produce. Similarly, habitat is an important factor that constrains the use of certain song properties. Birds that live in forests tend to have tonal songs with relatively low frequencies: these are properties that facilitate propagation through the understory. In contrast, birds that live in open areas tend to have songs with relatively higher frequencies because there are fewer constraints on sound propagation in open habitats.

Effective communication should result in a close match between bird song characteristics and the auditory mechanisms that facilitate signal processing, i.e. birds should be good at hearing the kinds of songs they make. We asked whether hearing sensitivity correlates with song and habitat characteristics in nine species of sparrows. We studied three species that inhabit forest areas, three species that live in intermediate, scrub-like areas, and three species that live in open areas. We measured song frequency content and hearing sensitivity over a broad range of frequencies. Based on previous studies, we predicted higher song frequencies in open-habitat species than in forest species. Based on the hypothesis that hearing correlates with song characteristics, we predicted higher sensitivity to high-frequency sounds in open-habitat species than forest species.

Consistent with previous studies, song frequency was highest in species from open habitats and lowest in forest species. Contrary to our predictions, however, song frequency and habitat were not correlated with high-frequency hearing sensitivity. Instead, overall song structure was highly correlated with high-frequency hearing sensitivity. Species with structurally complex songs (like the song sparrow) were more sensitive to high-frequency sounds than species that produce tonal songs (like the white-throated sparrow), or trilled songs (like the dark-eyed junco). These results highlight the importance of considering the different dimensions of vocal signals when we think about the evolution of sensory mechanisms that allow for signal processing.

Image caption: Song sparrow (Melospiza melodia) singing. Photo by Basar, Image from Wikimedia Commons.

 

Review

The ecology of bark thickness: An hypothesis.

Juli G. Pausas


Bark is a vital and very visible part of woody plants, yet the functional and evolutionary ecology of bark is still poorly understood. Here I review one bark property: thickness. Bark thickness is very variable among woody plants and I hypothesize that fire is the most important factor selecting for a thick bark. This is because bark is a very good heat insulator and under low intensity fires, small increases in bark thickness provide a great increase in stem protection and survival. Consequently, at the global scale, most of the variability in bark thickness should be explained by variability in fire regimes. Here I provide evidence supporting the role of fire regime in shaping bark thickness (in conjunction with other plant traits) on a global scale.

Forest environments with very frequent (and low intensity) understory fires select for trees with thick bark at the base of the bole. In some savannas, trees do not have especially thick bark, as they tend to grow quickly to escape the heights affected by grass fires. Savanna trees living in poor soils may not be able to grow quickly, and thus can only survive if the whole plant (including the thin branches) has a very thick bark. In Mediterranean ecosystems, fires are less frequent than in savannas, and there is time for the accumulation of fine woody biomass. Consequently, fires burn intensely and small differences in bark thickness do not increase stem survival; in such conditions, most species have relatively thin bark. In wet tropical forests, tree bark is very thin because fires are very rare and thus a thick bark is not advantageous. In very arid ecosystems, fuels are too sparse for fire to spread, and thus thick bark might not be expected, although some thick bark may occur as a response to water stress.

In conclusion, fire regimes can explain a large proportion of the variability of bark thickness at the global scale, and thus this trait varies across ecosystems in a predictable manner.

Image caption: Corky bark of Myrcia bella (Myrtaceae) from a Brazilian savanna (cerrado).

 

Some don't like it hot: southernmost populations live very close to their thermal limits.

Catarina F. Mota, Aschwin H. Engelen, Ester A. Serrão and Gareth A. Pearson


The intertidal zone comprises the vertical range of the shore exposed during the lowest low tides and covered by the highest high tides. Intertidal seaweeds are marine species occupying this interface between marine and terrestrial habitats. As major photosynthetic primary producers and structural species in intertidal ecosystems, how seaweeds cope with severe stresses like high temperature and desiccation during exposure to air is important for understanding ecosystem functioning as a whole. Growing in canopies and dense stands, seaweeds modify the physical environment for themselves and associated species, providing distinct microhabitats where thermal conditions can be very different from the prevailing air temperature.

During heat stress, cells express characteristic heat-shock (HS) proteins that repair or remove heat-damaged proteins. The temperatures for the onset, maximum and decline of this heat shock response (HSR) define the thermal range that an organism can endure. Thermal stress is expected to approach physiological limits near the southern edges of a species’ distribution. We collected four decades of atmospheric air temperature records, defined HSR parameters in the laboratory, and used sensors to measure the microhabitat thermal environment at local scales in a range-edge population of bladder wrack (Fucus vesiculosus) in southern Portugal that is now locally extinct.

Laboratory experiments measuring HS gene expression and photosynthetic functioning in F. vesiculosus showed the onset of the HSR below 24°C and a maximum response at 28°C, while at 36°C the HSR declined (and seaweeds were physiologically damaged). In the field, a mild HSR was seen even in January, and the response was severe for most of the year. The HSR correlated well with temperatures measured in microhabitats; seaweeds at the edge of the canopy were particularly stressed, while those living under the canopy were somewhat protected from severe HS. Remarkably, although the hottest microhabitat was the canopy surface, seaweeds here showed the lowest HSR, because rapid desiccation leads to a quiescent state that possibly protects them from the damaging effects of heat. Overall, we show that microhabitat temperatures are better predictors of thermal stress than local air temperatures for natural populations, and that these southern populations exist(ed) very close to their thermal physiological limits.

Image caption: Fucus vesiculosus at the base of cordgrass (Spartina maritima) in the Ria Formosa coastal lagoon. Photo courtesy of the authors.

 

Skin turnover increases in amphibians infected with deadly chytrid fungus.

Michel E. B. Ohmer, Rebecca L. Cramp, Craig R. White, & Craig E. Franklin


Worldwide, the chytrid fungus (Batrachochytrium dendrobatidis) is wreaking havoc on amphibian populations through the insidious disease chytridiomycosis. What makes this disease so striking is that it infects a wide range of amphibian species globally, leading to population declines and even extinction, despite infecting only the amphibian’s skin.

In amphibians, the skin is an essential organ that functions in water and ion balance and gas exchange. In order to maintain skin health and integrity, amphibians regularly shed their skin. Called sloughing, this process involves removal of the thin outer layer of skin via a series of limb, side, and mouth movements, after which the shed skin is ingested. Regular sloughing is thought to play an important role in immune defence, by removing skin-associated microbes. Crucially, while sloughing is a significant aspect of amphibian skin physiology, we know very little about its relation to the progression of chytridiomycosis.

In order to understand what makes some species more susceptible to this skin disease than others, we have taken a closer look at the unique physiology of amphibian skin shedding. With the help of infrared cameras to capture this elusive behaviour, we examined the relationship between skin sloughing and chytridiomycosis progression by exposing Australian green tree frogs (Litoria caerulea) to the fungus, and monitoring sloughing rates and infection progression over time. We found that frogs sloughed their skin every four days on a predictable cycle, but as chytrid infection intensity increased, sloughing rate also increased to every three days. Surprisingly, sloughing itself did not reduce the abundance of chytrid fungus on the skin.

Our work demonstrates that an increased sloughing rate in infected frogs does not appear to curb the progression of disease. In fact, sloughing may actually increase the detrimental effects of the fungus in terminally ill frogs by further inhibiting water and salt transport across the skin. By measuring sloughing rates directly for the first time, our results shed light on how chytrid fungus interacts with skin maintenance processes, and indicate that variation in skin sloughing frequency may play a role in the observed variation in susceptibility to disease.

Image caption: An adult Australian green tree frog (Litoria caerulea). Photo: M. E. B. Ohmer.

 

Plant virus infection protects host plants from herbivore attack.

Kerry E. Mauck, Erica Smyers, Consuelo M. De Moraes & Mark C. Mescher


Plant viruses live within plant cells and most are spread to new host plants through feeding and movement of specific insects (vectors). Sometimes, infection has negative effects on the plant, such as reduced size or seed production. But infection can also change aspects of the host plant (its phenotype) that mediate interactions with other antagonists. Virus infection can alter host cues used by foraging insect herbivores (including vectors), such as color, smell, or taste. Some of these cues (color, smell) are also used by beneficial insects to locate herbivore prey. Despite the fact that virus infection is known to alter host phenotype, few studies have examined how virus infection changes plant interactions with non-vector herbivores and their predators.

In our study we used field experiments, behavior trials, and analysis of plant chemical and physical characteristics to understand the effects of a widespread plant virus, Cucumber mosaic virus (CMV), on the interactions of its host plant, squash, with non-vector herbivores and predators. Our previous work showed that CMV infection makes plants more attractive to their aphid vectors based on smell, but diminishes plant quality and palatability (encouraging vectors to disperse and spread the virus after picking it up from leaf cells). Here we found that CMV infection also reduced the likelihood of non-vector herbivores visiting, colonizing, and laying eggs on squash plants. Most notably, a highly damaging specialist herbivore was unable to recognize CMV-infected plants as good sites for egg laying. This reduction in herbivore levels was consistent with our analysis of plant characteristics, which showed that CMV-infected plants have lower levels of sugars and reduced size, making plants less palatable and less visually apparent. In contrast, predators and parasitoids were able to locate plants with prey regardless of whether they were infected or healthy. The combination of lower herbivore visitation with maintenance of predator visitation means that CMV-infected plants may experience reduced herbivore attack relative to healthy plants. This means that CMV-infected plants will be present in the landscape longer, possibly leading to more new infections, and that plants infected with CMV may even out-perform uninfected plants when herbivore populations are very high.

Image caption: CMV-infected (upper) and healthy (lower) Cucurbita pepo plants with insets of squash bugs (top) and aphids (bottom). Photos by K. Mauck.

 

Invasive plants, thermal regimes, and habitat management for snakes.

Evin T. Carter, Bryan C. Eads, Michael J. Ravesi, and Bruce A. Kingsbury


In this new era of “global ecology”, plants and animals from around the world are becoming established where they previously did not exist. In some cases, they have become invasive, potentially altering how ecosystems function. Despite recognition that devastating effects can follow, there remains a shortage of evidence regarding broadly applicable mechanisms.

Here we examine the impacts of exotic invasive plants on a snake, the Northern Copperhead (Agkistrodon contortrix), in a site facing substantial infestation by exotic plants. We hypothesized that the denser growth patterns characteristic of exotic invasive plants lead to reduced and less variable temperature below the canopy. Consequently, the quality of the habitats would be compromised for ectotherms like the copperhead in a temperate forest landscape, where access to sunlight is key to maintaining suitable body temperature and proper metabolic function.

To test our hypothesis, we used physical models of snakes to measure environmental temperatures within native and exotic plant-dominated habitats (including 11 exotic species) in a temperate forest landscape in the Midwestern US. Using temperatures derived from models and temperature preferences of snakes in the laboratory, we generated estimates of the capacity for snakes to achieve preferred body temperatures within that landscape. To further test the effect of vegetation structure and the efficacy of targeted management, we also removed exotic plant foliage from eight 20 m² plots while monitoring use of those areas by reptiles before and after manipulations.
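
One common way to turn such operative-temperature data into a habitat "thermal quality" score is the average deviation of model temperatures from the preferred range. The sketch below is our illustration of that idea with invented numbers; it is not necessarily the exact metric used by the authors:

import numpy as np

def thermal_quality(operative_temps, t_pref_low, t_pref_high):
    # Mean absolute deviation (deg C) of operative temperatures from the preferred range; 0 = ideal.
    temps = np.asarray(operative_temps, dtype=float)
    below = np.clip(t_pref_low - temps, 0, None)
    above = np.clip(temps - t_pref_high, 0, None)
    return float(np.mean(below + above))

native_habitat = [18, 22, 26, 30, 33]   # hypothetical model temperatures (deg C)
exotic_habitat = [17, 18, 19, 20, 21]   # cooler and less variable under dense exotic foliage

print(thermal_quality(native_habitat, 23, 29))  # smaller deviation -> better thermal habitat
print(thermal_quality(exotic_habitat, 23, 29))  # larger deviation -> poorer thermal habitat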

We found that exotic plant-dominated habitats exhibited reduced and less variable temperatures compared to their native counterparts, with mixed-exotic habitats exhibiting the lowest temperatures overall and exotic shrubs (6 species) the lowest as a structural group. Radio-tagged snakes clearly avoided exotic vegetation at multiple scales. Response to exotic foliage removal was also rapid—including use of plots as gestation and birthing sites. Our results suggest a direct effect that is common to a broad range of invasive plants in a variety of ecological contexts. Because eradication of many invasives is unlikely, we suggest that targeted thinning is a cost-effective means of partially alleviating this challenge in compromised landscapes.

Image caption: Introduced Japanese honeysuckle encroaching on copperhead habitat (photo credit to Evin T. Carter).

 

Wolves that are heavily hunted have higher stress and reproductive hormones than wolves with lower hunting pressure.

Heather M. Bryan, Judit E. G. Smits, Lee Koren, Paul C. Paquet, Katherine E. Wynne-Edwards & Marco Musiani


In parts of the world where they remain, wolves play a key role in maintaining healthy ecosystems. Yet wolves are also viewed as competitors with people over shared prey and livestock. Consequently, wolves are often subject to management that includes population reductions of up to 50% per year, with occasional reductions of up to 90% during intensive control. Wolves have a sophisticated social system, typically with one breeding female per pack each year. Ongoing and substantial reductions can disrupt wolf social structure, leading to increased reproductive rates and altered genetic structure and behavior. Despite these profound effects on surviving individuals, little is known about the influences of population reductions on wolf physiology.

Accordingly, we applied the novel approach of measuring hormones in hair to investigate the physiological effects of hunting on wolves subject to different hunting pressures in northern Canada. Stress and reproductive hormones, including cortisol, progesterone, and testosterone, are incorporated into growing hair from the blood vessels that supply the hair follicle and surrounding glands. Our measurements of hair revealed that wolves from a heavily hunted population in the tundra-taiga had higher progesterone compared with lightly hunted wolves in the boreal forest. Elevated progesterone could reflect a higher proportion of breeding females, which might occur if normal pack structure is disrupted by hunting. Similarly, heavily hunted wolves had elevated testosterone and cortisol (a stress hormone), which may reflect social instability caused by hunting.

Ecological differences between the two populations could also affect hormone levels. Therefore, we controlled for habitat by examining another population of wolves from the boreal forest killed during an intensive control program. Lack of sex data precluded examining testosterone and progesterone in this population; however, we found that similarly to tundra-taiga wolves, forest wolves from the control program had elevated cortisol compared with forest wolves that were lightly hunted. Although the long-term effects of chronically elevated stress and reproductive hormones are unknown, there are potential implications for wildlife health, welfare, long-term survival, and behaviour. Therefore, our findings emphasize that conservation and management plans should consider not only numeric responses, but also possible social and physiological effects of control programs on wildlife.

Image caption: Two wolves in Alberta run across a frozen lake away from an approaching helicopter. Photo credited to Paul Paquet.

 

Starling males show females their ability to cope with bacteria.

Magdalena Ruiz-Rodríguez, Gustavo Tomás, David Martín-Gálvez, Cristina Ruiz-Castellano and Juan J. Soler


Spotless starling males have special, ornamental feathers on their throat that are longer and narrower than the rest of their plumage, and also longer and narrower than female throat feathers. During the courtship period, males sing in highly visible places, with the head and bill raised. This position exposes the special throat feathers, which are very conspicuous and moved by the wind. It was previously shown that males with larger throat feathers also have better reproductive success, since they are preferred by females.

We have found that these special feathers differ from the other male and female throat feathers in their susceptibility to degradation by feather-degrading bacteria. The basal part of the ornamental feathers is highly resistant to bacteria, and consequently carried very few feather-degrading bacteria compared with the apical part, which is the most exposed; in the non-ornamental feathers of both sexes, by contrast, feather-degrading bacteria were found equally in the basal and apical parts.

All bird species have a gland located dorsally at the base of the tail that produces a sebaceous secretion with antimicrobial activity. Birds take this secretion with the bill and spread it on their feathers to protect them against microbial pathogens. We found that starlings with larger glands produce more secretion, and that this secretion is also better at inhibiting bacterial growth; in addition, these birds carried a lower bacterial load on their feathers, and their feathers were less degraded. Therefore, by evaluating the degradation status of a male’s ornamental feathers, females can estimate his capacity to fight the bacteria that may damage feathers, which indirectly reflects his ability to maintain proper hygiene and the capacity of his immune system.

Image caption: The picture shows the ornamental feathers of the starling male. Credits: J. J. Soler.

 

Oxidative costs of personality.

Kathryn E. Arnold, Katherine A. Herborn, Aileen Adam and Lucille Alexander


Animals differ in their personalities. Some individuals are bolder, more social, more exploratory and/or more aggressive than others. These ‘personality traits’ are behavioural differences between individuals that are stable within individuals. But why do animals, including humans, show such variation in personality? It appears that different combinations of personality traits can correlate with survival or reproductive success, but scientists do not fully understand how or why. One suggestion is that personality reflects variation in physiology, particularly how the body defends itself against free radicals. Free radicals are unstable molecules that can damage the body and are produced continuously as part of living.

In this project we investigated the relationships between personality and free radical damage in wild blue tits. Birds were caught in a wood in Scotland over the winter and briefly brought into captivity to measure their personality traits. We recorded how good they were at exploring a novel cage. We also determined how bold or shy they were by measuring their responses to unfamiliar objects, namely colourful plastic toys (see photo). We found that blue tits that were both shy and good explorers possessed poor defences against free radical damage. Conversely, birds that were bold and poor explorers had high defences against damage from free radicals. Thus, personality types differed in their defences against free radicals, and it was the combination of an individual’s personality traits that proved important.

But what does this mean in the wild? When food is scarce, a good explorer might benefit from finding new foraging areas and avoiding starvation, even if it suffers some tissue damage because its defence mechanisms are weak. However, when food is abundant, poor explorers with good defences against free radicals are predicted to live longer. Thus, the costs and benefits of having different personality traits will all depend on the environment. In nature there is no ‘perfect’ personality, because the environment is constantly changing.

Image caption: Novel objects presented to blue tits. Photo provided by authors.

 

More for less: sampling strategies of plant functional traits across local environmental gradients.

Carlos P. Carmona, Cristina Rota, Francisco M. Azcárate and Begoña Peco


Approaches based on the traits of plants (e.g. heights or leaf areas) have allowed ecologists to tackle questions regarding the changes in assembly rules of plant communities across environmental gradients or the effects of plants on ecosystem functioning. This knowledge is essential to predict the consequences of different global change scenarios, and to restore degraded ecosystems and the services they provide. One of the most critical steps in these studies is to scale-up from traits measured in individual plants to the community level, a step that is usually accomplished by calculating community indices like the average and diversity of the trait values of the species in the community. Evidently, the reliability of these indices depends on accurate measurements of the traits of the individuals that compose each sampling unit. Unfortunately, it is generally unfeasible to measure the traits of all these individuals; instead, the most common practice is to sample a reduced number of individuals of each species and average the trait values of this sample, assigning that average to all the individuals of the species. There are, however, several alternatives regarding the identity, number and location of the sampled individuals.

We performed a very intensive sampling across an environmental gradient, which allowed us to accurately estimate the 'real' values of the community indices and their changes along the gradient. Subsequently, we simulated and compared several less intensive sampling strategies that have been used in previous studies. We found that the strategy that best estimated the 'real' values was the one in which local individuals of the species (those present in each sampling unit) were used to calculate a different average trait value for each sampling unit and species (LOC). Conversely, strategies that assign a single average trait value to all the sampling units in which a species is present performed much more poorly, regardless of the number of individuals used to calculate the average. Although this may appear to be bad news (more work) for ecologists, we show that the accurate results yielded by the LOC strategy can be attained without increasing the total number of individuals measured across the gradient.
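
For readers unfamiliar with these community indices, the sketch below (illustrative only, with invented trait values and abundances, not the authors' code) contrasts a community-weighted mean trait computed from plot-local trait averages (the LOC strategy) with one computed from a single species-wide average:

import numpy as np

# Hypothetical plot: relative abundances of three species
abundance = {"sp_A": 0.5, "sp_B": 0.3, "sp_C": 0.2}

# Trait (e.g. plant height, cm) measured on individuals growing in this plot (LOC strategy)
local_traits = {"sp_A": [12.0, 14.0], "sp_B": [30.0, 28.0], "sp_C": [50.0]}

# Single average per species pooled across the whole gradient (fixed-average strategy)
global_traits = {"sp_A": 18.0, "sp_B": 24.0, "sp_C": 42.0}

cwm_local = sum(abundance[sp] * np.mean(local_traits[sp]) for sp in abundance)
cwm_global = sum(abundance[sp] * global_traits[sp] for sp in abundance)

print(cwm_local, cwm_global)  # the two strategies can give quite different community indices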

Image caption: Detail of one of the sampling units, sited in a Mediterranean grassland near Madrid (Spain). Photograph by: Cristina Rota.

 

Young trees shape the capacity of soil microbial communities to resist and recover following drought.

David Rivest, Alain Paquette, Bill Shipley, Peter B. Reich and Christian Messier


Soil microbial communities (i.e. bacteria and fungi) are key players in soil functions such as nutrient cycling, plant productivity and climate regulation and are fundamental for the integrity of terrestrial ecosystems following disturbance. Resistant soil communities remain essentially unchanged when subject to disturbances, such as severe drought. Resilient soil communities have a high recovery capacity after disturbance. Soil microbial communities that are resistant or resilient to drought are desirable for sustainable soil use and management, as they tend to maintain soil functions. Drought may be a major threat to tree production and its occurrence and severity are predicted to increase in many regions of the world over the next several decades. Therefore, it is important to understand how soil microorganisms respond during and after severe droughts in tree communities, and the factors explaining this response, such as tree functional traits (i.e., features related to different functions such as growth or nutrition) and soil properties.
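
As a simple, generic illustration of how resistance and resilience could be quantified from a soil variable such as microbial biomass (this is our sketch with invented numbers, not necessarily the index used in the study):

def resistance(control, disturbed):
    # 1 = no change under disturbance (fully resistant); lower values = larger change.
    return 1.0 - abs(disturbed - control) / control

def resilience(control, disturbed, recovered):
    # Fraction of the initial drought impact that has been recovered; 1 = full recovery.
    initial_impact = abs(disturbed - control)
    if initial_impact == 0:
        return 1.0  # nothing to recover from
    remaining_impact = abs(recovered - control)
    return 1.0 - remaining_impact / initial_impact

# Hypothetical microbial biomass (ug C per g soil) before, during and after drought
print(resistance(control=400.0, disturbed=320.0))                      # 0.8
print(resilience(control=400.0, disturbed=320.0, recovered=380.0))     # 0.75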

In this study, we compared soil biochemical properties and microbial response following drought in three different 4-year-old tree monocultures and two two-species combinations that were planted in a high-density tree experiment located in southern Québec (Canada). We selected different North American temperate tree species that have contrasting functional traits. We expected that differences in functional traits between tree species would be reflected in divergent soil biochemical properties, and that these differences, in turn, would drive soil microbial resistance and resilience to drought.

We showed that tree monocultures and species mixtures influenced soil chemistry, soil biological properties and microbial resistance and resilience. Both synergistic and antagonistic effects of tree mixtures on different soil properties were found. Tree monocultures surpassed species mixtures as a key driver of resistance and resilience of soil microbial communities to drought. Interestingly, soil microbial resistance was higher in tamarack than in sugar maple monocultures. Conversely, resilience was higher under sugar maple than under tamarack monocultures. We found evidence that differences in a few leaf litter traits between tree species are sufficient to rapidly alter soil nutrient availability, especially nitrogen, which can in turn have important consequences for soil microbial resistance and resilience to drought.

Image caption: Image courtesy of Alain Paquette.

 

Sleeping single in a double bed: for tropical hibernators sociality is a hurdle rather than advantage.

Kathrin H. Dausmann & Julian Glos


Doesn’t it sound nice to cuddle close to a fellow conspecific when it is cool and unfriendly outside? The Malagasy fat-tailed dwarf lemur (Cheirogaleus medius) does not think so. This small primate is usually a very social creature, living in tight family groups where pairs only separate when one partner dies. However, this changes when winter approaches. Madagascar is a large, tropical island off the east coast of Africa and home to – and famous for – 103 species of lemurs, the highest percentage of indigenous primates anywhere on earth. It might not be the kind of place one would expect to find hibernating mammals, let alone primates. Yet, this is just what the small-bodied dwarf lemurs do. They resort to the same strategy used by marmots, ground squirrels and dormice to survive the cold northern winters – spending at least seven months of the year hibernating through harsh times brought on by seasonal drought during the Malagasy winter. As its name suggests, most of the energy used to fuel life during hibernation in the fat-tailed dwarf lemur is stored in the tail, and it fattens up to almost double its body mass before the hibernation season. Usually, mammals profit from huddling in groups because this reduces heat loss during cool conditions, and thus decreases the need to activate internal heat production, helping them save energy. However, in the fat-tailed dwarf lemur body temperature during hibernation changes passively with external conditions in a reptile-like fashion, instead of being maintained internally. Depending on the insulation of the tree hollow used for hibernation, this can result in a daily fluctuation in body temperature of up to 20 degrees Celsius. In this practice, the dwarf lemurs prefer not to be bothered by their dear family, because in larger groups unwelcome interruptions can make the hibernation state less energy-efficient. Therefore, hibernating alone helps dwarf-lemurs to conserve their precious fat resources and thus enhance their chance of survival, leaving them in better body condition for the next reproductive period after reuniting with their mate at the end of the winter.

Image caption: The Malagasy primate Cheirogaleus medius face-to-face with the reptile Uroplatus henkeli, with which it shares body temperature profiles during hibernation.

 

Grasshopper effect traits reflect their feeding niche and determine their impact on plant community biomass.

Hélène Deraison, Isabelle Badenhausser, Luca Börger & Nicolas Gross


Herbivorous arthropods, such as insects, may play an important role in regulating plant diversity and ecosystem functioning (e.g. nitrogen cycling). But it is unclear which mechanisms drive plant–arthropod interactions and, ultimately, arthropod effects on plant communities. Various herbivore characteristics (what we here call functional traits) have been assumed to determine herbivore impact on plant communities. For instance, herbivore body size has been proposed as a key trait determining the quantity of biomass consumed. But its effect might be modulated by herbivore food preferences (i.e. the herbivore feeding niche). In this case, herbivore chemical traits (e.g. carbon:nitrogen ratio) or biomechanical traits (e.g. mandibular traits; biting strength) have been hypothesised to be related to the herbivore feeding niche. Yet how functionally contrasting herbivores impact plant community biomass in real field conditions, and what the relative importance of different herbivore traits is, had never been experimentally tested.

We set up a cage experiment in a species-rich grassland and tested how grasshopper traits may explain their effect on plant biomass. Six grasshopper species were selected because they show contrasting traits and feeding niches.

Grasshopper impact ranged from 0% up to 60%, depending on the species considered. By comparing the relative importance of multiple interacting grasshopper traits, we found that biting strength was a key trait determining grasshopper feeding niche and impact on plant biomass. Importantly, we demonstrated that only two simple plant traits (C:N ratio and leaf dry matter content) predicted grasshopper feeding niche well. For instance, herbivores with strong mandibles preferentially chose tough leaves, while herbivores with weak mandibles selected the opposite plant attributes.

Our study provides a first experimental test of the relationship between herbivore traits and their niche, which in turn determines their impact on plant community biomass and ultimately on ecosystem functioning. It also contributes to the development of a trait-based approach in a multitrophic perspective and shows that simple traits can predict the intensity of trophic linkages and herbivore effects at the level of the entire plant community.

Image caption: Female grasshopper (Chorthippus biguttulus). Photo provided by authors.

 

To survive against cannibalism: growth and developmental acceleration in pre-feeding salamander hatchlings in the presence of conspecific hatchlings.

Osamu Kishida, Ayumi Tezuka, Akiko Ikeda, Kunio Takatsu & Hirofumi Michimae


In many fish and amphibian species, vast numbers of embryos may hatch at the same time. In such situations, the hatchlings can be exposed to intensive cannibalistic interactions from conspecifics (members of the same species). How do hatchlings spend this vulnerable life stage?

Cannibalism success of the Japanese Ezo or Hokkaido salamander species (Hynobius retardatus) is highly dependent on the balance between the gape width of the cannibal (how wide it can open its mouth) and the head width of its prey, so fast growth in the pre-feeding stage is expected to contribute strongly to the survivorship of the salamander hatchlings in conspecific interactions. In this study, we report experimental evidence showing adaptive acceleration of growth and development in the pre-feeding hatchling stage. Ezo salamander hatchlings reared with conspecifics became larger and developed faster than those reared alone, the time to the start of feeding was shorter, and the burst swimming speed for hatchlings reared with conspecifics was faster.

Our predation trials revealed the advantages of growth and developmental acceleration in cannibalistic interactions. The hatchlings reared with conspecifics were more successful at cannibalizing small hatchlings and were also highly resistant to being cannibalized themselves by large conspecifics, compared to hatchlings reared alone. Because salamander larvae that cannibalize other individuals in their early developmental period exhibit rapid growth and metamorphose early with larger size, growth and developmental accelerations are likely key mechanisms for their life history success.

Image caption: Ventral aspect of 7-day-old Hynobius retardatus salamander hatchlings reared alone (left) and with conspecifics (right). Photo by Osamu Kishida.

 

A cross-seasonal perspective on local adaptation: metabolic plasticity mediates responses to winter in a thermal-generalist moth.

Caroline M. Williams, Wesley D. Chick & Brent J. Sinclair


Across latitudinal and altitudinal gradients, environmental conditions vary strongly. To cope with these changing conditions, populations of organisms may be adapted to their local conditions, allowing them to survive and thrive better in their home environment than would populations from other regions. In temperate regions, this local adaptation must serve the organisms across their whole lifecycle, but characteristics that enhance survival and performance in one season may be detrimental in other seasons. Thus, to understand local adaptation we need to look at survival and performance across seasons, but most studies to date have focused only on the summer growing season. We tested for local adaptation to winter conditions in a common species of moth, Hyphantria cunea, which occurs throughout North America in diverse thermal environments. We collected larvae from the northern edge and centre of their geographic range, exposed them to both northern and central winter conditions in the lab, and monitored their survival and performance throughout the winter and into the next spring. We found that indeed the populations were locally adapted to their winter environment, with higher rates of survival and larger size and carbohydrate reserves when overwintered at their home conditions. This suggests that climate change may disrupt populations of this moth from their optimal conditions, and that populations may suffer if winter and growing season temperatures become decoupled.

Image caption: Image provided by authors.

 

Sex-specific differences in ecomorphological relationships in lizards of the genus Gallotia.

Marta Lopez-Darias, Bieke Vanhooydonck, Raphael Cornette and Anthony Herrel

Males and females often differ from one another in ways that reflect different investment in features relevant to the fitness of each sex. Whereas females typically invest in traits related to producing offspring, males tend to invest more in features related to territory defense or male-male combat. However, how differences in morphology between the sexes affect performance traits that are important in the ecological context of an animal, such as the ability to escape predators or to eat certain food types, remains poorly understood. Here, we test whether head morphology, the ability to bite hard, and diet are similar in male and female lizards (Gallotia) from the Canary Islands. These lizards are known for their sexual dimorphism, suggesting that the relationships between form and function may also differ between the sexes. We collected data on bite force and head morphology and shape for both sexes of all seven known living species on all seven islands of the archipelago. Moreover, we collected diet data for five of the seven species. Our results show that the evolution of head morphology is associated with the evolution of the ability to bite hard in both sexes. However, only in females was the ability to bite hard associated with the evolution of diet, with females with higher bite forces including larger amounts of plant matter in their diet. In males, on the other hand, head morphology and bite force are not related to diet. Moreover, males with high bite forces have a wide snout, suggesting that head shape and bite force may be evolving principally in relation to male-male combat. Our data thus suggest that head morphology and associated functional traits such as biting may evolve differently in males and females.

Image caption: Gallotia on Isoplexis. Photograph by Beneharo Rodríguez (www.gohnic.org).

 

How lizards evolved a fossorial syndrome within the Brazilian Caatingas.

Agustín Camacho, Rodrigo Pavão, Camila Nascimento Moreira, Ana Carolina B.C. Fonseca Pinto, Carlos Navas & Miguel Trefaut Rodrigues


Among the reptile order Squamata (lizards and snakes), the loss of limbs to give a snake-like morphology is likely the most dramatic evolutionary change that has occurred. It is often associated with the acquisition of an underground, burrowing lifestyle, nocturnality and a preference for relatively low temperatures. Nonetheless, how such an interesting evolutionary transition took place remains poorly understood. We examined this process in ten closely related species of gymnophthalmid lizards (spectacled lizards) from the Brazilian Caatinga (desert scrubland), representing one full transition from typical lizard species to burrowing, snake-like ones. Some of the species studied have typical lizard morphology, while others have a burrowing, snake-like morphology. Species of both forms live together in sandy soil regions of the Brazilian Caatingas and burrow to some extent. We used automatic temperature data loggers and X-ray images to study evolutionary relationships between morphology, burrowing performance, exposure to extreme temperatures and the evolution of thermal physiology in these lizards. Our results show that the evolution of a snake-like morphology allows better burrowing performance in the studied species. Improved burrowing performance allows these species to reach thermally safe (cooler) areas and also seems to favour the evolution of lower preferred temperatures. At our study sites, snake-like lizards can not only avoid extreme daytime temperatures at the soil surface, but can also access their preferred temperatures within the sand until late at night. In addition, we found that snake-like lizards active at cooler hours of the day have lower critical thermal limits. Using this evidence, we propose a sequential explanation for the evolution of the snake-like, burrowing syndrome in lizards that can be tested in other lineages.

Image caption: Photo provided by authors.
This article has been accepted for publication and undergone full peer review but has not been through the copyediting, typesetting, pagination and proofreading process, which may lead to differences between this version and the Version of Record. You can find the As Accepted version here.

 

Dependence of diverse consumers on detritus in a tropical rainforest food web as revealed by radiocarbon analysis.

Fujio Hyodo, Takashi Matsumoto, Yoko Takematsu and Takao Itioka Aerial view of a tropical rainforest, Lambir Hills National park, Sarawak, Malaysia. Photo by Fujio Hyodo.


Food webs represent trophic relationships among consumer organisms, i.e. who eats what. They are often classified into two types: plant-based food webs, which start with living plants as the basal resource, and detritus-based food webs, which begin with dead organic matter (detritus). Although the two types have usually been studied separately, recent work suggests that the coupling of the two food webs by generalist predators plays an important role in the functioning and stability of terrestrial ecosystems. For example, an increased input of detritus could increase the abundance of generalist predators, which in turn could help control herbivory. Despite the importance of these energy and material flows from belowground, it remains unclear how commonly generalist predators depend on detritivores, particularly in terrestrial ecosystems.

We estimated the ‘diet ages’ of diverse consumers in a tropical rainforest by measuring their radiocarbon concentrations. ‘Diet age’ is the lag time between primary production and its utilisation by consumers. Atmospheric radiocarbon rose sharply during the nuclear bomb testing of the Cold War and has been declining since the early 1960s through mixing with the ocean and biosphere, so the known atmospheric record can be used to estimate diet age. Our results show that herbivores, such as butterflies and bees, had diet ages of 0–1 year, whereas detritivores, such as termites, had older diet ages of 6 to more than 50 years. Generalist predators, such as army ants and treeshrews, had intermediate ages of 2–8 years. Given the known feeding habits of generalist predators, these intermediate ages indicate that generalist predators couple the energy and material flows of plant-based and detritus-based food webs. Further, our results reveal the time frame over which energy and material flows occur through a tropical rainforest food web. Knowing this time frame should be helpful for the conservation and management of ecosystems.
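To make the bomb-radiocarbon logic concrete, the sketch below shows one way such a calculation can work: a consumer's radiocarbon (Δ14C) value is matched against the declining post-1960s atmospheric record, and the diet age is the gap between the sampling year and the year the carbon was originally fixed by plants. The atmospheric values here are rough illustrative numbers only, not the calibration record used in the study.

```python
import numpy as np

# Illustrative post-bomb atmospheric Delta-14C values (per mil).
# Rough numbers for demonstration only, not the study's calibration curve.
years = np.array([1965, 1975, 1985, 1995, 2005, 2010])
atm_d14c = np.array([700.0, 350.0, 200.0, 110.0, 65.0, 45.0])

def diet_age(sample_d14c, sampling_year):
    """Estimate diet age: sampling year minus the year the carbon was fixed.

    Works on the declining limb of the bomb curve, where atmospheric
    Delta-14C maps one-to-one onto calendar year.
    """
    # np.interp needs increasing x, so reverse both arrays
    # (atmospheric values decline through time).
    fixation_year = np.interp(sample_d14c, atm_d14c[::-1], years[::-1])
    return sampling_year - fixation_year

# A consumer sample with elevated Delta-14C implies old dietary carbon.
print(round(diet_age(sample_d14c=180.0, sampling_year=2010), 1))
```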

Image caption: Aerial view of a tropical rainforest, Lambir Hills National park, Sarawak, Malaysia. Photo by Fujio Hyodo.
This article has been accepted for publication and undergone full peer review but has not been through the copyediting, typesetting, pagination and proofreading process, which may lead to differences between this version and the Version of Record. You can find the As Accepted version here.

 

Inbred Host Plants Promote Enhanced Insect Growth and Flight Capacity.

Scott L. Portman, Rupesh R. Kariyat, Michelle A. Johnston, Andrew G. Stephenson & James H. Marden Manduca adult.


Insects use flight to evade predators, locate mates, and colonize new habitat; thus, improved flight capability has the potential to increase an adult insect’s survival, reproductive success, and geographic distribution. Plant tissues consumed by larvae (caterpillars) provide the nutrients that adult insects ultimately need to develop their flight muscles. Most studies investigating the influence of host plants on insect herbivores only look at effects on caterpillars. However, this approach overlooks nutritional effects on the adults and the important contributions the adults make to the size and distribution of the insect’s population. Here we examine how differences in the quality of horsenettle (Solanum carolinense) host plants affect flight muscle development and flight muscle function in one of its natural herbivores, the tobacco hornworm moth (Manduca sexta).

We used inbreeding as a mechanism to produce variation in host plant quality. Inbreeding in horsenettle is known to reduce the plant’s ability to defend itself against herbivores and pathogens. Under both field and laboratory conditions, tobacco hornworm caterpillars prefer to feed on inbred plants rather than outbred plants, suggesting fitness advantages from eating weakly defended inbred plants as opposed to better-defended outbred plants. We found that caterpillars that ate inbred plants grew faster and developed into larger pupae (chrysalises) than caterpillars that ate outbred plants. Growth differences in the caterpillars also carried over to the adult stage (moth) of the insect. In free-flight tests, moths that had fed on inbred plants as caterpillars exhibited improved flight muscle metabolic function. Moreover, we found molecular evidence that higher muscle metabolic output was correlated with changes to the amino acid composition of a key regulatory protein in their flight muscles.

Our results show that host plant inbreeding can create effects that cascade through larval and pupal development to affect flight muscle function of the adult stage. Hence, host plant inbreeding can influence important life history traits of insect herbivores, such as mating success, survival, and dispersal. Broadly, our findings reveal that changes to the genetics of a population at one trophic level can affect the development and physiology of an animal at a higher trophic level.

Image caption: Manduca adult.
This article has been accepted for publication and undergone full peer review but has not been through the copyediting, typesetting, pagination and proofreading process, which may lead to differences between this version and the Version of Record. You can find the As Accepted version here.

 

Age-related deterioration in duckweed.

Patrick M. Barks & Robert A. LairdA deceased frond of duckweed and her last-produced offspring. Photograph by Patrick Barks.


As they grow old, many organisms experience progressive bodily deterioration resulting in declining rates of survival and reproduction – a phenomenon known as ageing or senescence. From an evolutionary perspective, ageing seems inherently detrimental to fitness and yet it occurs in most species across the tree of life. Thus, ageing has long been considered something of an evolutionary paradox – it is maladaptive and yet still common.

Modern evolutionary theories of ageing have addressed this apparent paradox but still fall short of explaining the wide variation in rates and patterns of ageing that exists in nature. One potential shortcoming of modern theories of ageing is that they implicitly assume ageing can only manifest through declining rates of survival and reproduction, but not through age-related declines in the fitness of one’s offspring. If age-related declines in offspring fitness do occur in nature, then our theories of ageing may need to be updated accordingly.

Previous research suggests parental-age-related declines in various offspring traits occur in many organisms, from ladybugs to aquatic plants to humans. For example, in the aquatic plant duckweed, older parents produce smaller offspring with shorter lifespans than younger parents. Size and lifespan, however, are poor measures of fitness, and so for most species, we simply do not know whether offspring fitness declines with increasing parental age.

To resolve this issue, we measured age-related changes in three important demographic rates (survival, reproduction, and offspring fitness) in common duckweed, a small aquatic plant. We isolated hundreds of plants individually in Petri dishes filled with a liquid growth medium, and observed them daily for survival and reproduction. The offspring of a subset of these plants were transferred to their own Petri dishes so that we could measure their fitness (the rate of increase in their descendants) and relate that back to the age of their parents.
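The summary does not spell out how "the rate of increase in their descendants" was calculated, but one standard way to turn an offspring's survival and reproduction schedule into a single fitness value is to solve the Euler-Lotka equation for the intrinsic rate of increase. The sketch below illustrates that general approach with made-up numbers; it is not the authors' exact method.

```python
import math

# Hypothetical daily schedule for one offspring line: l[x] is the probability
# of surviving to day x, m[x] is the number of daughter fronds produced on day x.
l = [1.0, 1.0, 0.95, 0.9, 0.8, 0.6, 0.3]
m = [0.0, 0.0, 0.5, 1.0, 1.0, 0.5, 0.2]

def intrinsic_rate(l, m, lo=-1.0, hi=1.0):
    """Solve the Euler-Lotka equation  sum_x l_x * m_x * exp(-r * x) = 1
    for r by bisection, assuming a root lies between lo and hi."""
    def f(r):
        return sum(lx * mx * math.exp(-r * x)
                   for x, (lx, mx) in enumerate(zip(l, m))) - 1.0
    for _ in range(100):
        mid = (lo + hi) / 2.0
        if f(mid) > 0:   # f decreases with r, so the root lies above mid
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

print(round(intrinsic_rate(l, m), 3))  # larger r = higher offspring fitness
```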

We observed strong age-related declines in survival, reproduction, and importantly, offspring fitness. Thus, we suggest evolutionary theories of ageing should be updated to consider the effect of declining offspring fitness. These updated theories may help us better understand the variation in patterns of ageing observed in nature.

Image caption: A deceased frond of duckweed and her last-produced offspring. Photograph by Patrick Barks.
This paper can be found online in its As Accepted form (not typeset or proofed) here.

 

Is tropical montane forest heterogeneity promoted by a resource-driven feedback cycle?

Florian A. Werner and Jürgen HomeierContrasting forest types at the study site in the Andes of Ecuador: stunted, open ridge-crest forest (top) and tall lower slope forest near creek (bottom). These two adjacent forest types share only few tree species. Photos: Florian Werner.


Separated by only a few dozen metres, the ridge crests of tropical mountains often differ strikingly from neighbouring valleys in the structure and species composition of their forests. These contrasts are not well understood despite their importance for the maintenance of biodiversity and the provision of ecosystem services such as carbon storage.

We studied tree biomass and productivity (tree growth, production of leaves), the quality of fresh leaves and leaf litter (nutrient concentrations and phenolics, an important group of chemicals produced by plants to deter plant-feeding animals), levels of leaf herbivory (% leaf area lost to animal feeding) and the decomposition of leaf litter (freshly fallen leaves) in upper (near ridge crests) and lower (near creeks) slope positions in a montane forest in Ecuador.

We found that forest canopy height, production of wood and foliage, quality of fresh leaves and leaf litter, and leaf losses due to herbivory, were significantly lower on upper slopes. Likewise, soil nutrients were lower on upper slopes, where we found decaying leaf litter accumulated in thick humus layers instead of decomposing readily as on lower slopes. As shown by a decomposition experiment, leaf litter from upper slope forest decomposed more slowly than litter from lower slope forest no matter which of the forest types it was placed in.

Our results suggest that the differences we observed between slope positions ultimately result from a pronounced scarcity of plant nutrients in upper slope forest, which is likely to arise from nutrient losses through down-slope fluxes. The size of the contrast between these vegetation types, however, suggests that nutrient poverty near ridges is exacerbated by a positive (self-reinforcing) feedback cycle in which nutrient-poor soils favour plants that produce leaves with low nutritional value and high concentrations of phenolics to deter leaf-eating animals, since the nutrients lost in eaten leaves are very difficult to replace. Because these leaf characteristics also deter the organisms that decompose leaf litter, nutrients remain locked in accumulating humus instead of being liberated by decomposition and made available to plants again. Consequently, the nutrients available to plants decline even further, favouring plants that produce foliage that is ever more difficult to decompose.

Image caption: Contrasting forest types at the study site in the Andes of Ecuador: stunted, open ridge-crest forest (top) and tall lower slope forest near creek (bottom). These two adjacent forest types share only few tree species. Photos: Florian Werner.
This article has been accepted for publication and undergone full peer review but has not been through the copyediting, typesetting, pagination and proofreading process, which may lead to differences between this version and the Version of Record. You can find the As Accepted version here.

 

Survival of the fattest? Not in the brown anole.

Robert M. Cox and Ryan CalsbeekA female brown anole, Anolis sagrei. Illustration by Amy Y. Zhang.


Darwinian natural selection is often described as the “survival of the fittest”. However, determining which individuals are actually the fittest can be challenging, so biologists often use proxies in place of fitness. One popular proxy is body condition: the mass of an animal relative to its size or length. “Fatter” animals exhibiting higher body condition are assumed to be in a better energetic state, which is predicted to improve their chances of survival and reproduction. But is “fatter” really “fitter” in nature? By analyzing a decade of survival records for over 4600 individual brown anole lizards across seven populations in The Bahamas, we show that fatter is not fitter, at least when it comes to survival. Nor does natural selection tend to favor animals of intermediate condition, as might be expected if both skinny and obese lizards struggle to survive. Instead, natural selection favors large body size, at least in males. In fact, the only time that “fatter is fitter” seems to hold true is for the largest males in the population, who experience an extra boost in their probability of survival if they are also in high condition.
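The summary defines body condition only loosely (mass relative to size or length). A common way to operationalise it, sketched below with hypothetical numbers, is to take residuals from a regression of log body mass on log body length and then ask whether those residuals predict survival. This illustrates the general approach, not the authors' exact analysis.

```python
import numpy as np

# Hypothetical measurements: snout-vent length (mm), body mass (g),
# and whether each lizard survived to the next census (1 = yes, 0 = no).
svl = np.array([45.0, 48.0, 52.0, 55.0, 58.0, 60.0, 63.0, 65.0])
mass = np.array([2.1, 2.6, 3.3, 3.6, 4.4, 4.6, 5.5, 5.6])
survived = np.array([0, 1, 0, 1, 1, 0, 1, 1])

# Body condition = residual from a log-log regression of mass on length:
# positive residuals mean "fatter than expected" for a given size.
slope, intercept = np.polyfit(np.log(svl), np.log(mass), 1)
condition = np.log(mass) - (intercept + slope * np.log(svl))

# Simple check: do survivors have higher mean condition than non-survivors?
print(condition[survived == 1].mean() - condition[survived == 0].mean())
```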

Image caption: A female brown anole, Anolis sagrei. Illustration by Amy Y. Zhang.
This article has been accepted for publication and undergone full peer review but has not been through the copyediting, typesetting, pagination and proofreading process, which may lead to differences between this version and the Version of Record. You can find the As Accepted version here.

 

A place to hide: how nooks and crannies help species survive in new environments.

Daniel Barrios-O’Neill , Jaimie T. A. Dick , Mark C. Emmerson , Anthony Ricciardi and Hugh J. MacIsaacPhoto credit: Daniel Barrios-O'Neill .


These days, we humans find ourselves at the top of the food chain more often than not. Still, it’s interesting to reflect on what the world is like for the vast majority of the smaller inhabitants of the planet. For all but a few, danger abounds, and avoiding being eaten is a regular feature on the daily ‘to do’ list.

Ecologists have long observed that the structural complexity of the places animals inhabit — trees, rocks, reefs and almost any physical structure — is fundamentally important for the long-term survival of small creatures, especially those attempting to avoid hungry predators. Although the reasons for this seem simple enough, the situation is often complicated, because while some aspects of structure can serve as obvious protection for prey, others can cause problems. For instance, while a single tree might provide camouflage and spaces too small for predators to access, it could also limit options for escape. And whilst the surface of a tree may appear smooth to a chimpanzee, to an ant it is a veritable maze of ravines.

In this study we approached the issue by focusing on a single component of structure, the availability of spaces too small for predators to access — the nooks and crannies. Our aim was to understand how small changes in the availability of nooks and crannies influence the survival of prey. We used a successful invader of rivers and lakes in the British Isles, the Ponto-Caspian shrimp Chelicorophium curvispinum, as prey and two larger shrimp species as predators. Our results demonstrate that very small increases in available nooks and crannies can substantially increase the survival of the prey, and that the most telling positive effects on survival occur when prey are few in number. Increased survival at low numbers may allow prey to avoid localised extinction, and to colonise new areas.

These findings not only help us to understand how environmental architecture mediates the spread of invasive species, but also why the underside of that rock in your garden is crawling with creatures.

Image caption: Photo credit: Daniel Barrios-O'Neill.
This article has been accepted for publication and undergone full peer review but has not been through the copyediting, typesetting, pagination and proofreading process, which may lead to differences between this version and the Version of Record. You can find the As Accepted version here.

 

Basal Metabolic Rate in Tropical Birds: The Influence of Latitude and Altitude on the "Pace of Life".

Gustavo A. Londoño, Mark A. Chappell, María del Rosario Castañed, Jill E. Jankowski, and Scott K. Robinson Overview of the Manu Gradient, photo taken from the field site at 3000 m elevation with the lowland site in the background.

For decades scientists have acknowledged latitudinal and altitudinal differences in bird life history (number and size of eggs laid, nesting behaviors, and lifespan). For example, tropical birds are thought to have a slower ‘pace of life’ – they lay fewer eggs (usually two) and generally live longer than temperate birds. These differences could result from the large variation observed among latitudes and altitudes in abiotic (e.g., temperature, rainfall) and biotic (e.g., nest predation, competition) conditions. Higher latitudes and altitudes tend to be colder with more variable temperatures, whereas low latitudes and altitudes have warmer and more stable temperatures. Biotic pressures, such as nest predation, competition and parasites, tend to be higher at low latitudes and altitudes.

Our study provides data on a basic life history trait, Basal Metabolic Rate (BMR), loosely defined as the energy expended by an animal at rest, for ~250 Peruvian bird species along a 2600-m tropical altitude gradient. We also compare our BMR data from birds in Peru with BMR data collected from more than 500 bird species in other studies across temperate and tropical latitudes. We use this dataset to ask the following questions. Do substantial differences in native altitude—and hence environmental temperature—influence the BMR of tropical forest birds? Is the low BMR found in other lowland tropical birds also characteristic of the geographically distant tropical birds in Peru? Do tropical birds have lower BMR than temperate birds? Does BMR differ among different groups of bird species (songbirds and others)?

We found that BMR does not vary among tropical altitudes or regions, but it does differ between tropical and temperate latitudes and among avian orders, such that birds breeding in temperate regions, and songbirds, have higher BMR. Our study confirms previous reports of differences in BMR between temperate and tropical bird species, consistent with the concept of tropical birds having a slower ‘pace of life’. We found no effect of environmental temperature on BMR in Peruvian forest birds across a 2600-m altitude transect (a 12 °C temperature change), reinforcing the view that low BMR in tropical birds is mainly driven by a slow life history.

Image caption: Overview of the Manu Gradient, photo taken from the field site at 3000 m elevation with the lowland site in the background.
This article has been accepted for publication and undergone full peer review but has not been through the copyediting, typesetting, pagination and proofreading process, which may lead to differences between this version and the Version of Record. You can find the As Accepted version here.

 

When it comes to water, snake foetuses have priority over mom.

Andréaz Dupoué, François Brischoux, Frédéric Angelier, Dale F. DeNardo, Christian D. Wright and Olivier LourdaisUltrasonographic picture of a developing embryo (Vipera aspis).

Conflicts between a parent and offspring are not unique to humans and may occur in nature whenever there is a limited resource that the two must share. Typically, energy has been the focus of such conflicts, but water is another vital resource that has yet to be considered in the framework of parent-offspring trade-offs or conflicts. In many dry environments, water can be quite limited during certain seasons, and such times often coincide with pregnancy. However, the physiological demands of pregnancy mean that females require more water during this period than when they are not pregnant. Since female snakes supply energy to their developing offspring in the form of yolk that is allocated prior to fertilization, using snakes as study organisms enabled us to examine possible intergenerational water conflicts independently of energy conflicts.

We explored the trade-off over water resources between a mother and her developing embryos in a live-bearing snake, the aspic viper. We manipulated water availability (control vs. water-deprived for 20 days) to pregnant and non-reproductive female snakes. Snakes can tolerate considerable levels of dehydration and thus our treatment was ecologically relevant and non-threatening to the general health of the snakes. We examined the effects of water deprivation on female water balance, water transfer to the embryos, and reproductive performance.

Water deprivation resulted in significant female dehydration, with more pronounced effects in pregnant than in non-reproductive females. The impacts of water deprivation on water balance were correlated with the number of offspring, the most fecund females being the most dehydrated. In contrast, water deprivation had no effect on water transfer to the offspring or on reproductive performance. Our results demonstrate that, when water is unavailable, female water balance is compromised in favour of the developing embryos, highlighting a significant trade-off over water resources between a mother and her offspring. Whether the prioritization of the offspring results from a “generous” mother preferentially allocating resources to her offspring or from the offspring “selfishly” taking water from the mother remains unknown. Regardless, this work demonstrates that parent-offspring conflict over water may be a substantial hurdle during the evolution of the live-bearing reproductive mode.

Image caption: Ultrasonographic picture of a developing embryo (Vipera aspis).
This article has been accepted for publication and undergone full peer review but has not been through the copyediting, typesetting, pagination and proofreading process, which may lead to differences between this version and the Version of Record. You can find the As Accepted version here.

 

Implications of lifespan variation within a leaf cohort for evaluation of the optimal timing of leaf shedding.

Noriyuki Osada, Shimpei Oikawa, and Kaoru Kitajima Photo provided by authors.

When should a piece of equipment that is increasingly worn out and less efficient be replaced? The answer should depend on how quickly it is aging. Plants face a similar question with their leaves. Plants invest resources to produce leaves in order to obtain benefits in terms of photosynthetic production. The newest fully expanded leaf is the most productive, with minimal wear and tear, and it is usually produced at the sunniest part of the plant crown. However, as leaves get older they become more shaded and less efficient in their photosynthetic productivity. When should an aging leaf be replaced? For such optimization of leaf life span (LLS), how photosynthesis declines as leaves age is a critical parameter. Many researchers have attempted to quantify this parameter, often using space-for-time substitution: comparing young and old leaves at a single point in time. Unfortunately, this approach significantly underestimates the age-related decline rate of photosynthesis, as we demonstrate with a simple simulation in our paper. Our simulation also predicts that the degree of underestimation is greater for species whose leaf cohorts vary widely in LLS. This prediction is supported by an analysis of published data showing that photosynthetic capacity at the cohort mean LLS was positively correlated with the variation in LLS. This strongly suggests that the age-related decline of carbon gain may have been underestimated in many previous studies that neglect within-cohort variation in LLS.
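The survivorship-bias argument can be illustrated with a minimal simulation (a sketch of the logic only, not the authors' model): each leaf's photosynthesis is assumed to decline linearly to zero over its own lifespan, lifespans vary within the cohort, and a cross-sectional ("space-for-time") fit through the surviving leaves recovers a shallower decline than the true average within-leaf slope, because old age classes contain only long-lived leaves.

```python
import numpy as np

rng = np.random.default_rng(0)

# Cohort of leaves whose lifespans (days) vary around a mean of 100.
lifespans = rng.normal(100, 30, size=5000).clip(min=20)

# Assumed ageing rule: photosynthesis falls linearly from 1 at age 0 to 0 at
# death, so the true within-leaf decline rate is -1/lifespan.
true_mean_slope = np.mean(-1.0 / lifespans)

# Space-for-time substitution: sample surviving leaves at a range of ages and
# regress photosynthesis on age. Older age classes contain only long-lived
# leaves, which flattens the apparent decline.
ages, photo = [], []
for age in range(0, 200, 10):
    alive = lifespans > age
    if alive.any():
        ages.extend([age] * alive.sum())
        photo.extend(1.0 - age / lifespans[alive])
apparent_slope = np.polyfit(ages, photo, 1)[0]

print(f"true mean decline rate:     {true_mean_slope:.4f}")
print(f"apparent (cross-sectional): {apparent_slope:.4f}")
```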

Image caption: Photo provided by authors.
This article has been accepted for publication and undergone full peer review but has not been through the copyediting, typesetting, pagination and proofreading process, which may lead to differences between this version and the Version of Record. You can find the As Accepted version here.

 

Impacts of repeated stress events on an overwintering insect.

Katie E. Marshall and Brent J. SinclairLate-instar Choristoneura fumiferana caterpillar.  Photo courtesy of Jerald E. Dewey, USDA Forest Service, United States, downloaded from Wikimedia Commons.

Organisms live in complex worlds where environmental stresses can be more or less intense, occur for longer or shorter periods, and repeat more or less frequently. And although we know a lot about the effects of a single stress event, the effects of these more complex patterns of stress are not well understood.

We tested the impacts of each of these different aspects of stress simultaneously in the eastern spruce budworm Choristoneura fumiferana, to identify which types of stress have the largest effects on physiology and long-term fitness. We subjected overwintering spruce budworm larvae to four different low temperatures either once or multiple times, with three different periods of time between each exposure. We then measured short-term effects on physiology by looking at changes in freezing point and in stores of carbohydrates and cryoprotectants (chemicals that protect the larva’s tissues from freezing damage). We also allowed a subset to leave diapause (winter dormancy) and feed, then recorded survival, development time, and adult body size.

We found that in all cases the greatest impacts were due to the number of exposures and the time of year the cold exposure occurred. While there were long-term effects of cold exposure during overwintering on survival, there was no impact on adult body size or development time. This suggests that current understanding of the way stress impacts individual organisms may be missing the effects of stress variability.

Image caption: Late-instar Choristoneura fumiferana caterpillar. Photo courtesy of Jerald E. Dewey, USDA Forest Service, United States, downloaded from Wikimedia Commons.
This article has been accepted for publication and undergone full peer review but has not been through the copyediting, typesetting, pagination and proofreading process, which may lead to differences between this version and the Version of Record. You can find the As Accepted version here.

 

Bolder lizards discard their tails to compensate for risky behavior when food is abundant.

Chi-Yun Kuo, Duncan J. Irschick and Simon P. LailvauxA male brown anole lizard displaying its colorful dewlap. Photograph by Duncan J. Irschick.

Traits that confer benefits often have costs. A fascinating phenomenon in evolutionary ecology is that individuals can offset the costs of a beneficial trait with the function of another trait (trait compensation). The most common examples of trait compensation involve behavioral and morphological defense mechanisms, in which individuals that are morphologically more vulnerable tend to show a higher degree of predator avoidance, and vice versa. Despite numerous reports of compensation between defensive traits, two fundamental questions have not been fully addressed: first, whether trait compensation exists among similarly aged individuals; and second, whether and how the relationship between compensatory traits is influenced by the amount of food available in the environment. Using juvenile brown anole lizards, we examined the relationship between boldness (a willingness to take risks) and the propensity for tail autotomy, a costly defense in which a lizard voluntarily discards its tail (which gradually regrows over time), under low and high food availability. We expected bolder lizards to compensate for their higher risk-taking tendency by discarding their tails more readily. We also expected the relationship between the two traits to differ with food availability, although the exact effect might be complex and difficult to predict a priori. Although lizards raised under low and high food availability did not, on average, differ in boldness or in the propensity for tail autotomy, bolder lizards overall did discard their tails more readily. However, this compensatory effect was present only among individuals raised with abundant food, which suggests that trait compensation is a viable strategy only when lizards can obtain enough food to quickly regrow the lost tail. Our results show that trait compensation exists among similarly aged individuals and can serve as a basis for compensatory effects observed at the population, species or higher levels. In addition, we demonstrate that food availability can influence the dynamics between compensatory traits without significantly changing the mean values of the traits themselves.

Image caption: A male brown anole lizard displaying its colorful dewlap. Photograph by Duncan J. Irschick.
This article has been accepted for publication and undergone full peer review but has not been through the copyediting, typesetting, pagination and proofreading process, which may lead to differences between this version and the Version of Record. You can find the As Accepted version here.

 

Oxidative stress in breeding northern elephant seals.

Jeffrey Sharick, Jose Vazquez-Medina, Rudy Ortiz and Daniel Crocker Photo provided by authors.


Life history theory suggests that the timing and amount of effort devoted to reproduction over an animal’s lifespan are shaped by natural selection to maximize lifetime reproduction. Central to this theory is the assumption that current reproduction reduces the ability of an organism to reproduce in the future, including negative impacts on survival. This cost may involve the allocation of resources to reproduction that reduces the energy available for maintenance and health. In some species, called ‘capital breeders’, these trade-offs are distinct because all of the energy used for breeding comes from body reserves stored prior to reproducing. Metabolism produces a variety of reactive oxygen species that can damage biomolecules, and this ‘oxidative stress’ has been theorized to be a potential source of the survival costs associated with reproduction. Since both high rates of metabolism and fasting are associated with oxidative stress, capital breeders might be especially subject to these costs.

We examined the potential for oxidative stress in breeding northern elephant seals. Since elephant seals forage in the ocean and breed on land, they are capital breeders. Males fast from food for over 100 days while maintaining high rates of metabolism for fighting and mating. Females fast for a month while producing one of the most nutrient-rich milks found in nature. We compared markers of oxidative stress in males and females at the beginning and end of breeding. In blood samples we measured a pro-oxidant enzyme that generates reactive oxygen species, three important antioxidant enzymes, markers of oxidative damage to fats, protein and DNA, and a marker of inflammation. We found that the pro-oxidant enzyme increased across breeding and that the seals responded by increasing their levels of antioxidant enzymes. Despite this protective response, males showed evidence of oxidative damage to fats and DNA and showed increased levels of inflammation. Females showed evidence of oxidative damage to proteins. Our results provide evidence for oxidative stress as a cost of breeding in polygynous male elephant seals, with weaker evidence in females. These data support the idea that oxidative stress may underlie the survival costs of reproduction in some species.

Image caption: Photo provided by authors.
This article has been accepted for publication and undergone full peer review but has not been through the copyediting, typesetting, pagination and proofreading process, which may lead to differences between this version and the Version of Record. You can find the As Accepted version here.

 

Swimming loss and recovery in the Atlantic silverside.

Kestrel Perez & Stephan MunchAdult Atlantic silversides collected in Little Neck Bay, Long Island, NY, USA. Photo credited to Kestrel O. Perez.


For many animal species, large body size leads to higher survival and reproductive success. Because fast growth enables an individual to reach a large body size sooner, it stands to reason that fast growth should be similarly beneficial. Why, then, do growth rates still vary so widely in many species? In fact, most species do not grow as fast as physiologically possible, suggesting that there must be some downsides to rapid growth. Across a wide range of animal species, many negative effects of fast growth have been documented. One example is the Atlantic silverside, a common marine fish, in which individuals that grow rapidly have poorer swimming ability than slow-growing individuals of the same size. How long this effect lasts, however, was unknown. In this study, we determined how long it takes for a fish to recover its swimming ability following a period of fast growth. We manipulated growth rate in Atlantic silversides by providing variable amounts of food: unlimited food for the fast-growth treatment and limited food for the slow-growth treatment. After two weeks on these rations, we fed both treatments limited rations to maintain slow growth, and monitored swimming ability over this period. Full recovery from the effects of earlier fast growth would be indicated if the swimming ability of fish that had previously grown fast, but were now growing slowly, improved to match that of fish that had always grown slowly. Interestingly, we found that fish that had grown rapidly early in life not only had significantly poorer swimming ability but continued to show the effects of this early period of rapid growth. Fish fully recovered normal swimming ability only after one month of growing slowly. Most surprisingly, swimming ability actually decreased before it improved.

Image caption: Adult Atlantic silversides collected in Little Neck Bay, Long Island, NY, USA. Photo credited to Kestrel O. Perez.
This article has been accepted for publication and undergone full peer review but has not been through the copyediting, typesetting, pagination and proofreading process, which may lead to differences between this version and the Version of Record. You can find the As Accepted version here.

 

Community phylogenetics and ecosystem functioning

Clarifying the discussion of how environmental variation shapes community diversity.

Nathan J. B. Kraft, Peter B. Adler, Oscar Godoy, Emily James, Steve Fuller & Jonathan M. LevineIn the Central Valley of California, USA, small depressions in the ground can fill with rainwater in the winter, creating a vernal pool habitat. The dominant plant species (shown here in bloom) that are found in these pools can tolerate immersion in the rainwaters, while the flooded conditions exclude many of the surrounding grassland species. This process of environmental filtering, seen here at a small scale, shapes patterns of biodiversity across the planet. Credit: Nathan Kraft.


Variation in environmental factors such as temperature, rainfall and soil chemistry has profound effects on the distribution of biodiversity across the planet. Community ecologists often use the concept of "environmental filtering" to describe situations in which a species is unable to survive at a site because of the environmental conditions. However, the evidence that ecologists have used to test for environmental filtering is often indirect, as it is not always sufficient to rule out other potential causes, such as competition with other species. This uncertainty is particularly problematic if we try to use the results of these studies to predict how global change will impact species and communities.

In this study we describe a conceptual framework to help distinguish environmental filtering from other sources of variation in community structure. Strong evidence for environmental filtering comes from showing that species have the potential to arrive at a site (as a dispersed seed or as a migrant from a nearby area, for example) but are unable to survive in the environmental conditions found there. We reviewed the ecological literature to assess how environmental filtering is typically tested and found that, despite the widespread use of the concept, only 15% of published studies included this direct evidence. Most studies instead rely on patterns such as changes in species abundance or in physiological characteristics across an environmental gradient, patterns that can also be driven by other factors.

We discuss a number of ways in which both experimental and observational studies can be improved to give a more precise accounting of the role of abiotic variation in shaping community structure. By addressing these issues, ecologists can come to a clearer understanding of the multitude of ways in which environmental variation shapes patterns of diversity across communities.

Image caption: In the Central Valley of California, USA, small depressions in the ground can fill with rainwater in the winter, creating a vernal pool habitat. The dominant plant species (shown here in bloom) that are found in these pools can tolerate immersion in the rainwaters, while the flooded conditions exclude many of the surrounding grassland species. This process of environmental filtering, seen here at a small scale, shapes patterns of biodiversity across the planet. Photo credited to Nathan Kraft.
This article has been accepted for publication and undergone full peer review but has not been through the copyediting, typesetting, pagination and proofreading process, which may lead to differences between this version and the Version of Record. You can find the As Accepted version here.

 

Changing drivers of species dominance during tropical forest succession.

Madelon Lohbeck, Lourens Poorter, Miguel Martínez-Ramos, Jorge Rodriguez-Velázquez, Michiel van Breugel & Frans BongersThe study area in Chiapas, Mexico, where the landscape consists of a mosaic of agricultural fields, young secondary forest and old secondary forest.


Tropical forests are celebrated for their high aboveground biomass and high tree diversity. Here we study secondary succession: the process of forest recovery after complete clearance of the vegetation for agriculture. This represents a natural gradient of biomass and diversity build-up. As the forest grows back over time, some of the species present manage to attain high biomass and become dominant, whereas other tree species remain rare. We ask whether such dominance is related to the characteristics of the species (their functional traits) and what mechanisms drive species dominance. Is it environmental filtering, i.e. does the environment select for specific types of trees? Or is it limiting similarity, i.e. do successful species tend to be specialists that differ from other dominants? We answer these questions by studying tropical secondary forest in Chiapas, Mexico.

We found that in young forests with low overall biomass the trees that are dominant, even if they are from different species, all have similar light capture strategies. Thus at this stage the main mechanism explaining dominance is environmental filtering: only species with a specific strategy are best adapted to the prevailing (high light) conditions and will dominate the young forest. As the forest gets older, biomass increases and a dense canopy prevents sunlight from entering the understory. The fierce competition for light means that trees need to specialize to make optimal use of different light-niches to be able to thrive here. Now dominant species need to be different from each other in terms of their light-capture traits, a mechanism known as competitively-driven limiting similarity. By exhibiting different strategies many species are able to co-exist in an environment that is increasingly packed by trees and limited in resources such as light.

During the first 25 years after agricultural abandonment the importance of environmental filtering as a driving force fades rapidly, while the importance of light-gradient partitioning for species dominance starts to emerge. Understanding what shapes species dominance matters because it is mainly the large, dominant trees in an ecosystem that determine how the forest functions.


The article is available here.
