Lay Summaries

The summaries below have been provided by our authors to help put their research papers into context for the wider scientific community and the general public. You can also find all the previous lay summaries by issue, as well as summaries for articles on Early View, in the lay summaries archive.

 

Lay summaries for the current issue

 


 

Priming of the decomposition of aging soil organic matter: concentration dependence and microbial control.

Johannes Rousk, Paul W. Hill, and Davey L. Jones


The amount of carbon stored in soil helps to regulate the global climate and soil fertility, and reflects the balance between formation and decomposition of soil organic matter. Decomposition of soil organic matter can be affected by high-quality carbon supplements, such as sugars or vitamins, that occur close to plant roots. Stimulation of the decomposition of soil organic matter by high-quality carbon additions is termed ‘priming’, but the mechanisms behind this phenomenon remain elusive. The most common explanation assigns priming to successional growth responses of groups within the microbial community that specialise in processing different forms of carbon: those adept at degrading complex organic materials, and those adept at using high-quality carbon; a division that has also been connected with fungal (complex carbon) and bacterial (high-quality carbon) decomposers. Organic matter relatively freshly formed from plant carbon input to soil has also been found to be particularly sensitive to priming. We investigated how the concentration of added high-quality carbon influenced priming, whether the age of the soil organic matter affected priming, and whether priming was related to bacterial or fungal growth responses triggered by high-quality carbon additions. To create an age gradient of traceable soil carbon, we spiked a pasture soil with sugar carrying a radioactive marker (14C), and subsampled plots during the 13 months after application. A range of sugar concentrations was then added in subsequent laboratory experiments, and respiration, soil carbon decomposition (14C tracing), bacterial growth rates and fungal biomass were tracked. The decomposition of soil carbon aged 2–13 months showed similar concentration dependencies on added sugar, and priming increased the decomposition of soil carbon by up to 350%.
We found no connection between the successional growth responses of microbial groups specialised in decomposing different kinds of carbon (fungi or bacteria) and the priming of soil carbon decomposition. It has been suggested that enzymes the microbial community uses to degrade soil carbon could remain active after microbial death. This could explain the lack of connection to microbial community growth responses despite the clear concentration dependence on high-quality carbon additions.

Image caption: Field site: pasture in Abergwyngregyn, Gwynedd, UK.
This article has been accepted for publication and undergone full peer review but has not been through the copyediting, typesetting, pagination and proofreading process, which may lead to differences between this version and the Version of Record. You can find the As Accepted version here.

 

Review

The ecology of bark thickness: An hypothesis.

Juli G. Pausas


Bark is a vital and very visible part of woody plants, yet the functional and evolutionary ecology of bark is still poorly understood. Here I review one bark property: thickness. Bark thickness is very variable among woody plants and I hypothesize that fire is the most important factor selecting for a thick bark. This is because bark is a very good heat insulator and under low intensity fires, small increases in bark thickness provide a great increase in stem protection and survival. Consequently, at the global scale, most of the variability in bark thickness should be explained by variability in fire regimes. Here I provide evidence supporting the role of fire regime in shaping bark thickness (in conjunction with other plant traits) on a global scale.

Forest environments with very frequent (and low-intensity) understory fires select for trees with thick bark at the base of the bole. In some savannas, trees do not have especially thick bark, as they tend to grow quickly to escape the height affected by grass fires. Savanna trees living in poor soils may not be able to grow quickly, and thus can only survive if the whole plant (including the thin branches) has very thick bark. In Mediterranean ecosystems, fires are less frequent than in savannas, and there is time for the accumulation of fine woody biomass. Consequently, fires burn intensely, and small differences in bark thickness do not increase stem survival; in such conditions, most species have relatively thin bark. In wet tropical forests, tree bark is very thin because fires are very rare and a thick bark is not advantageous. In very arid ecosystems, fuels are too sparse for fire to spread, and thus thick bark might not be expected, although some thick barks may occur as a response to water stress.

In conclusion, fire regimes can explain a large proportion of the variability of bark thickness at the global scale, and thus this trait varies across ecosystems in a predictable manner.

Image caption: Corky bark of Myrcia bella (Myrtaceae) from a Brazilian savanna (cerrado).

 

Some don't like it hot: southernmost populations live very close to their thermal limits.

Catarina F. Mota, Aschwin H. Engelen, Ester A. Serrão and Gareth A. Pearson


The intertidal zone comprises the vertical range of the shore exposed during the lowest low tides and covered by the highest high tides. Intertidal seaweeds are marine species occupying this interface between marine and terrestrial habitats. As major photosynthetic primary producers and structural species in intertidal ecosystems, how seaweeds cope with severe stresses like high temperature and desiccation during exposure to air is important for understanding ecosystem functioning as a whole. Growing in canopies and dense stands, seaweeds modify the physical environment for themselves and associated species, providing distinct microhabitats where thermal conditions can be very different from the prevailing air temperature.

During heat stress, cells express characteristic heat-shock (HS) proteins that repair or remove heat-damaged proteins. The temperatures for the onset, maximum and decline of this heat-shock response (HSR) define the thermal range that an organism can endure. Thermal stress is expected to approach physiological limits near the southern edges of a species’ distribution. We collected four decades of air temperature records, defined HSR parameters in the laboratory, and used sensors to measure the microhabitat thermal environment at local scales in a range-edge population of bladder wrack (Fucus vesiculosus) in southern Portugal, a population that is now locally extinct.

Laboratory experiments measuring HS gene expression and photosynthetic functioning in F. vesiculosus showed the onset of the HSR below 24°C and a maximum response at 28°C, while at 36°C the HSR declined (and seaweeds were physiologically damaged). In the field, a mild HSR was seen even in January, and the response was severe for most of the year. The HSR correlated well with temperatures measured in microhabitats; seaweeds at the edge of the canopy were particularly stressed, while those living under the canopy were somewhat protected from severe heat stress. Remarkably, although the hottest microhabitat was the canopy surface, seaweeds there showed the lowest HSR, because rapid desiccation leads to a quiescent state that possibly protects them from the damaging effects of heat. Overall, we show that microhabitat temperatures are better predictors of thermal stress than local air temperatures for natural populations, and that these southern populations exist(ed) very close to their thermal physiological limits.

Image caption: Fucus vesiculosus at the base of cordgrass (Spartina maritima) in the Ria Formosa coastal lagoon. Photo courtesy of the authors.

 

Skin turnover increases in amphibians infected with deadly chytrid fungus.

Michel E. B. Ohmer, Rebecca L. Cramp, Craig R. White, & Craig E. Franklin


Worldwide, the chytrid fungus (Batrachochytrium dendrobatidis) is wreaking havoc on amphibian populations through the insidious disease chytridiomycosis. What makes this disease so striking is that it infects a wide range of amphibian species globally, leading to population declines and even extinction, despite infecting only the amphibian’s skin.

In amphibians, the skin is an essential organ that functions in water and ion balance and gas exchange. In order to maintain skin health and integrity, amphibians regularly shed their skin. Called sloughing, this process involves removal of the thin outer layer of skin via a series of limb, side, and mouth movements, after which the shed skin is ingested. Regular sloughing is thought to play an important role in immune defence, by removing skin-associated microbes. Crucially, while sloughing is a significant aspect of amphibian skin physiology, we know very little about its relation to the progression of chytridiomycosis.

In order to understand what makes some species more susceptible to this skin disease than others, we have taken a closer look at the unique physiology of amphibian skin shedding. With the help of infrared cameras to capture this elusive behaviour, we examined the relationship between skin sloughing and chytridiomycosis progression by exposing Australian green tree frogs (Litoria caerulea) to the fungus, and monitoring sloughing rates and infection progression over time. We found that frogs sloughed their skin every four days on a predictable cycle, but as chytrid infection intensity increased, sloughing rate also increased to every three days. Surprisingly, sloughing itself did not reduce the abundance of chytrid fungus on the skin.

Our work demonstrates that an increased sloughing rate in infected frogs does not appear to curb the progression of disease. In fact, sloughing may actually increase the detrimental effects of the fungus in terminally ill frogs by further inhibiting water and salt transport across the skin. By measuring sloughing rates directly for the first time, our results shed light on how chytrid fungus interacts with skin maintenance processes, and indicate that variation in skin sloughing frequency may play a role in the observed variation in susceptibility to disease.

Image caption: An adult Australian green tree frog (Litoria caerulea). Photo: M. E. B. Ohmer.

 

Plant virus infection protects host plants from herbivore attack.

Kerry E. Mauck, Erica Smyers, Consuelo M. De Moraes & Mark C. Mescher


Plant viruses live within plant cells and most are spread to new host plants through feeding and movement of specific insects (vectors). Sometimes, infection has negative effects on the plant, such as reduced size or seed production. But infection can also change aspects of the host plant (its phenotype) that mediate interactions with other antagonists. Virus infection can alter host cues used by foraging insect herbivores (including vectors), such as color, smell, or taste. Some of these cues (color, smell) are also used by beneficial insects to locate herbivore prey. Despite the fact that virus infection is known to alter host phenotype, few studies have examined how virus infection changes plant interactions with non-vector herbivores and their predators.

In our study we used field experiments, behavior trials, and analysis of plant chemical and physical characteristics to understand the effects of a widespread plant virus, Cucumber mosaic virus (CMV), on the interactions of its host plant, squash, with non-vector herbivores and predators. Our previous work showed that CMV infection makes plants more attractive to their aphid vectors based on smell, but diminishes plant quality and palatability (encouraging vectors to disperse and spread the virus after picking it up from leaf cells). Here we found that CMV infection also reduced the likelihood of non-vector herbivores visiting, colonizing, and laying eggs on squash plants. Most notably, a highly damaging specialist herbivore was unable to recognize CMV-infected plants as good sites for egg laying. This reduction in herbivore levels was consistent with our analysis of plant characteristics, which showed that CMV-infected plants have lower levels of sugars and reduced size, making plants less palatable and less visually apparent. In contrast, predators and parasitoids were able to locate plants with prey regardless of whether they were infected or healthy. The combination of lower herbivore visitation with maintenance of predator visitation means that CMV-infected plants may experience reduced herbivore attack relative to healthy plants. This means that CMV-infected plants will be present in the landscape longer, possibly leading to more new infections, and that plants infected with CMV may even out-perform uninfected plants when herbivore populations are very high.

Image caption: CMV-infected (upper) and healthy (lower) Cucurbita pepo plants with insets of squash bugs (top) and aphids (bottom). Photos by K. Mauck.

 

Invasive plants, thermal regimes, and habitat management for snakes.

Evin T. Carter, Bryan C. Eads, Michael J. Ravesi, and Bruce A. Kingsbury


In this new era of “global ecology”, plants and animals from around the world are becoming established where they previously did not exist. In some cases, they have become invasive, potentially altering how ecosystems function. Despite recognition that devastating effects can follow, there remains a shortage of evidence regarding broadly applicable mechanisms.

Here we examine the impacts of exotic invasive plants on a snake, the Northern Copperhead (Agkistrodon contortrix), in a site facing substantial infestation by exotic plants. We hypothesized that the denser growth patterns characteristic of exotic invasive plants lead to reduced and less variable temperature below the canopy. Consequently, the quality of the habitats would be compromised for ectotherms like the copperhead in a temperate forest landscape, where access to sunlight is key to maintaining suitable body temperature and proper metabolic function.

To test our hypothesis, we used physical models of snakes to measure environmental temperatures within native and exotic plant-dominated habitats (including 11 exotic species) in a temperate forest landscape in the Midwestern US. Using temperatures derived from the models and the temperature preferences of snakes in the laboratory, we estimated the capacity of snakes to achieve preferred body temperatures within that landscape. To further test the effect of vegetation structure and the efficacy of targeted management, we also removed exotic plant foliage from eight 20 m² plots while monitoring use of those areas by reptiles before and after the manipulations.

We found that exotic plant-dominated habitats exhibited reduced and less variable temperatures compared to their native counterparts, with mixed-exotic habitats exhibiting the lowest temperatures overall and exotic shrubs (6 species) the lowest as a structural group. Radio-tagged snakes clearly avoided exotic vegetation at multiple scales. Response to exotic foliage removal was also rapid—including use of plots as gestation and birthing sites. Our results suggest a direct effect that is common to a broad range of invasive plants in a variety of ecological contexts. Because eradication of many invasives is unlikely, we suggest that targeted thinning is a cost-effective means of partially alleviating this challenge in compromised landscapes.

Image caption: Introduced Japanese honeysuckle encroaching on copperhead habitat (photo credit to Evin T. Carter).

 

Wolves that are heavily hunted have higher stress and reproductive hormones than wolves with lower hunting pressure.

Heather M. Bryan, Judit E. G. Smits, Lee Koren, Paul C. Paquet, Katherine E. Wynne-Edwards & Marco Musiani


In parts of the world where they remain, wolves play a key role in maintaining healthy ecosystems. Yet wolves are also viewed as competitors with people over shared prey and livestock. Consequently, wolves are often subject to management that includes population reductions of up to 50% per year, with occasional reductions of up to 90% during intensive control. Wolves have a sophisticated social system, with typically one breeding female per pack per year. Ongoing and substantial reductions can disrupt wolf social structure, leading to increased reproductive rates and altered genetic structure and behavior. Despite these profound effects on surviving individuals, little is known about the influences of population reductions on wolf physiology.

Accordingly, we applied the novel approach of measuring hormones in hair to investigate the physiological effects of hunting on wolves subject to different hunting pressures in northern Canada. Stress and reproductive hormones, including cortisol, progesterone, and testosterone, accumulate in growing hair via the blood vessels that supply the hair follicle and surrounding glands. Our measurements revealed that wolves from a heavily hunted population in the tundra-taiga had higher progesterone than lightly hunted wolves in the boreal forest. Elevated progesterone could reflect a higher proportion of breeding females, which might occur if normal pack structure is disrupted by hunting. Similarly, heavily hunted wolves had elevated testosterone and cortisol (a stress hormone), which may reflect social instability caused by hunting.

Ecological differences between the two populations could also affect hormone levels. Therefore, we controlled for habitat by examining another population of wolves from the boreal forest killed during an intensive control program. Lack of sex data precluded examining testosterone and progesterone in this population; however, we found that similarly to tundra-taiga wolves, forest wolves from the control program had elevated cortisol compared with forest wolves that were lightly hunted. Although the long-term effects of chronically elevated stress and reproductive hormones are unknown, there are potential implications for wildlife health, welfare, long-term survival, and behaviour. Therefore, our findings emphasize that conservation and management plans should consider not only numeric responses, but also possible social and physiological effects of control programs on wildlife.

Image caption: Two wolves in Alberta run across a frozen lake away from an approaching helicopter. Photo credited to Paul Paquet.

 

Starling males show females their ability to cope with bacteria.

Magdalena Ruiz-Rodríguez, Gustavo Tomás, David Martín-Gálvez, Cristina Ruiz-Castellano and Juan J. Soler


Spotless starling males have special, ornamental feathers on their throat that are longer and narrower than the rest of their plumage, and also than female throat feathers. During the courtship period, males sing from highly visible places, with the head and bill raised. This posture exposes the special throat feathers, which are very conspicuous and moved by the wind. It was previously shown that males with larger throat feathers also have better reproductive success, since they are preferred by females.

We have found that these special feathers differ from other male and female throat feathers in their susceptibility to degradation by feather-degrading bacteria. The basal part of these feathers is highly resistant to bacteria, and consequently very few feather-degrading bacteria were found there compared with the apical part, the most exposed one; in non-ornamental feathers of both sexes, however, feather-degrading bacteria were equally abundant in both the basal and apical parts.

All bird species have a gland located dorsally at the base of the tail that produces a sebaceous secretion with antimicrobial activity. Birds take this secretion with the bill and spread it on their feathers to protect them against microbial pathogens. We found that starlings with larger glands produce more secretion, and that this secretion has a greater capacity to inhibit bacterial growth; in addition, these birds had lower bacterial loads on their feathers, and their feathers were thus less degraded. Therefore, by evaluating the degradation status of male ornamental feathers, females can estimate a male's capacity to fight the bacteria that may damage his feathers, which indirectly reflects his ability to maintain proper hygiene and the capacity of his immune system.

Image caption: The picture shows the ornamental feathers of the starling male. Credits: J. J. Soler.

 

Oxidative costs of personality.

Kathryn E. Arnold, Katherine A. Herborn, Aileen Adam and Lucille Alexander


Animals differ in their personalities. Some individuals are bolder, more social, more exploratory and/or more aggressive than others. These ‘personality traits’ are behavioural differences between individuals that are stable within individuals. But why do animals, including humans, show such variation in personality? Different combinations of personality traits can correlate with survival or reproductive success, but scientists do not fully understand how or why. One suggestion is that personality reflects variation in physiology, particularly in how the body defends itself against free radicals: unstable molecules, produced continuously by the body as part of living, that can damage tissues.

In this project we investigated the relationships between personality and free radical damage in wild blue tits. Birds were caught in a wood in Scotland over the winter and briefly brought into captivity to measure their personality traits. We recorded how good they were at exploring a novel cage, and determined how bold or shy they were by measuring their responses to unfamiliar objects, namely colourful plastic toys (see photo). We found that blue tits that were both shy and good explorers possessed poor defences against free radical damage. Conversely, birds that were bold and poor explorers had high defences against damage from free radicals. Thus, personality types differed in their defences against free radicals, and it was the combination of an individual’s personality traits that proved important.

But what does this mean in the wild? When food is scarce, a good explorer might benefit from finding new foraging areas and avoiding starvation, even if it suffers some tissue damage because its defence mechanisms are weak. However, when food is abundant, poor explorers with good defences against free radicals are predicted to live longer. Thus, the costs and benefits of different personality traits all depend on the environment. In nature there is no ‘perfect’ personality, because the environment is constantly changing.

Image caption: Novel objects presented to blue tits. Photo provided by authors.

 

More for less: sampling strategies of plant functional traits across local environmental gradients.

Carlos P. Carmona, Cristina Rota, Francisco M. Azcárate and Begoña Peco


Approaches based on the traits of plants (e.g. heights or leaf areas) have allowed ecologists to tackle questions about changes in the assembly rules of plant communities across environmental gradients, and about the effects of plants on ecosystem functioning. This knowledge is essential to predict the consequences of different global change scenarios, and to restore degraded ecosystems and the services they provide. One of the most critical steps in these studies is to scale up from traits measured on individual plants to the community level, a step that is usually accomplished by calculating community indices such as the average and diversity of the trait values of the species in the community. Evidently, the reliability of these indices depends on accurate measurements of the traits of the individuals that compose each sampling unit. Unfortunately, it is generally infeasible to measure the traits of all these individuals; instead, the most common practice is to sample a reduced number of individuals of each species and average the trait values of this sample, assigning that average to all the individuals of the species. There are, however, several alternatives regarding the identity, number and location of the sampled individuals.

We performed a very intensive sampling across an environmental gradient, which allowed us to accurately estimate the 'real' values of the community indices and their changes along the gradient. Subsequently, we simulated and compared several less intensive sampling strategies that have been used in previous studies. We found that the strategy that best estimated the 'real' values was the one in which local individuals of the species (those present in each sampling unit) were used to calculate a different average trait value for each sampling unit and species (LOC). Conversely, strategies that assigned a single average trait value to all the sampling units in which a species is present performed much more poorly, regardless of the number of individuals used to calculate the average. Although this may appear to be bad news (more work) for ecologists, we show that the accurate results yielded by the LOC strategy can be attained without increasing the total number of individuals measured across the gradient.
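For readers who want to see the distinction in practice, the contrast between the LOC strategy and a single pooled species average can be sketched with a toy example. This is hypothetical illustration code, not the authors' analysis; the species names, abundances and trait values are invented, and the community index shown is the abundance-weighted mean trait.

```python
def mean(values):
    """Arithmetic mean of a list of trait measurements."""
    return sum(values) / len(values)

def cwm(abundances, trait_values):
    """Community-weighted mean trait for one sampling unit."""
    total = sum(abundances.values())
    return sum(abundances[sp] * trait_values[sp] for sp in abundances) / total

# Toy data: plant height (cm) measured on individuals of two species
# in two sampling units along a gradient.
local_measurements = {
    "unit1": {"sp_A": [10, 12, 11], "sp_B": [30, 28]},
    "unit2": {"sp_A": [18, 20, 19], "sp_B": [31, 29]},
}
abundances = {
    "unit1": {"sp_A": 8, "sp_B": 2},
    "unit2": {"sp_A": 3, "sp_B": 7},
}

# LOC strategy: a separate species average per sampling unit,
# computed only from the individuals measured in that unit.
cwm_loc = {
    unit: cwm(abundances[unit], {sp: mean(vals) for sp, vals in meas.items()})
    for unit, meas in local_measurements.items()
}

# Pooled strategy: one species average across all sampling units,
# assigned to every unit where the species occurs.
pooled = {}
for meas in local_measurements.values():
    for sp, vals in meas.items():
        pooled.setdefault(sp, []).extend(vals)
global_means = {sp: mean(vals) for sp, vals in pooled.items()}
cwm_glob = {unit: cwm(abundances[unit], global_means) for unit in abundances}

print(cwm_loc)   # LOC captures within-species trait variation between units
print(cwm_glob)  # pooled averages let only abundances differ between units
```

In this sketch the two strategies give different community values for the same units because sp_A individuals are taller in unit2 than in unit1; the pooled strategy hides that within-species variation, which is the kind of error the LOC strategy avoids.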

Image caption: Detail of one of the sampling units, sited in a Mediterranean grassland near Madrid (Spain). Photograph by: Cristina Rota.

 

Young trees shape the capacity of soil microbial communities to resist and recover following drought.

David Rivest, Alain Paquette, Bill Shipley, Peter B. Reich and Christian Messier


Soil microbial communities (i.e. bacteria and fungi) are key players in soil functions such as nutrient cycling, plant productivity and climate regulation and are fundamental for the integrity of terrestrial ecosystems following disturbance. Resistant soil communities remain essentially unchanged when subject to disturbances, such as severe drought. Resilient soil communities have a high recovery capacity after disturbance. Soil microbial communities that are resistant or resilient to drought are desirable for sustainable soil use and management, as they tend to maintain soil functions. Drought may be a major threat to tree production and its occurrence and severity are predicted to increase in many regions of the world over the next several decades. Therefore, it is important to understand how soil microorganisms respond during and after severe droughts in tree communities, and the factors explaining this response, such as tree functional traits (i.e., features related to different functions such as growth or nutrition) and soil properties.

In this study, we compared soil biochemical properties and microbial response following drought in three different 4-year-old tree monocultures and two two-species combinations that were planted in a high-density tree experiment located in southern Québec (Canada). We selected different North American temperate tree species that have contrasting functional traits. We expected that differences in functional traits between tree species would be reflected in divergent soil biochemical properties, and that these differences, in turn, would drive soil microbial resistance and resilience to drought.

We showed that tree monocultures and species mixtures influenced soil chemistry, soil biological properties and microbial resistance and resilience. Both synergistic and antagonistic effects of tree mixtures on different soil properties were found. Tree monocultures surpassed species mixtures as a key driver of resistance and resilience of soil microbial communities to drought. Interestingly, soil microbial resistance was higher in tamarack than in sugar maple monocultures. Conversely, resilience was higher under sugar maple than under tamarack monocultures. We found evidence that differences in a few leaf litter traits between tree species are sufficient to rapidly alter soil nutrient availability, especially nitrogen, which can in turn have important consequences for soil microbial resistance and resilience to drought.

Image caption: Image courtesy of Alain Paquette.
This article has been accepted for publication and undergone full peer review but has not been through the copyediting, typesetting, pagination and proofreading process, which may lead to differences between this version and the Version of Record. You can find the As Accepted version here.

 

Sleeping single in a double bed: for tropical hibernators, sociality is a hurdle rather than an advantage.

Kathrin H. Dausmann & Julian Glos


Doesn’t it sound nice to cuddle up to a conspecific when it is cool and unfriendly outside? The Malagasy fat-tailed dwarf lemur (Cheirogaleus medius) does not think so. This small primate is usually a very social creature, living in tight family groups where pairs only separate when one partner dies. However, this changes when winter approaches. Madagascar is a large, tropical island off the east coast of Africa and home to – and famous for – 103 species of lemurs, the highest percentage of indigenous primates anywhere on earth. It might not be the kind of place one would expect to find hibernating mammals, let alone primates. Yet this is just what the small-bodied dwarf lemurs do. They resort to the same strategy used by marmots, ground squirrels and dormice to survive the cold northern winters, spending at least seven months of the year hibernating through harsh times brought on by seasonal drought during the Malagasy winter. As its name suggests, most of the energy used to fuel life during hibernation in the fat-tailed dwarf lemur is stored in the tail, and the animal fattens up to almost double its body mass before the hibernation season.

Usually, mammals profit from huddling in groups because this reduces heat loss during cool conditions and thus decreases the need to activate internal heat production, helping them save energy. However, in the fat-tailed dwarf lemur, body temperature during hibernation changes passively with external conditions in a reptile-like fashion, instead of being maintained internally. Depending on the insulation of the tree hollow used for hibernation, this can result in a daily fluctuation in body temperature of up to 20 degrees Celsius. Under these conditions, the dwarf lemurs prefer not to be bothered by their dear family, because in larger groups unwelcome interruptions can make the hibernation state less energy-efficient.
Therefore, hibernating alone helps dwarf lemurs to conserve their precious fat resources and thus enhance their chance of survival, leaving them in better body condition for the next reproductive period after reuniting with their mate at the end of the winter.

Image caption: The Malagasy primate Cheirogaleus medius face-to-face with the reptile Uroplatus henkeli, with whom it shares body temperature profiles during hibernation.
This paper can be found online in its As Accepted form (not typeset or proofed) here.

 

Root responses of grassland species to spatial differences in soil history.

Marloes Hendriks, Eric J.W. Visser, Isabella G.S. Visschers, Bart H.J. Aarts, Hannie de Caluwe, Annemiek E. Smit-Tiekstra, Wim H. van der Putten, Hans de Kroon & Liesje Mommer


Plants live in environments in which local conditions differ, for example in the amount and type of available nutrients. If the availability of nutrients is patchy, plants can direct their roots towards the patches with the highest availability. Similar patchy distributions can be found for soil biota (e.g. fungi or bacteria), but the response of plants to this type of spatial variation is not known.

Here, we investigated whether the rooting patterns of plants respond to the patchiness of soil biota. We expected that plants might avoid patches with pathogenic soil biota. To test this expectation, we used four grassland species and soils on which they had been growing before, to obtain ‘own’ and ‘foreign’ soils. We created four compartments in a pot and assigned the soil of each of the four species to one of the compartments (heterogeneous soil treatment). As one control, we mixed the four soil types (homogeneous soil treatment); in another control, we removed the soil biota. In order to study the effects of soil biota on nutrient uptake by roots, we added labelled nitrogen (15N) to the soil.

We found that most species performed better when their own and foreign soils were distributed heterogeneously, rather than when they were homogeneously mixed. The amount of roots and the nitrogen uptake rate of these roots were higher in ‘foreign’ than ‘own’ soils. When we sterilized the soil to remove soil biota, these differences disappeared, showing that indeed the soil biota caused the difference between heterogeneous and homogeneous soils.

We conclude that plants perform better if they grow in soils with a patchy distribution of pathogenic soil biota than when the same amount of pathogens is homogeneously mixed, because plants can selectively avoid the patches with pathogens. Plants can be disproportionately efficient in nutrient acquisition in patches without soil biota. Our results imply that diverse plant communities may be more productive than species-poor vegetation, because in species-rich vegetation plant species can find more patches without soil pathogens where they can maximize nutrient acquisition.

Image caption: Leucanthemum vulgare growing in homogeneous (Ho) and heterogeneous (He) distributions of soil biota. Photo credit: Isabella G.S. Visschers.

 

Grasshopper effect traits reflect their feeding niche and determine their impact on plant community biomass.

Hélène Deraison, Isabelle Badenhausser, Luca Börger & Nicolas Gross


Herbivorous arthropods, such as insects, may play an important role in regulating plant diversity and ecosystem functioning (e.g. nitrogen cycling). But it is unclear which mechanisms drive plant–arthropod interactions and, ultimately, arthropod effects on plant communities. Various herbivore characteristics (what we here call functional traits) have been assumed to determine herbivore impact on plant communities. For instance, herbivore body size has been proposed as a key trait determining the quantity of biomass consumed, but its effect might be modulated by herbivore food preferences (i.e. the herbivore feeding niche). In this context, herbivore chemical traits (e.g. carbon:nitrogen ratio) or biomechanical traits (e.g. mandibular traits; biting strength) have been hypothesised to relate to the herbivore feeding niche. Yet how functionally contrasting herbivores impact plant community biomass under real field conditions, and what the relative importance of different herbivore traits is, had never been experimentally tested.

We set up a cage experiment in a species-rich grassland and tested how grasshopper traits explain their effects on plant biomass. Six grasshopper species were selected because they show contrasting traits and feeding niches.

Grasshopper impact on plant biomass ranged from 0% up to 60%, depending on the species considered. By comparing the relative importance of multiple interacting grasshopper traits, biting strength emerged as a key trait determining grasshopper feeding niche and impact on plant biomass. Importantly, we demonstrated that only two simple plant traits (C:N ratio and leaf dry matter content) predicted grasshopper feeding niche well. For instance, herbivores with high biting strength preferentially chose tough leaves, while herbivores with low biting strength selected the opposite plant attributes.

Our study provides a first experimental test of the relationship between herbivore traits and their niche, which in turn determines their impact on plant community biomass and ultimately on ecosystem functioning. It also contributes to the development of a trait-based approach in a multitrophic perspective and shows that simple traits can predict the intensity of trophic linkages and herbivore effects at the level of the entire plant community.

Image caption: Female grasshopper (Chorthippus biguttulus). Photo provided by authors.

 

To survive cannibalism: growth and developmental acceleration in pre-feeding salamander hatchlings in the presence of conspecific hatchlings.

Osamu Kishida, Ayumi Tezuka, Akiko Ikeda, Kunio Takatsu & Hirofumi Michimae


In many fish and amphibian species, vast numbers of embryos may hatch at the same time. In such situations, the hatchlings can be exposed to intensive cannibalistic interactions from conspecifics (members of the same species). How do hatchlings spend this vulnerable life stage?

Cannibalism success in the Japanese Ezo, or Hokkaido, salamander (Hynobius retardatus) is highly dependent on the balance between the gape width of the cannibal (how wide it can open its mouth) and the head width of its prey, so fast growth in the pre-feeding stage is expected to contribute strongly to the survivorship of salamander hatchlings in conspecific interactions. In this study, we report experimental evidence of adaptive acceleration of growth and development in the pre-feeding hatchling stage. Ezo salamander hatchlings reared with conspecifics became larger and developed faster than those reared alone, their time to the start of feeding was shorter, and their burst swimming speed was faster.

Our predation trials revealed the advantages of growth and developmental acceleration in cannibalistic interactions. The hatchlings reared with conspecifics were more successful at cannibalizing small hatchlings and were also highly resistant to being cannibalized themselves by large conspecifics, compared to hatchlings reared alone. Because salamander larvae that cannibalize other individuals in their early developmental period exhibit rapid growth and metamorphose early with larger size, growth and developmental accelerations are likely key mechanisms for their life history success.

Image caption: Ventral aspect of 7-day-old Hynobius retardatus salamander hatchlings reared alone (left) and with conspecifics (right). Photo by Osamu Kishida.

 

A cross-seasonal perspective on local adaptation: Metabolic plasticity mediates responses to winter in a thermal-generalist moth

Caroline M. Williams, Wesley D. Chick & Brent J. Sinclair


Across latitudinal and altitudinal gradients, environmental conditions vary strongly. To cope with these changing conditions, populations of organisms may be adapted to their local conditions, allowing them to survive and thrive better in their home environment than would populations from other regions. In temperate regions, this local adaptation must serve the organisms across their whole lifecycle, but characteristics that enhance survival and performance in one season may be detrimental in other seasons. Thus, to understand local adaptation we need to look at survival and performance across seasons, but most studies to date have focused only on the summer growing season. We tested for local adaptation to winter conditions in a common species of moth, Hyphantria cunea, which occurs throughout North America in diverse thermal environments. We collected larvae from the northern edge and centre of their geographic range, exposed them to both northern and central winter conditions in the lab, and monitored their survival and performance throughout the winter and into the next spring. We found that indeed the populations were locally adapted to their winter environment, with higher rates of survival and larger size and carbohydrate reserves when overwintered at their home conditions. This suggests that climate change may disrupt populations of this moth from their optimal conditions, and that populations may suffer if winter and growing season temperatures become decoupled.

Image caption: Image provided by authors.

 

Sex-specific differences in ecomorphological relationships in lizards of the genus Gallotia.

Marta Lopez-Darias, Bieke Vanhooydonck, Raphael Cornette and Anthony Herrel

Males and females often differ from one another in ways that reflect different investment in features relevant to the fitness of each sex. Whereas females typically invest in traits related to producing offspring, males tend to invest more in features related to territory defense or male–male combat. However, how differences in morphology between the sexes affect performance traits that are important in the ecological context of an animal, such as the ability to escape predators or to eat certain food types, remains poorly understood.

Here, we test whether head morphology, the ability to bite hard, and diet are similar in male and female lizards (Gallotia) from the Canary Islands. These lizards are known for their sexual dimorphism, suggesting that the relationships between form and function may also differ between the sexes. We collected data on bite force and head morphology and shape for both sexes of all seven known living species across the seven islands of the archipelago. Moreover, we collected diet data for five of the seven species.

Our results show that the evolution of head morphology is associated with the evolution of the ability to bite hard in both sexes. However, only in females was the ability to bite hard associated with the evolution of diet, with females with higher bite forces including larger amounts of plant matter in their diet. In males, on the other hand, head morphology and bite force are not related to diet. Moreover, males with high bite forces have a wide snout, suggesting that head shape and bite force may be evolving principally in relation to male–male combat. Our data thus suggest that head morphology and associated functional traits such as biting may evolve differently in males and females.

Image caption: Gallotia on Isoplexis. Photograph by Beneharo Rodríguez, website: www.gohnic.org

 

How lizards evolved a fossorial syndrome within the Brazilian Caatingas.

Agustín Camacho, Rodrigo Pavão, Camila Nascimento Moreira, Ana Carolina B.C. Fonseca Pinto, Carlos Navas & Miguel Trefaut Rodrigues


Among the reptile order Squamata (lizards and snakes), the loss of limbs to give a snake-like morphology is likely the most dramatic evolutionary change that has occurred. It is often associated with the acquisition of an underground, burrowing lifestyle, nocturnality and a preference for relatively low temperatures. Nonetheless, how such an interesting evolutionary transition took place remains poorly understood. We examined this process in ten closely related species of gymnophthalmid lizards (spectacled lizards) from the Brazilian Caatinga (desert scrubland), representing one full transition from typical lizard species to burrowing snake-like ones. Some of the species studied have a typical lizard morphology, while others have a burrowing, snake-like morphology. Species of both forms live together in sandy soil regions of the Brazilian Caatinga and burrow to some extent. We used automatic temperature data loggers and X-ray images to study the evolutionary relationships between morphology, burrowing performance, exposure to extreme temperatures and the evolution of thermal physiology in these lizards. Our results show that the evolution of a snake-like morphology allows better burrowing performance in the species we studied. Improved burrowing performance allows these species to reach thermally safe (cooler) areas and also seems to favour the evolution of lower preferred temperatures. At our study sites, snake-like lizards can not only avoid extreme daytime temperatures at the soil’s surface, but also access their preferred temperatures within the sand until late at night. In addition, we found that snake-like lizards active at cool hours of the day have lower critical thermal limits. Using this evidence, we propose a sequential explanation for the evolution of the snake-like, burrowing syndrome in lizards that can be tested in other lineages.

Image caption: Photo provided by authors.

 

Dependence of diverse consumers on detritus in a tropical rainforest food web as revealed by radiocarbon analysis.

Fujio Hyodo, Takashi Matsumoto, Yoko Takematsu and Takao Itioka


Food webs represent trophic relationships among various consumer organisms, i.e. who eats what. They are often classified into two types: plant-based food webs, starting with living plants as basal resources, and detritus-based food webs, which begin with dead organic matter (detritus). Although the two food webs have been studied separately, recent studies suggest that the coupling of the two food webs by generalist predators plays an important role in terrestrial ecosystem functioning and stability. For example, increased input of detritus could increase the abundance of generalist predators, which would lead to control of herbivory. Despite the importance of the energy and material flows from belowground, however, it remains unclear how commonly generalist predators depend on detritivores, particularly in terrestrial ecosystems.

We estimated the ‘diet ages’ of diverse consumers in a tropical rainforest by measuring their radiocarbon concentrations. ‘Diet age’ is the time lag between primary production and its utilisation by consumers. Atmospheric radiocarbon increased after nuclear bomb testing during the Cold War and has been decreasing through mixing with the ocean and biosphere since the early 1960s, so the known level of atmospheric radiocarbon can be used to estimate diet age. Our results show that herbivores, such as butterflies and bees, had diet ages of 0–1 year, whereas detritivores, such as termites, had older diet ages of 6 to more than 50 years. Generalist predators, such as army ants and treeshrews, had intermediate ages of 2–8 years. Given the known feeding habits of generalist predators, these intermediate ages indicate that generalist predators couple the energy and material flows of plant-based and detritus-based food webs. Further, our results demonstrate the time frame over which energy and material flows occur through a tropical rainforest food web. Knowing this time frame should be helpful for the conservation and management of ecosystems.

Image caption: Aerial view of a tropical rainforest, Lambir Hills National Park, Sarawak, Malaysia. Photo by Fujio Hyodo.

 

Inbred Host Plants Promote Enhanced Insect Growth and Flight Capacity.

Scott L. Portman, Rupesh R. Kariyat, Michelle A. Johnston, Andrew G. Stephenson & James H. Marden


Insects use flight to evade predators, locate mates, and colonize new habitat; thus, improved flight capability has the potential to increase an adult insect’s survival, reproductive success, and geographic distribution. Plant tissues consumed by larvae (caterpillars) later provide the nutrients that adult insects need to develop their flight muscles. Most studies investigating the influence of host plants on insect herbivores only look at effects on caterpillars. However, this approach overlooks nutritional effects on the adults and the important contributions the adults make to the size and distribution of the insect’s population. Here we examine how differences in the quality of horsenettle (Solanum carolinense) host plants affect the flight muscle development and flight muscle function of one of its natural herbivores, the tobacco hornworm moth (Manduca sexta).

We used inbreeding as a mechanism to produce variation in host plant quality. Inbreeding in horsenettle is known to reduce the plant’s ability to defend itself against herbivores and pathogens. Under both field and laboratory conditions, tobacco hornworm caterpillars prefer to feed on inbred plants over outbred plants, suggesting fitness advantages from eating weakly defended inbred plants as opposed to better-defended outbred plants. We found that caterpillars that ate inbred plants grew faster and developed into larger pupae (chrysalises) than caterpillars that ate outbred plants. Growth differences in the caterpillars also affected the adult stage (moth) of the insect. In free-flight tests, moths that had fed on inbred plants as caterpillars exhibited improved flight muscle metabolic function. Moreover, we found molecular evidence that higher muscle metabolic outputs correlated with changes to the amino acid composition of a key regulatory protein in their flight muscles.

Our results show that host plant inbreeding can create effects that cascade through larval and pupal development to affect flight muscle function of the adult stage. Hence, host plant inbreeding can influence important life history traits of insect herbivores, such as mating success, survival, and dispersal. Broadly, our findings reveal that changes to the genetics of a population at one trophic level can affect the development and physiology of an animal at a higher trophic level.

Image caption: Manduca adult.

 

Age-related deterioration in duckweed.

Patrick M. Barks & Robert A. Laird


As they grow old, many organisms experience progressive bodily deterioration resulting in declining rates of survival and reproduction – a phenomenon known as ageing or senescence. From an evolutionary perspective, ageing seems inherently detrimental to fitness and yet it occurs in most species across the tree of life. Thus, ageing has long been considered something of an evolutionary paradox – it is maladaptive and yet still common.

Modern evolutionary theories of ageing have addressed this apparent paradox but still fall short of explaining the wide variation in rates and patterns of ageing that exists in nature. One potential shortcoming of modern theories of ageing is that they implicitly assume ageing can only manifest through declining rates of survival and reproduction, but not through age-related declines in the fitness of one’s offspring. If age-related declines in offspring fitness occur in nature, then our theories of ageing may need to be updated accordingly.

Previous research suggests parental-age-related declines in various offspring traits occur in many organisms, from ladybugs to aquatic plants to humans. For example, in the aquatic plant duckweed, older parents produce smaller offspring with shorter lifespans than younger parents. Size and lifespan, however, are poor measures of fitness, and so for most species, we simply do not know whether offspring fitness declines with increasing parental age.

To resolve this issue, we measured age-related changes in three important demographic rates (survival, reproduction, and offspring fitness) in common duckweed, a small aquatic plant. We isolated hundreds of plants individually in Petri dishes filled with a liquid growth medium, and observed them daily for survival and reproduction. The offspring of a subset of these plants were transferred to their own Petri dishes so that we could measure their fitness (the rate of increase in their descendants) and relate that back to the age of their parents.

We observed strong age-related declines in survival, reproduction, and importantly, offspring fitness. Thus, we suggest evolutionary theories of ageing should be updated to consider the effect of declining offspring fitness. These updated theories may help us better understand the variation in patterns of ageing observed in nature.

Image caption: A deceased frond of duckweed and her last-produced offspring. Photograph by Patrick Barks.

 

Is tropical montane forest heterogeneity promoted by a resource-driven feedback cycle?

Florian A. Werner and Jürgen Homeier


Separated by only a few dozen metres, the ridge crests of tropical mountains often differ strikingly from neighbouring valleys in the structure and species composition of their forests. These contrasts are not well understood despite their importance for the maintenance of biodiversity and the provision of ecosystem services such as carbon storage.

We studied tree biomass and productivity (tree growth, production of leaves), the quality of fresh leaves and leaf litter (nutrient concentrations and phenolics, an important group of chemicals produced by plants to deter plant-feeding animals), levels of leaf herbivory (% leaf area lost to animal feeding) and the decomposition of leaf litter (freshly fallen leaves) in upper (near ridge crests) and lower (near creeks) slope positions in a montane forest in Ecuador.

We found that forest canopy height, production of wood and foliage, quality of fresh leaves and leaf litter, and leaf losses due to herbivory were all significantly lower on upper slopes. Likewise, soil nutrient levels were lower on upper slopes, where decaying leaf litter accumulated in thick humus layers instead of decomposing readily as on lower slopes. As shown by a decomposition experiment, leaf litter from upper slope forest decomposed more slowly than litter from lower slope forest, no matter which of the forest types it was placed in.

Our results suggest that the differences we observed between slope positions ultimately result from a pronounced scarcity of plant nutrients in upper slope forest, which is likely to arise from nutrient losses through down-slope fluxes. The size of the contrast between these vegetation types, however, suggests that nutrient poverty near ridges is exacerbated by a positive (self-reinforcing) feedback cycle, in which nutrient-poor soils favour plants that produce leaves with low nutritional value and high concentrations of phenolics to deter leaf-eating animals, since the nutrients lost in eaten leaves are very difficult to replace. Because these leaf characteristics also deter the organisms that decompose leaf litter, nutrients remain locked in accumulating humus instead of being liberated by decomposition and made available to plants again. Consequently, the nutrients available to plants decline even further, favouring plants that produce foliage that is ever more difficult to decompose.

Image caption: Contrasting forest types at the study site in the Andes of Ecuador: stunted, open ridge-crest forest (top) and tall lower slope forest near creek (bottom). These two adjacent forest types share only few tree species. Photos: Florian Werner.

 

Survival of the fattest? Not in the brown anole.

Robert M. Cox and Ryan Calsbeek


Darwinian natural selection is often described as the “survival of the fittest”. However, determining which individuals are actually the fittest can be challenging, so biologists often use proxies in place of fitness. One popular proxy is body condition: the mass of an animal relative to its size or length. “Fatter” animals exhibiting higher body condition are assumed to be in a better energetic state, which is predicted to improve their chances of survival and reproduction. But is “fatter” really “fitter” in nature? By analyzing a decade of survival records for over 4600 individual brown anole lizards across seven populations in The Bahamas, we show that fatter is not fitter, at least when it comes to survival. Nor does natural selection tend to favor animals of intermediate condition, as might be expected if both skinny and obese lizards struggle to survive. Instead, natural selection favors large body size, at least in males. In fact, the only time that “fatter is fitter” seems to hold true is for the largest males in the population, who experience an extra boost in their probability of survival if they are also in high condition.

Image caption: A female brown anole, Anolis sagrei. Illustration by Amy Y. Zhang.

 

A place to hide: how nooks and crannies help species survive in new environments.

Daniel Barrios-O’Neill, Jaimie T. A. Dick, Mark C. Emmerson, Anthony Ricciardi and Hugh J. MacIsaac


These days, we humans find ourselves at the top of the food chain more often than not. Still, it’s interesting to reflect on what the world is like for the vast majority of the smaller inhabitants of the planet. For all but a few, danger abounds, and avoiding being eaten is a regular feature on the daily ‘to do’ list.

Ecologists have long observed that the structural complexity of the places animals inhabit — trees, rocks, reefs and almost anything with physical substance — is fundamentally important for the long-term survival of small creatures, especially those attempting to avoid hungry predators. Although the reasons for this seem simple enough, the situation is often complicated: while some aspects of structure can serve as obvious protection for prey, others can cause problems. For instance, while a single tree might provide camouflage and spaces too small for predators to access, it could also limit options for escape. And whilst the surface of a tree may appear smooth to a chimpanzee, to an ant it is a veritable maze of ravines.

In this study we approached the issue by focusing on a single component of structure, the availability of spaces too small for predators to access — the nooks and crannies. Our aim was to understand how small changes in the availability of nooks and crannies could influence the survival of prey. We used a successful invader of rivers and lakes in the British Isles, the Ponto-Caspian shrimp Chelicorophium curvispinum, as a prey and two larger shrimp species as predators. Our outcomes demonstrate that very small increases in available nooks and crannies can substantially increase the survival of the prey, and that the most telling positive effects on survival occur when prey are few in number. Increased survival at low numbers may allow prey to avoid localised extinction, and to colonise new areas.

These findings not only help us to understand how environmental architecture mediates the spread of invasive species, but also why the underside of that rock in your garden is crawling with creatures.

Image caption: Photo credit: Daniel Barrios-O'Neill.
