Zombie Phlox, and Pseudoflowers

With “zombie fungus” currently in the public eye, thanks to the popularity of “The Last of Us”, let me introduce you to a botanical counterpart that’s easily observed in the Boise Front in spring.  As is so often the case, what nature comes up with can be more bizarre than what human minds could conceive of on their own, which is why we often turn to the natural world for inspiration in our fiction, and even our technology.

The two plants in this photo are in all likelihood different shoots of the same plant, connected by underground branches.  The one on the right is a normal early-season shoot of phlox, either prickly-leaf phlox (Phlox aculeata) or long-leaf phlox (Phlox longifolia).  The two closely related species are relatively distinct and easy to tell apart in the portions of their respective ranges that don’t overlap, but a full gamut of morphological intermediates occurs in the Boise Front, and young shoots with leaves just developing are particularly problematic.

In contrast, the shoot on the left is so different that it appears to be an unrelated species, with much wider, yellowish leaves.  These differences are caused by a rust fungus that has infected this shoot, hijacked its metabolic controls, and forced the plant to grow into a form that benefits this fungus, not the plant.  I’m guessing the rust fungus in question is a species of Puccinia, though if there are any existing studies on this particular rust/flower combination, I’m not aware of them.

What intrigues me most is the similarity of the resultant morphology to a well-studied counterpart in some other genera, notably rockcresses (Boechera and Arabis).  Rust-infected shoots in these genera terminate in conspicuous yellow clusters of modified leaves that not only look enough like flowers to attract flies and other pollinators, but can even produce fragrances and sugary nectars (see “Fungus Is a Flowerlike Con Artist”).  It is probably no coincidence that these “pseudoflowers” can most often be found in early spring, when actual flowers are still in short supply, and when buttercups (Ranunculus) and other bright yellow flowers are particularly noticeable.

Fungus-infected phlox with spores ready for dispersal. Pink flowers in background are on uninfected shoots.

The set-up works great for the fungus, which takes advantage of the existing flower-pollinator collaborative system for dispersing its own spores.  These spores, and usually(?) a sweet nectar, are produced in abundance in tiny cups on the undersides of the modified leaves.  Flies, which are often the most abundant early spring pollinators, seem perfectly satisfied with the pseudoflower nectar, and may even prefer it over whatever the real flowers that are in bloom have to offer.

The loser, alas, is the poor infected plant itself.  Not only is the modified shoot prevented from blooming, but even normal-looking adjacent shoots appear to be less likely to develop flowers, at least in the phlox I’ve been observing in the Boise Front.  My guess is that energy and nutrients are diverted to the greedy zombie shoots via underground stems, depleting what would otherwise be available to the rest of the plant.  It’s even possible that other spring-blooming species lose out, if pollinators are in short supply and preferentially visit the scamming pseudoflowers.

Of course, a major difference between zombies in pop culture and zombie-esque fungi in real life is that the latter don’t end up threatening the continued existence of the host species, either by directly killing hosts off or by simply disrupting their reproductive capacity.  Any fungus or other parasite that wiped out its host species would end up wiping itself out as well; while we can’t be certain this never happens, it does mean that any fungus so poorly adapted would quickly disappear after a very brief existence.  In the case of phlox, although a few shoots in a population might be infected, and some individual plants weakened beyond recovery, the population itself generally continues just fine, blooming merrily away as the season progresses.  And by peak bloom, the zombie shoots have often finished their fungal reproduction and disappeared, waiting to reappear in the sequel year.

long-leaf phlox (Phlox longifolia)
prickly-leaf phlox (Phlox aculeata)

The Existential Threat of Thatch

Sagebrush buttercup struggling through intermediate wheatgrass thatch.

Given that this website is based on blogging software, I thought I’d try an actual blog post, inspired by this poor struggling sagebrush buttercup (Ranunculus glaberrimus) that I photographed on this morning’s walk.  It was one of the few lonely survivors in a small isolated population along the Corrals Trail, not far north of the east end of Bob’s Trail.  This is a surprisingly popular trail for hikers as well as cyclists, given that it is nearly 10 miles to do as a loop (rather than the wimpy out-and-back from the 8th Street Road that I opted for), or so I was told by one set of hikers.

It was a fine day to be out in the central Foothills, especially after an agonizingly protracted winter.  However, while I was certainly sharing the general enjoyment of the expansive views and of just being in nature that my fellow trail users were presumably relishing, I was also suffering the curse of being a botanist, incapable of closing my eyes to the biological poverty and degradation of too much of the area I was walking through.  I refer to this portion of the Foothills as “Low-Diversity Mid-Elevation Pastureland”, characterized by a dominance of non-native perennial grasses and only a smattering of the most common native wildflowers.

The reasons for this biological impoverishment began with the intense unregulated grazing by cattle and sheep in the early days of Euroamerican settlement, which both reduced the more vulnerable wildflowers and created prime conditions for the spread of non-native annual grasses such as cheatgrass (Bromus tectorum). Although grazing became moderated as the Foothills were carved up among privately owned ranches and leased areas of public lands, management of these lands was specifically as pastureland for livestock and big game animals (aka “range improvement”). Erosion control has also been a major concern, especially in the aftermath of devastating fires that repeatedly swept across the Foothills in late summer, driven by hot dry winds (and increasingly sparked by human carelessness). The default management tool for both grazing purposes and erosion control/post-fire rehabilitation has been seeding with a selection of perennial grasses that thrive in the Boise Front.  Selection of species and cultivars has been based on performance and availability; only recently have native species been promoted as the desirable option, and even then a local gene pool is rarely an option (WAY too costly!).

In the Boise Foothills, one of the most commonly and widely planted grasses for range improvement and post-fire rehabilitation is intermediate wheatgrass (Thinopyrum intermedium, alternatively treated as Agropyron or Elytrigia). Although sometimes mistakenly assumed to be native, intermediate wheatgrass was introduced from Eurasia in the early 1900s and is now one of the most esteemed grasses in land management arsenals.

Unfortunately, the features that make intermediate wheatgrass superb for the intended uses can also come at the expense of native diversity, not only wildflowers but also mosses, lichens, many insects, and even some smaller mammals and birds.  This is because this Eurasian species does so well that it displaces and replaces the shrub-steppe habitat that our unique native plant, animal, fungal, and microbial diversity evolved in and is adapted to. The characteristic feature of the shrub-steppe habitat is an open scattering of shrubs, tufted bunch-grasses, and tufted forbs (i.e., wildflowers) in a matrix that we tend to dismiss as “bare ground”, but which is actually a complex micro-ecosystem of mosses, lichens, and microbes called biotic crust.  This crust provides an essential germination site and nursery for both annual and perennial wildflowers, which in turn provide critical food and other requirements for many animal species.  The biotic crust most likely plays an integral role in nutrient cycling and water retention as well, though this is still under investigation.

Enter intermediate wheatgrass, which differs from our native bunch-grasses in being an aggressively rhizomatous perennial, capable of forming a solid turf that fills in the “bare ground” and eliminates the previous matrix habitat.  Some of the more hardy native perennials, such as the buttercup in the above photo, may persist for a number of years, but the population will eventually wink out if suitable conditions for seedling germination and development no longer exist.  And the previously established mature plants might also face a shortened life span, in the face of competition for light and nutrients from such a successful invader. The story is different for species that co-evolved with turf grasses, especially in areas with summer rainfall such as the Great Plains and African savannas, but this is not that habitat, or our story.

To add insult to injury, the dried thatch of non-native grasses evidently accumulates more than that produced by native grasses, creating the conditions in the above photo.  If there is any research on why this happens, it’s not something I’m familiar with; I can’t help but suspect, however, that our native decomposers are not well-equipped to deal with Eurasian grasses, for whatever reason. This could change over time, given how evolution operates, but for now this is the situation we have.

Whether there is any hope for restoring native diversity in areas already dominated by intermediate wheatgrass and other non-native grasses is a problematic question.  Unsurprisingly, the standard range-management solution to thatch build-up is . . . more grazing!  While intensive grazing will indeed reduce thatch, it comes with costs that are seldom mentioned.  For a start, while there are indeed some less palatable native species that thrive under moderate grazing pressure, others (like our only native peony, Paeonia brownii) are preferentially eaten as “cattle candy” and will quickly disappear from even moderately grazed sites. Browsers like goats can be even worse, tending to eat dried grass only after more desirable plants, including bitterbrush (Purshia tridentata), have been decimated.  Furthermore, the trampling can be devastating to the critical biotic crust, especially on steep hillsides.  The benefits ascribed to herds of large herbivores in summer-rain grassland habitats do not automatically carry over to semi-arid shrub-steppe habitats, which evolved with smaller and sparser herbivores (and without having to compete with aggressive non-natives).

So, what, if anything, can be done?  Paradoxically, my primary response is to say:  do nothing.  This advice is mainly intended to counter our deep-rooted tendency to do SOMETHING:  to continue modifying declining habitat to suit our insatiable needs, or to implement drastic remediation measures that often cause more harm than simply allowing burned or otherwise impacted habitat to recover on its own, even when there is no significant threat to human lives or property.  Stop planting more intermediate wheatgrass and other non-native species, at least as the one-size-fits-all solution to just about everything. Beyond that, follow a Hippocratic oath for ecological practitioners, beginning with “Do no harm”.  Proceed cautiously and tailor treatment to individual situations, taking note of what works and what doesn’t.  Above all, pay attention to the entire “patient” (i.e., the full diversity of a unique biological community), not just the “disease”; outcomes where “the operation was a success, but the patient died” are not success stories.  Cherish the buttercups that still remain in the Foothills, and give them and the rest of our bountiful native diversity the best chance possible to continue sharing this wonderful area we all call home.