The Mycorrhizal organisation (part 1)

In Richard Powers' Pulitzer Prize-winning novel The Overstory, a botanist named Patricia Westerford makes a discovery that ruins her career. She finds that trees communicate. Through the air, they send chemical signals warning of insect attacks. Underground, vast networks of mycorrhizal fungi connect their root systems, allowing them to share nutrients, water, and information across hundreds of acres. A forest, Westerford realises, is not a collection of individual trees competing for resources. It is a single organism, thinking and coordinating through channels invisible to anyone looking only at what grows above ground.

In the novel, the scientific community rejects her work. Her papers are retracted. She loses her position. The rejection happens because her findings do not fit the way forestry has learned to see. Forests are timber. Trees are board feet. The measures that matter are the ones that show up in yield calculations. Everything else is noise.

Powers' novel is fiction, but the pattern it describes is not. It recurs wherever administrators with powerful tools attempt to improve complex systems by reducing them to what can be measured and optimised. The mycorrhizal networks, the chemical signals, the underground connections: these are precisely what gets stripped away when systems are made "legible" to central management. And the stripping away is precisely what causes those systems to fail.

The political scientist James C. Scott gave this pattern a name. In his 1998 book Seeing Like a State, Scott documented how states throughout history have simplified and standardised complex realities to make them administratively tractable. His term for this process was "legibility": the transformation of a messy, illegible world into one that can be centrally monitored and controlled.

Scott's central example was, fittingly, a forest.

In eighteenth-century Prussia, administrators developed what they called "scientific forestry." The approach was straightforward: clear the existing forest with its chaotic diversity of species, ages, and undergrowth, and replant with uniform rows of Norway spruce, all the same age, evenly spaced, easy to measure and harvest. The new forests looked magnificent on paper. Yield calculations became precise. Administrative efficiency soared.

Within a generation, the trees began to die.

The Germans coined a word for what followed: "Waldsterben", forest death. The monoculture plantations had eliminated precisely what Powers' fictional botanist would later discover: the understory vegetation, the soil-enriching processes, the pest-controlling insects, the mycorrhizal networks. Everything that didn't appear in the timber yield calculations turned out to be essential. The forest that looked optimal on the spreadsheet was actually dying.

Scott's insight was that the Prussian foresters had not merely made a technical error. They had enacted a particular way of seeing: one that made complex systems legible by stripping away the features that made those systems work. The simplified forest was easier to measure, easier to manage, easier to optimise. It was also no longer a forest in any meaningful sense.

Legibility is not confined to forestry; the same pattern appears in economics. In the late 1950s, Soviet mathematicians believed they had solved the problem that had plagued central planning since its inception: how to coordinate millions of production decisions without the price signals that market economies use. Leonid Kantorovich (the mathematician who would later win the Nobel Prize in Economics for developing linear programming) had shown that production problems could be solved optimally through mathematical methods. If you knew your constraints and your objectives, the algorithm would find the best solution. Viktor Glushkov (the Ukrainian cybernetician who directed the Institute of Cybernetics in Kiev) proposed linking every factory, every warehouse, every transportation hub into a single information system. The economy would become a vast optimisation problem, solved in real time by machines.
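To see the seduction of Kantorovich's idea, it helps to watch a linear programme solve a planning problem. Here is a minimal sketch using SciPy's `linprog`, with entirely invented numbers (two goods, two resource constraints); the point is only that, given stated objectives and constraints, the algorithm really does return a provably optimal plan:

```python
from scipy.optimize import linprog

# Toy planning problem (illustrative numbers, not from the essay):
# choose output levels x1, x2 of two goods to maximise planned value
#   maximise 3*x1 + 2*x2
# subject to resource limits
#   labour: x1 + x2   <= 4
#   steel:  2*x1 + x2 <= 5
#   x1, x2 >= 0
# linprog minimises, so we negate the objective coefficients.
c = [-3, -2]
A_ub = [[1, 1],   # labour constraint
        [2, 1]]   # steel constraint
b_ub = [4, 5]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])

print(res.x)    # optimal plan: produce 1 unit of good 1, 3 of good 2
print(-res.fun) # planned value at the optimum: 9
```

The mathematics is impeccable. The trouble, as the next paragraph shows, lies in the inputs: the coefficients in `c`, `A_ub`, and `b_ub` have to come from somewhere, and in the Soviet case they came from factory managers with every reason to misreport them.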

Francis Spufford's Red Plenty, a brilliant hybrid of historical fiction and economic history, captures the genuine excitement of this moment. The Soviet scientists, mathematicians and engineers believed they had found a way to make central planning work. The computers would make it possible. But the models required data, and the data was unreliable. Factory managers had every incentive to underreport capacity and overreport needs. The optimisations assumed fixed objectives, but objectives shifted constantly as political priorities changed. Most importantly, the plans assumed that production was a mechanical process that could be specified from above. But real production depended on informal networks: the factory floor relationships, the unwritten knowledge about which machines actually worked, the favours traded between managers to get parts that the official system couldn't deliver. These networks were the Soviet economy's mycorrhizal system. They kept things functioning despite the plans, not because of them.

Friedrich Hayek (the Austrian economist whose 1945 essay "The Use of Knowledge in Society" remains the clearest articulation of the knowledge problem) had identified the core issue a decade before the Soviet cybernetic project began. The knowledge required to coordinate economic activity, Hayek argued, is not the kind that can be collected and processed centrally. It is dispersed across millions of individuals, each of whom knows things about their particular circumstances that no planner could ever aggregate. The price system works because it transmits this dispersed knowledge without requiring anyone to possess it all.

The Soviet planners were trying to replace distributed intelligence with centralised computation. To do so, they had to make the economy legible: reducible to the variables their models could process. But the act of making it legible stripped out the informal networks and local knowledge that actually made production happen. Like the Prussian foresters, they optimised what they could see whilst neglecting what they could not.

Charles Goodhart (the British economist whose observation about monetary targets has become a general principle) identified one aspect of what goes wrong: when a measure becomes a target, it ceases to be a good measure. Factory managers rewarded for hitting production quotas will find ways to hit the quotas, whether or not this serves the underlying purpose. The measure loses its information content precisely because it has become a target. Donald Campbell (the psychologist whose parallel formulation is sometimes called Campbell's Law) put it more starkly: "The more any quantitative social indicator is used for social decision-making, the more subject it will be to corruption pressures and the more apt it will be to distort and corrupt the social processes it is intended to monitor."

Complex systems cannot be optimised through measurement and control without distorting the very things being measured. Every model ultimately simplifies. Every simplification excludes something. And what gets excluded is often what makes the system work: the underground networks, the informal knowledge, the adaptive complexity that resists reduction to variables.

In The Overstory, Patricia Westerford eventually writes a book called The Secret Forest that makes her earlier findings accessible to general readers. People begin to see forests differently. Not as collections of individual trees competing for sunlight, but as communities coordinating through channels that had always been there, invisible only because no one had learned to look.

The real-world science has followed a similar trajectory. Suzanne Simard's research on mycorrhizal networks is now widely accepted. We know that forests communicate, share resources, and coordinate behaviour through underground systems that the scientific foresters of the eighteenth century could not have detected and would not have valued if they had.

Which brings us to the present. Artificial intelligence represents an extraordinarily powerful set of legibility tools. Machine learning systems can track, measure, and optimise human behaviour at a scale and granularity that would have seemed like science fiction a decade ago. In workplaces, these systems are already measuring keystrokes, analysing communication patterns, and scoring employees on metrics that update in real time. Organisations, like forests, have their own underground networks: the informal relationships through which real information flows, the workarounds that people develop when official processes don't work, the tacit knowledge about how things actually get done.

In Part 2 of this article we’ll look at what AI-driven optimisation means for the mycorrhizal systems of organisational life.
