2023-02-20 · data

The myopia of metrics: when models go wrong (Part 10)

(This post is part of a series on working with data from start to finish.)

It may appear self-evident that we should avoid subscaling models, especially severely subscaling ones, which achieve parsimony only at a great loss in accuracy. Fortunately, within the realm of science, this is generally the case: accurate models are vigorously defended despite their attendant complexity. In the spheres of commerce and politics, however, accuracy is routinely crucified on the cross of parsimony.
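To make the tradeoff concrete, here is a minimal sketch in Python, with synthetic data and model choices that are purely illustrative: a parsimonious linear model and a more flexible ensemble fit to the same data, with cross-validation measuring how much accuracy the simpler model cedes.

```python
# A minimal sketch (synthetic data) of the parsimony-accuracy tradeoff:
# compare a parsimonious linear model against a more flexible ensemble
# and measure how much accuracy the simpler model gives up.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))
# A target with nonlinearity and an interaction the linear model cannot capture.
y = X[:, 0] + np.sin(3 * X[:, 1]) + 0.5 * X[:, 2] * X[:, 3] + rng.normal(scale=0.3, size=500)

simple = LinearRegression()                       # parsimonious: 6 parameters
complex_ = RandomForestRegressor(random_state=0)  # accurate: thousands of splits

for name, model in [("simple", simple), ("complex", complex_)]:
    r2 = cross_val_score(model, X, y, cv=5, scoring="r2").mean()
    print(f"{name}: mean cross-validated R^2 = {r2:.3f}")
```

The gap between the two scores is the accuracy being ceded for parsimony; whether that trade is worth making is exactly the question at issue.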

In the book Seeing Like a State, anthropologist James C. Scott traces the early modern European state’s quest to understand, systematize and control its terrain and citizenry in service of growing the state treasury. In one example, Scott relates how states came to view their forests not as diverse ecological landscapes but as monolithic breeding grounds for commercial timber. By carefully studying the conditions under which forests produced the most timber, states were able to formulate simplified, parsimonious and systematic rules of timber production. Soon, they reshaped forests to conform to those rules by slicing them into square grids, removing all underbrush, decimating animal habitats and evicting local inhabitants. Things which did not fit into the model did not belong in the forest.

“The actual tree with its vast number of possible uses was replaced by an abstract tree representing a volume of lumber or firewood,” Scott writes (12). The state could no longer see the trees, nor the animals or people who all fell below its line of representation. The state only saw timber and its handful of inputs. Such a simplified view of the world naturally facilitated scale: “Increasing order in the forest made it possible for forest workers to use written training protocols that could be widely applied. A relatively unskilled and inexperienced labor crew could adequately carry out its tasks by following a few standard rules in the new forest environment” (18).


This commercial model of forestry quickly became a paragon of efficient administration and rational thought, copied and scaled in region after region. At first, timber output boomed. But after around 100 years, it became increasingly apparent that the model was not sustainable. The diversity of fauna plummeted, the soil was starved of the nutrients once replenished by underbrush, and the uniformity of the forests made them particularly susceptible to pestilence and fire. The ecological “capital” stored for millennia beneath the ground was rapidly depleted. Although the model worked well for a time, it did not scale well over time.

When models are first conceived, their “fit” to the real world, or lack thereof, is an entirely theoretical construct. It is in operationalizing this model that we witness the pernicious effect of poor model fit. Random error becomes real-world error; unexplained variance becomes collateral damage.
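As a toy illustration of that translation, with every number invented: suppose a timber-yield forecast leaves part of each plot’s variance unexplained, and every mis-forecast cubic meter carries a planning cost. The residuals stop being a statistical abstraction and become a line item.

```python
# A toy illustration (all numbers invented) of unexplained variance
# becoming real-world cost once a model is operationalized.
import numpy as np

rng = np.random.default_rng(2)
actual_yield = rng.normal(loc=100.0, scale=15.0, size=1000)  # m^3 of timber per plot
predicted = 100.0 + 0.7 * (actual_yield - 100.0)             # imperfect forecast: 30% of each deviation goes unexplained

residual = actual_yield - predicted
cost_per_m3 = 40.0  # assumed cost of every mis-planned cubic meter
print(f"std of unexplained error: {residual.std():.1f} m^3 per plot")
print(f"expected planning cost:   {cost_per_m3 * np.abs(residual).mean():.0f} per plot")
```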

As an operating principle, we should be extremely skeptical of models which do not scale well. We might, for example, employ “scale cutoffs”, beyond which no more accuracy will be ceded. The pursuit of sweeping, parsimonious vision statements at the complete expense of accuracy is a hallmark of disconnected leadership, inefficient bureaucracies, and even elegant science which fails replication. When given the choice between parsimony and accuracy, we should always choose accuracy.
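One way such a cutoff might be implemented, sketched below with hypothetical candidates and an assumed 2% accuracy budget: rank models from most to least parsimonious, then refuse to cede more than a fixed amount of accuracy relative to the best available model.

```python
# A sketch of one possible "scale cutoff": among candidate models, pick the
# most parsimonious one whose accuracy is within a fixed budget of the best.
# The candidates and the 2% budget are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Candidate:
    name: str
    n_params: int    # proxy for parsimony
    accuracy: float  # held-out accuracy, however measured

def pick_with_cutoff(candidates: list[Candidate], max_ceded: float = 0.02) -> Candidate:
    """Return the most parsimonious model within `max_ceded` of the best accuracy."""
    best = max(c.accuracy for c in candidates)
    eligible = [c for c in candidates if best - c.accuracy <= max_ceded]
    return min(eligible, key=lambda c: c.n_params)

models = [
    Candidate("linear", 6, 0.81),
    Candidate("gbm-small", 400, 0.88),
    Candidate("gbm-large", 40_000, 0.89),
]
print(pick_with_cutoff(models).name)  # "gbm-small": cedes 0.01, not 0.08
```

The cutoff makes the tradeoff explicit and bounded, rather than leaving accuracy to be bargained away indefinitely in the name of simplicity.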

This does not mean we should willingly tolerate a lack of parsimony. In The Collapse of Complex Societies, Joseph Tainter makes the compelling argument that societies which do not deal with their complexity die from it. In “A Better Art Vocabulary”, Haley Thurston theorizes that it is in fact poor semantic compression, or a lack of parsimony, which generates the human perception of ugliness.

As with model building generally, we need to experiment with different models, both simple and complex, as well as different resolutions, both coarse-grained and fine-grained, to see what scales best. Too high a level of abstraction and you’ll miss the details; too low and you’ll miss the big picture. The best approach is to tinker with different models, constantly learning and adapting to the situation at hand.
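One sketch of this tinkering, using synthetic data and an arbitrary subsampling rule as a stand-in for “resolution”: evaluate the same simple and complex candidates at coarse and fine granularities and watch which one degrades.

```python
# A sketch of experimenting across resolutions: subsample the same signal
# at coarse and fine granularities and score a simple and a complex model
# at each. The signal and subsampling rule are assumptions for illustration.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import KFold, cross_val_score
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(1)
t = np.linspace(0, 10, 2000)
y = np.sin(t) + 0.1 * rng.normal(size=t.size)  # a nonlinear signal plus noise

cv = KFold(n_splits=5, shuffle=True, random_state=0)
for step in (50, 10, 1):  # coarse -> fine sampling of the same signal
    X_r, y_r = t[::step].reshape(-1, 1), y[::step]
    for name, model in [("linear", LinearRegression()),
                        ("tree", DecisionTreeRegressor(max_depth=4))]:
        score = cross_val_score(model, X_r, y_r, cv=cv).mean()
        print(f"step={step:>2} {name:>6}: mean R^2 = {score:.2f}")
```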
