We are ruled, in part, by algorithms. They govern some of what we see and read, how we communicate and work; this is widely understood. What is less obvious is how we should situate this elementary truth in socio-historical terms, or even what an algorithm actually is. We might usefully begin by defining the electronic computer as a deterministic but flexible device for modelling rules by which sets of symbols are reliably transformed into other sets of symbols. The simplicity of this definition is pointedly deflationary, but the centrality of symbols in human social life meant that these conceptually simple machines could have dramatic implications when socially generalized. As they have come to mediate everyday activities and interactions, the social world has been subordinated to the typically opaque rules that they embody. The symbols by which we live and think are increasingly governed by machines operating according to someone else’s rules. When did this phenomenon begin? With Google’s famous PageRank algorithm and the rise of social media, or does it have an older provenance?
In her latest book, Rules, the historian of science Lorraine Daston has presented a novel thesis on the role of ‘rules’ in modernity. Born in 1951 in East Lansing, Michigan—a small university city about 100 miles west of Detroit—Daston studied history and science at Harvard, completing a PhD titled ‘The Reasonable Calculus: Classical Probability Theory, 1650–1840’ in 1979 under the supervision of the Newton scholar I. Bernard Cohen—a key figure in the emergence of the history of science as an academic field in the US. Published in book form with Princeton in 1988, Daston’s doctoral project traced the arc of mathematical probability theory through figures like Bernoulli, Condorcet and Laplace, who attempted to formalize the probabilistic rules underlying the judgements of the rational subject. In Condorcet’s hands, this formed the basis for ‘social mathematics’—an early attempt at social science. Founded on elegant formalisms that often led to absurd conclusions, this kind of probabilistic theory can, viewed from a certain angle, appear as an early ancestor of the post-war fetish for formalism and model-building in the Anglophone social sciences. Daston located its decline in the emergence of a statistical worldview that did not depend on discredited assumptions about the micro-level reasonableness of individual behaviour. ‘What does it mean to be rational?’ was her opening question; she is still grappling with the history of answers to it.
Daston’s early work contributed to a wave of scholarship on probability and statistics that played a notable role in post-Kuhnian history of science, running roughly from Ian Hacking’s pathbreaking Emergence of Probability in 1975 to its 1990 sequel, The Taming of Chance. Though technical, these topics proved anything but narrow, bearing upon—among other things—notions of inference and scientific method, insurance and the quantification of risk, the emergence of the modern state, and the eugenics movement. Foucault was an important—albeit distal—influence here, supplying a precedent for a kind of epistemic history, a history of the basic structures of knowledge. But while such thinking has often been philosophically informed, it has been little concerned with ‘French theory’: Hacking was an analytic philosopher turned historian; Daston set out as a historian of mathematics. In the 1980s she was a member of a research group at the University of Bielefeld assembled by the historian and philosopher of science Lorenz Krüger. She has been based in Germany ever since: in 1994, Krüger became founding director of the Max Planck Institute for the History of Science in Berlin, but when he died soon after, Daston assumed the directorship, remaining in that post until 2019. There she convened other scholars for collaborative research, leading to some collectively authored texts, including How Reason Almost Lost Its Mind (2013): a study of prominent attempts to redefine rationality in formal and mechanical terms in Cold War American economics, political science, psychology and sociology, which anticipated some themes of her latest book. Under Daston, the Institute’s Working Group has been associated with a ‘historical epistemology’ focused on fundamental categories of science such as objectivity and observation—an approach reflected in her own writing.
Published in 1998 and co-authored with Katharine Park, a historian of medieval and renaissance science, Wonders and the Order of Nature, 1150–1750 studied the role that wonders, marvels and prodigies—strange phenomena such as comets, monstrous births or a luminescent veal shank seen by the scientist Robert Boyle—played in bounding notions of natural order, until they became associated with civil and religious turbulence in the early modern period and were ultimately suppressed by Enlightenment intellectuals in favour of the regular and lawlike. Her most conceptually striking book is Objectivity (2007)—again co-authored, this time with the historian of physics and philosopher of science Peter Galison—which analysed changes in scientific image-making as exemplified in the atlases that have been central to defining individual specialisms over hundreds of years. In these images, Daston and Galison perceived shifts in the notion of scientific truth: from a truth-to-nature that aimed to capture essential characteristics—in the illustrations of Linnaean botanical classifications, for example—through an objectivity that attempted to obliterate the subjectivity of the scientist, often by mechanical means such as photography, to the trained judgement of credentialled specialists. If technology and changes in scientific labour were often prominent in these shifts, Daston and Galison were at pains to identify preceding changes in mentalité, rather than appealing to any simple transformation in what Marxists used to call the ‘base’. In 2019 she made a foray into philosophical anthropology with Against Nature, a short book on the relation of moral and natural order, which argued against transcendent notions of reason in favour of one ‘embedded in the specifics of the human sensorium’.
If Rules draws from the same toolkit, it is also a departure: a philosophical history that ‘hopscotches’ over the centuries since antiquity to construct an argument about the shifting relationships between rules and exceptions, universals and particulars. Based on a lecture series delivered at Princeton in 2014, it is more conversational in tone than much of her earlier work, but does not fully cross over into popular history of science and technology: conceptual and scholarly in temperament, various aspects of Daston’s argument require some degree of contextual knowledge to be properly understood. It has three declared aims: firstly, to shed light on ‘how mathematical algorithms intersected with political economy during the Industrial Revolution’; secondly, ‘to reconstruct the lost coherence of the category of rule that could for so long and apparently without any sense of contradiction embrace meanings that now seem antonymical to each other’—not just algorithm, law and regulation, but also model and paradigm; and thirdly, ‘to examine how rules were framed in order to anticipate and facilitate bridge-building between universals and particulars’—which is to say, Daston considers questions such as how general rules might historically have been related to specific cases. The argument is structured schematically by three oppositions: rules, according to Daston, can be thick or thin, flexible or rigid, general or specific. As history, the book is confined to the West, although other traditions are touched upon in places; as theory, it seems to aspire to a broader scope. In structure, it is partly thematic, partly chronological, moving forward through the centuries before cutting back again.
The ancient Greek word kanon, which referred to the rods and straightedges typically used in construction, was also applied to Pythagorean music theory, the sculptor Polykleitos’s specifications for the ideal male body, Ptolemy’s tables for astronomical computation, and physical architectural models; by the Hellenistic period, it was applied to exemplary orators and poets; early Christians used it to refer to the gospels and other scripture, to the decrees that ordered religious life, and ultimately to canon law. The Latin regula had much the same connotations, but also related to reasoning by precedent, in the context of Roman law. According to Daston, three principal semantic clusters can be perceived here: measurement and calculation; models or paradigms; laws and regulations. The puzzle that she sets out to solve in Rules is the relative eclipse of the second: while measurement and calculation, laws and regulations are still with us, the ancient notion of the model or paradigm persisted into early modernity only—according to Daston—to fall into neglect around the turn of the nineteenth century. In addition to changing dictionary definitions, Daston finds signs of this decline in much-discussed perplexities about the role of rules in Thomas Kuhn’s notion of scientific paradigms—can a paradigm be rendered completely explicit, for example?—and in Wittgenstein’s famous question of how it is possible to follow even mathematical rules without an infinite regress of interpretation, in which some meta-rule would be required to specify how to follow every rule; significant, too, is the answer he ultimately found in ‘customs’.
Since it has, according to Daston, become hard to understand this ancient concept of the model or paradigm, she sets about reconstructing it, starting with a case study of the Rule of Saint Benedict—a fifth- or sixth-century book of precepts for the communal living of Benedictine monks. Although fine-grained, these precepts depend upon the discretion of the abbot, who is himself supposed to exemplify life according to the Rule—he is ‘the rule of the Rule’. This is a matter not of following some rigid procedure, but of freely making nuanced distinctions; of moving from particular to particular via analogy, with a model as the basis; of honouring principles rather than following prescriptions to the letter. For Daston, the ancient ‘home’ of rules was in technê or ars: ‘fields guided by precepts but responsive to the vicissitudes of practice’, engaging both head and hand, form and matter, as opposed to the universal and necessary truths of epistêmê—though Aristotle recognized a continuum between these poles, with technê also involving ‘reasoning from causes and achieving some degree of generality’.