New working paper on uncertainty

I’m happy to announce that my reflection “Uncertainty as an Unavoidable Good” has been published in the Working Papers series of Bielefeld University’s Center for Uncertainty Studies: https://doi.org/10.4119/unibi/2983506

Abstract:

In digital history, uncertainty is generally regarded as an unavoidable evil: one aims to reduce, and ideally resolve, uncertainty in data as much as possible. However, information systems are not designed to handle the absence of information; we discuss how both SQL’s seemingly simple Null marker and the TEI Guidelines’ elaborate facilities for recording “certainty” fail to address the challenges posed by uncertainty. Neither big data nor a “digital historical positivism” offers a satisfactory answer: the causal models that underpin historical narratives do not simply emerge from a collection of facts. Here, it is necessary to distinguish between two types of uncertainty: historical uncertainty, which concerns the facts of the past, and historiographical uncertainty, which concerns the causal models constructed by historians. The latter results from differing interpretations of the causal relations between the facts; given our limited knowledge of the past, it is ultimately irreducible. But it is also this uncertainty that allows us to construct the narratives we need for sense-making. We argue that in this sense uncertainty may be regarded as an unavoidable good, and that we should aim to design computational frameworks that treat it as an asset rather than an obstacle.
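
The SQL part of the problem is easy to reproduce. Below is a minimal sketch (Python with the standard-library sqlite3 module; the table and names are invented for illustration and are not taken from the paper) showing how SQL’s three-valued logic lets a record with a Null value silently vanish from both a predicate and its complement:

```python
import sqlite3

# Toy prosopographical table: one person's birth year is simply unknown.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE person (name TEXT, birth_year INTEGER)")
con.executemany(
    "INSERT INTO person VALUES (?, ?)",
    [("Alice", 1502), ("Bob", None), ("Carol", 1511)],
)

# Under three-valued logic, NULL = 1502 and NULL <> 1502 both evaluate
# to UNKNOWN, which a WHERE clause treats as false.
born_1502 = con.execute(
    "SELECT name FROM person WHERE birth_year = 1502"
).fetchall()
not_born_1502 = con.execute(
    "SELECT name FROM person WHERE birth_year <> 1502"
).fetchall()

print(born_1502)      # [('Alice',)]
print(not_born_1502)  # [('Carol',)]  (Bob appears in neither result set)
```

Bob drops out of both complementary queries, and the single Null marker cannot distinguish whether his birth year is unknown, inapplicable, or not yet recorded; this conflation is the kind of failure the abstract alludes to.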