Diachronic interpretability and machine learning systems


  • Sylvie Delacroix, The University of Birmingham


Keywords: interpretability, ethical agency, concept drift, contestability, fairness


If a system is interpretable today, why would it not be just as interpretable in five or ten years' time? Years of societal transformation can negatively impact the interpretability of some machine learning (ML) systems for two types of reasons, both rooted in a truism: interpretability requires both an interpretable object and a subject capable of interpretation. This object-versus-subject perspective ties in with distinct rationales for interpretable systems: generalisability and contestability. On the generalisability front, a variety of transparency and explainability strategies have been put forward to ascertain whether the accuracy of an ML model holds beyond its training data. These strategies can blind us to the fact that what an ML system has learned may produce helpful insights when deployed in real-life contexts this year, yet prove useless when faced with next year's socially transformed cohort. On the contestability front, ethically and legally significant practices presuppose the continuous, uncertain (re)articulation of conflicting values. Without our continued drive to call for better ways of doing things, these discursive practices would wither away. Retaining such a collective ability calls for a change in the way we articulate interpretability requirements for systems deployed in ethically and legally significant contexts: we need to build systems whose outputs we are capable of contesting today, as well as in five years' time. This calls for what I term 'ensemble contestability' features.

Reply by Zachary C. Lipton, Carnegie Mellon University.



26 January 2022

How to Cite

Delacroix, Sylvie. 2022. “Diachronic Interpretability and Machine Learning Systems”. Journal of Cross-Disciplinary Research in Computational Law 1 (2). https://journalcrcl.org/crcl/article/view/9.