Cop, Code, and Conduct: A practice-based understanding of responsible policing in the algorithmic age

Research output: PhD Thesis – Research and graduation external

Abstract

This dissertation provides a practice-based understanding of responsible algorithmization in the critical government domain of policing. The rise of algorithmic systems is transforming our society. Algorithms, as explored in this dissertation, are not merely technical tools but complex systems deeply embedded in social, material, and organizational contexts. I use the term algorithmic systems to denote both the technological artefact and the wider sociomaterial system surrounding it. As organizations adopt algorithmic systems, organizational routines are fundamentally transformed. This process, known as algorithmization, has the potential to enhance efficiency and decision-making, but it also carries inherent risks due to algorithms' subjectivity and their potential to perpetuate existing biases. These concerns are particularly pressing in policing, where decisions made by algorithms can have a profound impact on individuals and communities. To navigate these challenges, it is crucial to implement these systems in a responsible manner by putting public values, such as fairness, transparency, and privacy, centre stage. I refer to this as responsible algorithmization. This dissertation explores responsible algorithmization within the Netherlands Police, offering a practice-based understanding of how algorithmic systems are designed and used in real-world policing. In a domain where algorithmic decisions can become a matter of life and death, safeguarding public values is essential. In a total of four ethnographic studies, I research the design and use of algorithms at the Netherlands Police. With regard to the design of algorithmic systems, the ethnographic work focuses on the data professionals within the Netherlands Police who create these systems, as well as the data they work with. Data professionals are motivated to safeguard public values but struggle to translate these into concrete design practices.
Police reports, which comprise an important data source, are shaped by street-level officers, organizational dynamics, and practical concerns, complicating responsible design. The research further examines the challenges that arise during the use of algorithmic systems, focusing on practices of working with a specific algorithmic camera system (MONOcam) as well as the organizational side of implementing responsible algorithmization practices. Placing public values centre stage during design is no guarantee of responsible use, as values may be renegotiated. Responsible algorithmization is not just about the design phase but requires an ongoing organizational effort to address these challenges through a combination of traditional bureaucratic responses, long-term organizational changes, and short-term pragmatic solutions. The dissertation highlights that responsible algorithmization is a fundamentally human endeavour. Actors across an organization influence how algorithms work in practice and how public values can be safeguarded. Responsibility must therefore be shared. The dissertation shows that responsible algorithmization is an ongoing, evolving process that requires continuous improvement and adaptation, as well as organizational commitment and effort. Rather than adopting a one-size-fits-all solution, organizations should work to embed responsible algorithmization into their culture, fostering collaboration and flexibility. By viewing algorithmic systems as dynamic and interconnected, this research provides valuable insights for public sector organizations seeking to integrate algorithms responsibly, ensuring that these technologies contribute positively to society.
Original language: English
Print ISBNs: 978-90-393-7839-7
Publication status: Published - 16 May 2025

