Abstract
Over the last decade, the importance of machine learning has increased dramatically in business and marketing. However, when machine learning is used for decision-making, bias rooted in unrepresentative datasets, inadequate models, weak algorithm designs, or human stereotypes can lead to low performance and unfair decisions, resulting in financial, social, and reputational losses. This paper offers a systematic, interdisciplinary literature review of machine learning biases as well as methods to avoid and mitigate these biases. We identified eight distinct machine learning biases, summarized these biases within the cross-industry standard process for data mining (CRISP-DM) to account for all phases of machine learning projects, and outlined twenty-four mitigation methods. We further contextualize these biases in a real-world case study and illustrate adequate mitigation strategies. These insights synthesize the literature on machine learning biases in a concise manner and point to the importance of human judgment for machine learning algorithms.
| Original language | English |
| --- | --- |
| Pages (from-to) | 93-106 |
| Number of pages | 14 |
| Journal | Journal of Business Research |
| Volume | 144 |
| Early online date | 7 Feb 2022 |
| DOIs | |
| Publication status | Published - May 2022 |
Bibliographical note
Funding Information: Dennis Herhausen (Ph.D., University of St.Gallen) is Associate Professor of Marketing at Vrije Universiteit Amsterdam. His research, teaching, and executive education revolve around the themes of digital communication, customer journeys and experience, multichannel management, digital capabilities, and social media management. His work has been funded by national and international research grants, has received several awards, and is published in the Journal of Marketing, Journal of Marketing Research, Journal of the Academy of Marketing Science, and Harvard Business Review, among others.
Publisher Copyright: © 2022 The Authors
Keywords
- Artificial intelligence
- Bias
- Case study
- Machine learning
- Mitigation methods