Abstract
This article explores to what extent emotion recognition technologies represent an interference with the right to private life under the ECHR and the EU Charter of Fundamental Rights.
Emotion recognition consists of using AI techniques to infer human emotional states from facial (micro-)expressions, muscle movements, speech, or audio signals. Its applications range from commercial uses (driver fatigue detection, profiling for personalisation in smart environments, targeted advertising, political campaigning, movie rating, etc.) to state surveillance (lie detection at airports, fraud detection, public security, anti-terrorism, crime prevention). While AI experts continue to investigate the potential of emotion detection, psychologists and sociologists are divided, with many experts warning against it by highlighting the lack of scientific consensus on the very definition of emotion and the pollution of the field by pseudo-science.
The contribution of this article is twofold. First, it challenges the idea that AI-powered emotion recognition can be effectively regulated under the freedom of religion and thought: the vast ECtHR and CJEU case-law concerning the right to private life (especially when paired with that on personal data protection) offers a stronger baseline for protecting the innermost aspects of individual 'privateness'. Second, the article shows how the right to private life and the right to personal data protection apply to emotion recognition. It concludes by discussing which legal requirements and safeguards must be established for their concrete application to both commercial and state uses of emotion recognition.
Original language | English |
---|---|
Publication status | Published - Apr 2024 |
Event | BILETA - Dublin City University, Dublin, Ireland |
Duration | 18 Apr 2024 → 19 Apr 2024 |
Conference
Conference | BILETA |
---|---|
Abbreviated title | BILETA |
Country/Territory | Ireland |
City | Dublin |
Period | 18/04/24 → 19/04/24 |
Keywords
- Emotion Recognition Technology
- Privacy
- Human Rights