A full house: applying the GDPR to vocal assistants

Research output: Chapter in Book / Report / Conference proceeding › Conference contribution › Academic › peer-review


Abstract

Vocal assistants such as Alexa or Google Assistant are popular, easy to use, and so convenient they are almost addictive. They are placed inside the home, the sacred precinct of the private sphere, yet their functionality depends entirely on data collection, processing, and profiling. This article analyses vocal assistants from two angles. On the one hand, it shows how vocal assistants create new apertures in the private sphere, making it more permeable and transparent to corporate surveillance. They do so by leveraging their sensors and seamless vocal interaction, but also persuasive design techniques that prompt individuals to use them more and thereby share more data. On the other hand, the article discusses how selected provisions of the GDPR can be applied to vocal assistants to mitigate the effects Alexa & company have on the opacity and permeability of the private sphere. Particular attention is given in this regard to, among others, the role of consent, the application of privacy by design, and the provisions concerning automated decisions and profiling. The inherent features of vocal assistants appear difficult to reconcile with these provisions, and new problematic aspects emerge with regard to all of them. Possible solutions to these new challenges are, however, on the horizon, and are discussed in the conclusions of the article.
Original language: English
Title of host publication: BILETA 2021
Publication status: Unpublished - 2021

Keywords

  • GDPR
  • smart speakers
  • data protection
