The GDPR aims to control the risks associated with the processing of personal data. It requires measures to minimise these risks and gives data subjects certain powers, such as the rights to be informed and to be forgotten. Big data is a relatively new technology that gives data controllers the power to observe the users of digital services continuously. This thesis therefore addresses the question of whether the GDPR is suited to averting the risks and power shifts associated with big data. To answer this question, the GDPR is compared to earlier EU legislation dealing with technological risks and power shifts. Additionally, the suitability of the GDPR's anti-discrimination provisions for preventing algorithmic discrimination is evaluated. Results: The GDPR is not based on any discernible analysis of the risks of big data. Methods from EU environmental protection law and consumer protection law, aimed at technological risks and power shifts, were not applied. This can make it more difficult to evaluate the GDPR's effectiveness and could stand in the way of developing a coherent body of case law. The conclusion proposes a number of guidelines for deciding court cases and points for evaluating the GDPR.
Publication status: Published - 12 Sep 2019