Rethinking Explainable Machines: The GDPR's "Right to Explanation" Debate and the Rise of Algorithmic Audits in Enterprise

Bryan Casey

Berkeley Technology Law Journal · 2019 · article
https://doi.org/10.15779/z38m32n986
ABDC rating: A
Evidence weight: 0.55

Abstract

The public debate surrounding the General Data Protection Regulation’s (GDPR) “right to explanation” has sparked a global conversation of profound social and economic significance. But from a practical perspective, the debate’s participants have gotten ahead of themselves. In their search for a revolutionary new data protection within the provisions of a single chapter of the GDPR, many prominent contributors to the debate have lost sight of the most revolutionary change ushered in by the Regulation: the sweeping new enforcement powers given to European data protection authorities (DPAs) by Chapters 6 and 8 of the Regulation. Unlike the 1995 Data Protection Directive that it replaced, the GDPR’s potent new investigatory, advisory, corrective, and punitive powers granted by Chapters 6 and 8 render DPAs de facto interpretive authorities of the Regulation’s controversial “right to explanation.” Now that the DPAs responsible for enforcing the right have officially weighed in, this Article argues that at least one matter of fierce public debate can be laid to rest. The GDPR provides a muscular “right to explanation” with sweeping legal implications for the design, prototyping, field testing, and deployment of automated data processing systems. The protections enshrined within the right may not mandate transparency in the form of a complete individualized explanation. But a holistic understanding of the interpretation by DPAs reveals that the right’s true power derives from its synergistic effects when combined with the algorithmic auditing and “data protection by design” methodologies codified by the Regulation’s subsequent chapters. Accordingly, this Article predicts that algorithmic auditing and “data protection by design” practices will likely become the new gold standard for enterprises deploying machine learning systems both inside and outside of the European Union.

24 citations


Cite this paper

https://doi.org/10.15779/z38m32n986

Or copy a formatted citation

@article{casey2019,
  title        = {{Rethinking Explainable Machines: The GDPR's ``Right to Explanation'' Debate and the Rise of Algorithmic Audits in Enterprise}},
  author       = {Casey, Bryan},
  journal      = {Berkeley Technology Law Journal},
  year         = {2019},
  doi          = {10.15779/z38m32n986},
}

Paste directly into BibTeX, Zotero, or your reference manager.



Evidence weight

0.55

Balanced mode · F 0.40 / M 0.15 / V 0.05 / R 0.40

F · citation impact      0.52 × 0.40 = 0.21
M · momentum             0.77 × 0.15 = 0.12
V · venue signal         0.50 × 0.05 = 0.03
R · text relevance †     0.50 × 0.40 = 0.20

† Text relevance is estimated at 0.50 on the detail page — for your query’s actual relevance score, open this paper from a search result.
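The evidence weight above is a linear combination of the four component scores under the "balanced mode" coefficients (F 0.40, M 0.15, V 0.05, R 0.40). A minimal sketch of that arithmetic, assuming only the values shown in the breakdown (the function name is illustrative, not part of Arbiter's API):

```python
def evidence_weight(f, m, v, r, coeffs=(0.40, 0.15, 0.05, 0.40)):
    """Weighted sum of citation impact (F), momentum (M),
    venue signal (V), and text relevance (R)."""
    cf, cm, cv, cr = coeffs
    return cf * f + cm * m + cv * v + cr * r

# Component scores from the breakdown above.
score = evidence_weight(f=0.52, m=0.77, v=0.50, r=0.50)
print(f"{score:.2f}")  # ≈ 0.55, matching the displayed evidence weight
```

Note that the per-component products shown on the page (0.21, 0.12, 0.03, 0.20) are rounded to two decimals, so summing the rounded terms (0.56) slightly overshoots the unrounded total (0.5485 ≈ 0.55).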