Forget Me If You Can: Auditing User Data Revocation in Recommendation Systems
Z. Q. Zhu et al.
Abstract
Personalized recommendation systems power today’s e-commerce and streaming platforms, but they also raise regulatory concerns about user privacy. Under regulations such as the General Data Protection Regulation (GDPR), individuals have the “right to be forgotten.” While companies can remove user records from databases, it remains unclear whether trained AI models truly forget the behavioral patterns or preferences learned from that data. If they do not, platforms may continue profiling users even after deletion requests. This study introduces RecAudit, a practical auditing tool that tests whether a recommender system has genuinely removed a user’s influence from its trained model. RecAudit acts as an independent verification mechanism that identifies users whose behavioral traces persist after data deletion. Across multiple real-world datasets, RecAudit substantially outperforms existing auditing and membership inference methods. By enabling organizations to detect high-risk cases and target corrective machine unlearning efforts, RecAudit provides a concrete technical pathway to operationalize data revocation rights. The framework supports platform operators and regulators in strengthening accountability and ensuring AI systems comply with evolving data protection laws.
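The core audit idea, detecting users whose influence persists after deletion, can be illustrated with a minimal membership-inference-style check. The sketch below is a hypothetical toy, not RecAudit's actual method: it flags a user whose deleted items still receive anomalously high scores from the supposedly unlearned model, relative to a null distribution of scores from users who never interacted with those items. The function name, threshold, and z-score rule are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def audit_user(scores_unlearned, scores_baseline, threshold=2.0):
    """Flag a user whose deleted items still score suspiciously high.

    scores_unlearned: scores the post-unlearning model assigns to the
        user's deleted items.
    scores_baseline: scores the model assigns to those same items for
        users who never interacted with them (null distribution).
    Returns True if residual influence is detected (hypothetical
    z-score decision rule, not RecAudit's actual test statistic).
    """
    mu = scores_baseline.mean()
    sigma = scores_baseline.std() + 1e-8  # avoid division by zero
    z = (scores_unlearned.mean() - mu) / sigma
    return bool(z > threshold)

# Toy data: a properly forgotten user vs. one with residual influence.
baseline = rng.normal(0.0, 1.0, size=1000)       # null score distribution
forgotten_user = rng.normal(0.0, 1.0, size=20)   # indistinguishable from null
leaky_user = rng.normal(3.0, 1.0, size=20)       # scores shifted upward

print(audit_user(forgotten_user, baseline))  # False: no residual trace
print(audit_user(leaky_user, baseline))      # True: influence persists
```

In a real audit, the baseline distribution would come from shadow users or held-out populations rather than synthetic Gaussians, and the flagged users would be candidates for targeted corrective unlearning.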