HedgeCut: Maintaining Randomised Trees for Low-Latency Machine Unlearning
| Authors | |
|---|---|
| Publication date | 2021 |
| Book title | SIGMOD '21 |
| Book subtitle | Proceedings of the 2021 International Conference on the Management of Data: June 20–25, 2021, virtual event, China |
| ISBN (electronic) | |
| Event | 2021 International Conference on the Management of Data |
| Pages (from-to) | 1545–1557 |
| Publisher | New York, NY: Association for Computing Machinery |
| Organisations | |
| Abstract | Software systems that learn from user data with machine learning (ML) have become ubiquitous in recent years. Recent legislation such as the "General Data Protection Regulation" (GDPR) requires organisations that process personal data to delete user data upon request (enacting the "right to be forgotten"). However, this regulation not only requires the deletion of user data from databases, but also applies to ML models that have been learned from the stored data. We therefore argue that ML applications should allow users to unlearn their data from trained models in a timely manner. We explore how fast this unlearning can be done under the constraints imposed by real-world deployments, and introduce the problem of low-latency machine unlearning: maintaining a deployed ML model in-place under the removal of a small fraction of training samples without retraining. We propose HedgeCut, a classification model based on an ensemble of randomised decision trees, which is designed to answer unlearning requests with low latency. We detail how to efficiently implement HedgeCut with vectorised operators for decision tree learning. We conduct an experimental evaluation on five privacy-sensitive datasets, where we find that HedgeCut can unlearn training samples with a latency of around 100 microseconds and answers up to 36,000 prediction requests per second, while providing a training time and predictive accuracy similar to widely used implementations of tree-based ML models such as Random Forests. |
| Document type | Conference contribution |
| Note | With supplementary material |
| Language | English |
| Published at | https://doi.org/10.1145/3448016.3457239 |
| Downloads | 3448016.3457239 (Final published version) |
| Supplementary materials | |
| Permalink to this page | |
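The abstract's core idea, removing individual training samples from a deployed tree ensemble in-place rather than retraining, can be illustrated with a minimal sketch. This is not the paper's implementation (HedgeCut uses extremely randomised trees with vectorised operators and split-variance bookkeeping); it only shows, under simplified assumptions, why unlearning in a randomised tree can cost O(depth): each tree stores per-leaf class counts, and forgetting a sample just decrements the counts in the leaf that sample falls into. All class and function names here (`RandomStump`, `Ensemble`, `unlearn`) are hypothetical.

```python
# Hypothetical sketch of in-place unlearning in a randomised tree ensemble.
# Each depth-1 tree keeps class counts per leaf; unlearning a sample only
# decrements the counts in its leaf, so no retraining is needed.
import random


class RandomStump:
    """A depth-1 randomised tree: one random split, class counts per leaf."""

    def __init__(self, X, y, rng):
        self.feature = rng.randrange(len(X[0]))
        values = [row[self.feature] for row in X]
        self.threshold = rng.uniform(min(values), max(values))
        self.counts = [{}, {}]  # leaf index -> {label: count}
        for row, label in zip(X, y):
            leaf = self._leaf(row)
            self.counts[leaf][label] = self.counts[leaf].get(label, 0) + 1

    def _leaf(self, row):
        return 0 if row[self.feature] <= self.threshold else 1

    def unlearn(self, row, label):
        # Forget one training sample in O(depth): decrement its leaf count.
        self.counts[self._leaf(row)][label] -= 1

    def predict(self, row):
        leaf_counts = self.counts[self._leaf(row)]
        return max(leaf_counts, key=leaf_counts.get) if leaf_counts else None


class Ensemble:
    """Majority vote over randomised stumps; unlearning touches every tree."""

    def __init__(self, X, y, n_trees=25, seed=0):
        rng = random.Random(seed)
        self.trees = [RandomStump(X, y, rng) for _ in range(n_trees)]

    def unlearn(self, row, label):
        for tree in self.trees:
            tree.unlearn(row, label)

    def predict(self, row):
        votes = {}
        for tree in self.trees:
            pred = tree.predict(row)
            if pred is not None:
                votes[pred] = votes.get(pred, 0) + 1
        return max(votes, key=votes.get)
```

Because an unlearning request never revisits the training set, its latency depends only on tree depth and ensemble size, which is the property that lets the actual system answer such requests in microseconds.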
