Pruning by leveraging training dynamics
| | |
|---|---|
| Authors | |
| Publication date | 2022 |
| Journal | AI Communications |
| Volume | 35 |
| Issue number | 2 |
| Pages (from-to) | 65-85 |
| Number of pages | 21 |
| Organisations | |
| Abstract | We propose a novel pruning method which uses the oscillations around 0, i.e. sign flips, that a weight has undergone during training in order to determine its saliency. Our method can perform pruning before the network has converged, requires little tuning effort due to having good default values for its hyperparameters, and can directly target the level of sparsity desired by the user. Our experiments, performed on a variety of object classification architectures, show that it is competitive with existing methods and achieves state-of-the-art performance for levels of sparsity of 99.6% and above for 2 out of 3 of the architectures tested. Moreover, we demonstrate that our method is compatible with quantization, another model compression technique. For reproducibility, we release our code at https://github.com/AndreiXYZ/flipout. |
| Document type | Article |
| Note | In special issue: Highlights of AI Research in Europe. |
| Language | English |
| Published at | https://doi.org/10.3233/AIC-210127 |
| Other links | https://github.com/AndreiXYZ/flipout |
| Permalink to this page | |
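The abstract's core idea can be sketched in a few lines: count how often each weight flips sign across training snapshots, then prune enough weights to hit a user-chosen sparsity level. This is an illustrative reading only, not the paper's implementation (see the linked repository for that); the function names are hypothetical, and it assumes that weights oscillating around 0 most often are treated as least salient.

```python
import numpy as np

def count_sign_flips(weight_history):
    """Count sign flips per weight over a (steps, n_weights) history of snapshots."""
    signs = np.sign(weight_history)
    return np.sum(signs[1:] != signs[:-1], axis=0)

def prune_by_flips(weights, flips, sparsity):
    """Zero out the fraction `sparsity` of weights with the most sign flips."""
    k = int(round(sparsity * weights.size))
    if k == 0:
        return weights.copy()
    most_oscillating = np.argsort(flips)[-k:]  # indices of the k flippiest weights
    pruned = weights.copy()
    pruned[most_oscillating] = 0.0
    return pruned

# Toy example: three training snapshots of five weights.
history = np.array([
    [0.5, -0.1,  0.2, -0.3,  0.05],
    [0.6,  0.1, -0.2, -0.4, -0.05],
    [0.7, -0.1,  0.2, -0.5,  0.05],
])
flips = count_sign_flips(history)                      # [0, 2, 2, 0, 2]
pruned = prune_by_flips(history[-1], flips, sparsity=0.4)
```

With a 40% sparsity target on five weights, two of the three oscillating weights are zeroed while the stable weights (indices 0 and 3) survive, matching the abstract's claim that sparsity can be targeted directly.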