A Framework to Optimize the Energy Cost of Securing Neural Network Inference
| Authors | |
|---|---|
| Publication date | 2024 |
| Book title | IEEE Congress on Cybermatics: 2024 IEEE International Conferences on Internet of Things (iThings), IEEE Green Computing and Communications (GreenCom), IEEE Cyber, Physical and Social Computing (CPSCom), IEEE Smart Data (SmartData) |
| Book subtitle | Cybermatics 2024: iThings 2024 GreenCom 2024 CPSCom 2024 SmartData 2024 : 19-22 August 2024, Copenhagen, Denmark : proceedings |
| ISBN | |
| ISBN (electronic) | |
| Event | IEEE Congress on Cybermatics 2024 |
| Pages (from-to) | 339–346 |
| Publisher | Los Alamitos, California: IEEE Computer Society |
| Organisations | |
| Abstract |
With the rise of deep neural networks (NNs), Machine Learning as a Service (MLaaS) has gained significant attention. In MLaaS, a service provider offers inference with a pre-trained NN to clients, allowing them to obtain the inference output without the need for computationally intensive training. However, MLaaS introduces data privacy and security concerns for both clients and service providers. Clients’ sensitive data in the input and output must be kept private, and the NN representing the service provider’s intellectual property should not be shared with clients. Secure Neural Network Inference (SNNI) addresses these concerns by ensuring that the client learns only the output and the service provider remains oblivious to the input and output. Many SNNI approaches proposed in recent years are based mainly on advanced cryptographic techniques such as Secure Multiparty Computation (MPC) and Homomorphic Encryption (HE). These approaches incur an overhead due to the underlying cryptographic primitives, making them significantly more compute-intensive and thus more energy-hungry than conventional inference. Optimization approaches for SNNI have mainly focused on maximizing accuracy and minimizing execution time. However, amidst growing climate concerns, energy consumption becomes a crucial aspect when determining optimal deployments of SNNI. Thus, a comprehensive investigation into energy-friendly SNNI approaches remains an open challenge. We design and develop a framework for determining the optimal deployment of SNNI. Given a machine learning (ML) inference task, the framework selects the SNNI approach and NN that minimize energy consumption while satisfying additional constraints, such as the desired level of accuracy and execution time.
This knowledge-based framework distills information from experiments involving combinations of SNNIs and NNs, subsequently identifying the best deployment option, i.e., the (near-)optimal choice of SNNI and NN, based on the client’s request.
|
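The selection step the abstract describes, picking the (SNNI, NN) combination with the lowest measured energy that still meets the client's accuracy and execution-time constraints, can be sketched as constrained minimization over a knowledge base of prior experiments. The sketch below is illustrative only; the class, field names, and numbers are assumptions, not the paper's actual data model or measurements.

```python
# Hedged sketch of knowledge-based (SNNI, NN) deployment selection.
# All names and figures below are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Deployment:
    snni: str          # e.g. an MPC- or HE-based protocol
    nn: str            # pre-trained network architecture
    energy_j: float    # measured energy per inference (joules)
    accuracy: float    # measured accuracy
    latency_s: float   # measured execution time (seconds)

def select_deployment(knowledge_base, min_accuracy, max_latency_s):
    """Return the measured deployment with the lowest energy that
    satisfies the client's accuracy and execution-time constraints,
    or None if no combination qualifies."""
    feasible = [d for d in knowledge_base
                if d.accuracy >= min_accuracy and d.latency_s <= max_latency_s]
    if not feasible:
        return None
    return min(feasible, key=lambda d: d.energy_j)

# Hypothetical knowledge base distilled from prior experiments:
kb = [
    Deployment("HE-based",  "LeNet",     energy_j=120.0, accuracy=0.98, latency_s=30.0),
    Deployment("MPC-based", "LeNet",     energy_j=45.0,  accuracy=0.98, latency_s=8.0),
    Deployment("MPC-based", "ResNet-18", energy_j=300.0, accuracy=0.93, latency_s=60.0),
]

best = select_deployment(kb, min_accuracy=0.95, max_latency_s=10.0)
print(best.snni, best.nn)  # MPC-based LeNet
```

With these toy numbers, the HE-based option is filtered out by the latency bound and ResNet-18 by the accuracy bound, leaving the MPC-based LeNet as the lowest-energy feasible choice.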
| Document type | Conference contribution |
| Language | English |
| Published at | https://doi.org/10.1109/ithings-greencom-cpscom-smartdata-cybermatics62450.2024.00073 |
| Other links | https://www.proceedings.com/77138.html |
| Downloads | A_Framework_to_Optimize_the_Energy_Cost_of_Securing_Neural_Network_Inference (Final published version) |
| Permalink to this page | |
