GuardML: Efficient Privacy-Preserving Machine Learning Services Through Hybrid Homomorphic Encryption


April 8–12, 2024 | Eugene Frimpong, Khoa Nguyen, Mindaugas Budzys, Tanveer Khan, Antonis Michalas
**Authors:** Eugene Frimpong, Khoa Nguyen, Mindaugas Budzys, Tanveer Khan, Antonis Michalas

**Abstract:** Machine Learning (ML) has emerged as a transformative domain in data science, but the growing number of malicious attacks on ML models raises serious privacy concerns. Privacy-Preserving Machine Learning (PPML) methods, such as Homomorphic Encryption (HE), have been introduced to address these concerns. However, traditional HE schemes are impractical for highly scalable scenarios due to their significant overheads and inefficiencies. Hybrid Homomorphic Encryption (HHE) combines the strengths of symmetric cryptography and HE, making the approach more accessible and efficient. This paper introduces HHE to ML by designing a PPML scheme tailored for end devices. The proposed approach leverages HHE to enable secure learning of classification outcomes over encrypted data while preserving the privacy of both the input data and the ML model. The real-world applicability of the construction is demonstrated through an HHE-based PPML application for classifying heart disease based on sensitive ECG data. Evaluations show a slight reduction in accuracy compared to plaintext data, but minimal communication and computation costs for both the analyst and the end devices, highlighting the practical viability of the approach.

**Keywords:**
- Hybrid Homomorphic Encryption
- Machine Learning as a Service
- Privacy-Preserving Machine Learning

**Contributions:**
1. Demonstrates the effective use of HHE to tackle PPML challenges.
2. Presents two formally designed protocols for processing encrypted data efficiently.
3. Shows the practicality of the protocols in a real-world ML scenario using a sensitive medical dataset.

**System Model:**
- **User:** Generates symmetric keys and encrypts data.
- **Cloud Service Provider (CSP):** Processes encrypted data and performs homomorphic operations.
- **Analyst:** Owns an ML model and requests predictions on encrypted data.
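The core HHE idea described above — the user encrypts data with a cheap symmetric cipher and encrypts only the symmetric key under HE, after which the CSP "transciphers" to obtain HE ciphertexts it can compute on — can be sketched in a few lines. This is a toy illustration under loud assumptions: a tiny-prime Paillier scheme stands in for the lattice-based HE the paper uses, and an additive one-time pad stands in for the paper's symmetric cipher; it is not secure and not the paper's actual construction.

```python
import math
import secrets

# Toy Paillier scheme standing in for the paper's HE layer.
# ASSUMPTION: tiny fixed primes -- illustration only, NOT secure.
P, Q = 1_000_003, 1_000_033

def keygen():
    n = P * Q
    lam = math.lcm(P - 1, Q - 1)
    mu = pow(lam, -1, n)            # valid because we use g = n + 1
    return n, (lam, mu)

def enc(n, m):
    n2 = n * n
    r = secrets.randbelow(n - 1) + 1
    return pow(n + 1, m % n, n2) * pow(r, n, n2) % n2

def dec(n, sk, c):
    lam, mu = sk
    return (pow(c, lam, n * n) - 1) // n * mu % n

# --- HHE flow ----------------------------------------------------------
# 1. User: encrypt the datum with a fast symmetric cipher (here an
#    additive one-time pad mod n) and encrypt only the key under HE.
n, sk = keygen()
m = 42                              # sensitive user datum
k = secrets.randbelow(n)            # symmetric key
sym_ct = (m + k) % n                # cheap on-device encryption
he_key = enc(n, k)                  # single HE encryption of the key

# 2. CSP: transcipher -- homomorphically subtract the symmetric key,
#    yielding Enc(m) without ever seeing m or k.
n2 = n * n
he_m = enc(n, sym_ct) * pow(he_key, -1, n2) % n2   # Enc(sym_ct - k) = Enc(m)

# 3. CSP can now compute on Enc(m); Paillier multiplication of
#    ciphertexts adds the underlying plaintexts.
he_sum = he_m * enc(n, 8) % n2      # Enc(m + 8)

assert dec(n, sk, he_m) == 42
assert dec(n, sk, he_sum) == 50
```

The point of the hybrid split is visible in step 1: the constrained device performs one HE encryption (of the short key) regardless of how much data it sends, while the bulk data only pays the cost of a symmetric cipher.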
**Protocols:**
- **2GML:** A two-party protocol between a CSP and a user.
- **3GML:** An extended version for a set of users, a CSP, and an analyst.

**Security Analysis:**
- Proves the security of the protocols against ciphertext-substitution attacks and unauthorized access to the ML model.

**Experiments:**
- Evaluates the PPML application on a sensitive heartbeat (ECG) dataset.
- Measures computational and communication costs for different scenarios.
- Compares performance with a plain BFV scheme.

**Conclusion:** This paper presents a realistic solution for PPML using HHE, balancing ML functionality and privacy. The approach overcomes the limitations of traditional PPML methods, making it suitable for constrained devices and a wide range of applications.
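The prediction step in the protocols above — the analyst's model evaluated by the CSP over encrypted user data, so that only a classification outcome is ever decrypted — can be sketched with an encrypted linear classifier. As before, a toy Paillier scheme stands in for the paper's HE scheme, and the weight vector is a made-up example, not the paper's actual heart-disease model.

```python
import math
import secrets

# Toy Paillier scheme (tiny fixed primes -- illustration only, NOT secure).
P, Q = 1_000_003, 1_000_033

def keygen():
    n = P * Q
    lam = math.lcm(P - 1, Q - 1)
    return n, (lam, pow(lam, -1, n))

def enc(n, m):
    n2 = n * n
    r = secrets.randbelow(n - 1) + 1
    return pow(n + 1, m % n, n2) * pow(r, n, n2) % n2

def dec(n, sk, c):
    lam, mu = sk
    return (pow(c, lam, n * n) - 1) // n * mu % n

# User: encrypt an integer feature vector (e.g. ECG-derived values).
n, sk = keygen()
features = [3, 7, 2]
enc_x = [enc(n, x) for x in features]

# CSP: evaluate the analyst's linear model w.x + b homomorphically.
# In Paillier, Enc(x)^w = Enc(w*x) and ciphertext products add plaintexts.
w, b = [5, 1, 4], 10                # hypothetical model weights and bias
n2 = n * n
score_ct = enc(n, b)
for ci, wi in zip(enc_x, w):
    score_ct = score_ct * pow(ci, wi, n2) % n2

# Key holder decrypts only the score, never the individual features.
score = dec(n, sk, score_ct)        # 5*3 + 1*7 + 4*2 + 10 = 40
assert score == 40
```

Note the asymmetry this flow buys: the CSP never learns the features, and the user never learns the model weights, matching the paper's goal of protecting both the input data and the ML model.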