Document Type

Article

Publication Date

2022

DOI

10.3390/cryptography6030034

Publication Title

Cryptography

Volume

6

Issue

3

Pages

34 (1-14)

Abstract

Medical data is frequently highly sensitive in terms of data privacy and security. Federated learning, a type of machine learning technique, has been used to increase the privacy and security of medical data: the training data is distributed across numerous machines, and the learning process is performed collaboratively. However, there are numerous privacy attacks on deep learning (DL) models that attackers can use to obtain sensitive information. The DL model should therefore be safeguarded from adversarial attacks, particularly in medical data applications. Securing the model against an adversarial collaborator with homomorphic encryption is one answer to this challenge. This research presents a privacy-preserving federated learning system for medical data based on homomorphic encryption. The proposed technique employs a secure multi-party computation protocol to safeguard the deep learning model from adversaries. In this paper, the proposed approach is evaluated in terms of model performance using a real-world medical dataset.
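
To illustrate the core idea behind the abstract, the following is a minimal sketch of BFV-based encrypted aggregation of model weights, written with the TenSEAL library for Python. The library, the encryption parameters, the scaling factor, and the toy weight values are illustrative assumptions, not necessarily those used by the authors; the sketch only shows how an aggregator can sum clients' encrypted, quantized weight updates without decrypting them, while a key-holding party recovers the averaged plaintext.

import tenseal as ts

# BFV context; poly_modulus_degree and plain_modulus are illustrative parameters.
context = ts.context(ts.SCHEME_TYPE.BFV,
                     poly_modulus_degree=4096,
                     plain_modulus=1032193)

SCALE = 1000  # BFV operates on integers, so floating-point weights are quantized first.

def encrypt_update(weights):
    # Quantize a client's weight vector and encrypt it under BFV.
    quantized = [int(round(w * SCALE)) for w in weights]
    return ts.bfv_vector(context, quantized)

# Two toy client updates for the same model layer (hypothetical values).
client_updates = [
    encrypt_update([0.12, 0.34, 0.05]),
    encrypt_update([0.10, 0.30, 0.07]),
]

# The aggregator adds the ciphertexts without ever seeing the plaintext weights.
encrypted_sum = client_updates[0] + client_updates[1]

# A key-holding party decrypts the sum and computes the average.
decrypted = encrypted_sum.decrypt()
averaged = [v / (SCALE * len(client_updates)) for v in decrypted]
print(averaged)  # approximately [0.11, 0.32, 0.06]

In the setting described by the abstract, this kind of ciphertext-only aggregation is what keeps an adversarial collaborator (for example, the aggregation server) from inspecting individual clients' model parameters.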

Comments

© 2022 by the authors.

This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution 4.0 International (CC BY 4.0) license.

Original Publication Citation

Wibawa, F., Catak, F. O., Sarp, S., & Kuzlu, M. (2022). BFV-based homomorphic encryption for privacy-preserving CNN models. Cryptography, 6(3), 1-14, Article 34. https://doi.org/10.3390/cryptography6030034

ORCID

0000-0002-8719-2353 (Kuzlu)
