A Novel Privacy-Preserving Decentralized Federated Learning Approach for Incremental Healthcare Data
Abstract
Healthcare institutions face significant challenges in implementing machine learning solutions, particularly in environments with continuous data streams and strict privacy constraints. Traditional approaches struggle to balance effective model training with data privacy, especially when data volumes vary across institutions and models must continuously adapt to new medical knowledge. This paper presents a novel decentralized federated continual learning system that enables privacy-preserving collaboration among healthcare institutions without central coordination. Our approach combines Generative Adversarial Networks (GANs) and Deep Stacking Networks (DSNs) in a fully connected network topology, where each node employs a GAN for synthetic data generation and a DSN for classification. The system processes streaming data while preserving privacy through federated parameter sharing, allowing institutions to benefit from collective knowledge without exposing sensitive patient data. Experimental validation on the MIMIC-IV-ED dataset shows that our approach successfully addresses data volume disparities between institutions, enabling smaller healthcare centers to achieve performance comparable to that of larger institutions. The system matches the performance of state-of-the-art centralized approaches while providing crucial advantages in data privacy preservation, institutional collaboration, and dynamic data processing.
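The federated parameter sharing described above can be illustrated with a minimal sketch. This is an assumption-laden toy, not the paper's implementation: the abstract does not specify the aggregation rule, so simple element-wise averaging over a fully connected topology is used here, and the `Node` class and flat parameter vectors stand in for the per-node GAN/DSN pair.

```python
# Hypothetical sketch of one decentralized federated-averaging round over a
# fully connected topology. Only parameters are exchanged between nodes;
# raw patient data never leaves an institution. The aggregation rule
# (element-wise averaging) is an assumption for illustration.

from dataclasses import dataclass


@dataclass
class Node:
    """One institution: holds model parameters trained on local data."""
    params: list


def federated_round(nodes):
    """Each node broadcasts its parameters to all peers; every node then
    adopts the element-wise average. No central server is involved."""
    n = len(nodes)
    dim = len(nodes[0].params)
    avg = [sum(node.params[i] for node in nodes) / n for i in range(dim)]
    for node in nodes:
        node.params = list(avg)  # all nodes converge to the shared average
    return avg


# Three institutions whose local models diverge (e.g., due to differing
# data volumes); after one round they hold identical averaged parameters.
nodes = [Node([1.0, 2.0]), Node([3.0, 4.0]), Node([5.0, 6.0])]
consensus = federated_round(nodes)
# consensus is [3.0, 4.0]
```

In practice the exchanged parameters would be the GAN and DSN weights of each node, and smaller institutions would additionally train on GAN-generated synthetic samples to offset their limited local data.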
Copyright (c) 2025 ITEGAM-JETIA

This work is licensed under a Creative Commons Attribution 4.0 International License.