TY - JOUR
T1 - DFedKG: Diffusion-Based Federated Knowledge Graph Completion
T2 - Data Science and Engineering
AU - Zhao, Chao
AU - Cheng, Yurong
AU - Li, Boyang
AU - Yang, Yi
AU - Wang, Xue
N1 - Publisher Copyright:
© The Author(s) 2025.
PY - 2025
Y1 - 2025
N2 - In recent years, the task of knowledge graph completion has attracted significant attention from researchers. In practical scenarios, multi-source knowledge graph completion is quite common. Federated knowledge graph embedding enables joint learning across multiple knowledge graphs while ensuring data privacy and security. Generally, each data source has a different data distribution, and the graphs may exhibit various relation patterns, such as combinatorial, hierarchical, and symmetric/asymmetric connections. Existing federated knowledge graph models overlook the data heterogeneity of knowledge graphs from different sources, using a unified scoring function to assess the quality of the embedding vectors generated by different clients. This limitation degrades the quality of the knowledge graph embeddings generated by each client. Therefore, this paper proposes a federated knowledge graph embedding framework based on the diffusion model. On the client side, we employ a diffusion model to learn knowledge graph embeddings: the forward noise-adding process learns the knowledge graph's distribution, and the reverse denoising process generates knowledge embeddings directly. Additionally, we employ knowledge distillation during client model training to address the drift between local optimization and global convergence. Since the original data cannot leave the local environment in federated learning, we adopt a framework that shares diffusion models for federated knowledge graph completion. Extensive experiments demonstrate that our model significantly outperforms existing state-of-the-art methods on three benchmark datasets.
AB - In recent years, the task of knowledge graph completion has attracted significant attention from researchers. In practical scenarios, multi-source knowledge graph completion is quite common. Federated knowledge graph embedding enables joint learning across multiple knowledge graphs while ensuring data privacy and security. Generally, each data source has a different data distribution, and the graphs may exhibit various relation patterns, such as combinatorial, hierarchical, and symmetric/asymmetric connections. Existing federated knowledge graph models overlook the data heterogeneity of knowledge graphs from different sources, using a unified scoring function to assess the quality of the embedding vectors generated by different clients. This limitation degrades the quality of the knowledge graph embeddings generated by each client. Therefore, this paper proposes a federated knowledge graph embedding framework based on the diffusion model. On the client side, we employ a diffusion model to learn knowledge graph embeddings: the forward noise-adding process learns the knowledge graph's distribution, and the reverse denoising process generates knowledge embeddings directly. Additionally, we employ knowledge distillation during client model training to address the drift between local optimization and global convergence. Since the original data cannot leave the local environment in federated learning, we adopt a framework that shares diffusion models for federated knowledge graph completion. Extensive experiments demonstrate that our model significantly outperforms existing state-of-the-art methods on three benchmark datasets.
KW - Diffusion model
KW - Federated learning
KW - Knowledge graph completion
UR - http://www.scopus.com/pages/publications/105007714900
U2 - 10.1007/s41019-025-00292-z
DO - 10.1007/s41019-025-00292-z
M3 - Article
AN - SCOPUS:105007714900
SN - 2364-1185
JO - Data Science and Engineering
JF - Data Science and Engineering
ER -