DFedKG: Diffusion-Based Federated Knowledge Graph Completion

Chao Zhao, Yurong Cheng*, Boyang Li, Yi Yang, Xue Wang

*Corresponding author of this work

Research output: Contribution to journal › Article › peer-review

Abstract

In recent years, knowledge graph completion has attracted significant attention from researchers. In practical scenarios, multi-source knowledge graph completion is common. Federated knowledge graph embedding enables joint learning across multiple knowledge graphs while preserving data privacy and security. In general, each data source has a different data distribution, and relations may exhibit various patterns, such as compositional, hierarchical, and symmetric/asymmetric patterns. Existing federated knowledge graph models overlook this data heterogeneity across sources, using a single unified scoring function to assess the quality of the embedding vectors generated by different clients, which limits the quality of the embeddings each client produces. Therefore, this paper proposes a federated knowledge graph embedding framework based on the diffusion model. On the client side, we employ a diffusion model to learn knowledge graph embeddings: the forward noise-adding process learns the knowledge graph's distribution, and the reverse denoising process generates knowledge embeddings directly. Additionally, we apply knowledge distillation during client model training to mitigate the drift between local optimization and global convergence. Since raw data cannot leave the local environment in federated learning, we adopt a framework that shares diffusion models for federated knowledge graph completion. Extensive experiments demonstrate that our model significantly outperforms existing state-of-the-art methods on three benchmark datasets.
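The forward noising and reverse denoising processes mentioned in the abstract follow the standard DDPM formulation. The sketch below illustrates that mechanism applied to a toy entity-embedding vector; it is not the authors' code, and the noise schedule, embedding dimension, and the oracle noise predictor are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 100                                  # number of diffusion steps (assumed)
betas = np.linspace(1e-4, 0.02, T)       # linear noise schedule (assumed)
alphas = 1.0 - betas
alpha_bar = np.cumprod(alphas)           # cumulative product of alphas

def forward_noise(x0, t, eps):
    """q(x_t | x_0): closed-form forward noising of an embedding at step t."""
    return np.sqrt(alpha_bar[t]) * x0 + np.sqrt(1.0 - alpha_bar[t]) * eps

def reverse_step(xt, t, eps_hat):
    """One DDPM reverse (denoising) step given a noise prediction eps_hat."""
    coef = betas[t] / np.sqrt(1.0 - alpha_bar[t])
    mean = (xt - coef * eps_hat) / np.sqrt(alphas[t])
    if t > 0:                            # no noise is added at the final step
        mean = mean + np.sqrt(betas[t]) * rng.standard_normal(xt.shape)
    return mean

x0 = rng.standard_normal(64)             # toy 64-d entity embedding (assumed)
eps = rng.standard_normal(64)
xT = forward_noise(x0, T - 1, eps)       # noised embedding at the last step
x_prev = reverse_step(xT, T - 1, eps)    # one reverse step with an oracle predictor
```

In the paper's setting, the trained denoiser (rather than an oracle) would predict the noise, so running all `T` reverse steps generates an embedding from the learned knowledge-graph distribution.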

Original language: English
Journal: Data Science and Engineering
DOI
Publication status: Accepted/In press - 2025
Published externally
