Extending Graph Condensation to Multi-Label Datasets

Training Graph Neural Networks (GNNs) on large graphs is often hindered by data redundancy and high computational cost. Existing graph condensation methods typically target single-label settings, yet many real-world graphs are multi-label: each node can belong to several classes simultaneously. In this talk, I'll introduce GCond, a graph condensation framework adapted to the multi-label setting, using K-Center initialization and a binary cross-entropy loss. In experiments on eight real-world multi-label graph datasets, GCond consistently achieves state-of-the-art performance, improving both scalability and efficiency for multi-label graph learning.
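For readers unfamiliar with the two ingredients named in the abstract, the sketch below illustrates them in isolation under toy assumptions (random features and labels; placeholder names, not the speaker's implementation): a greedy K-Center pass selects well-spread nodes that could seed a condensed graph, and binary cross-entropy with logits scores each class independently, which is what multi-label supervision requires.

```python
# Minimal sketch (assumed names and shapes, not the speaker's code):
# 1) greedy K-Center picks well-spread nodes to initialize a condensed node set;
# 2) binary cross-entropy scores multi-label predictions class by class.
import torch
import torch.nn.functional as F


def k_center_init(features: torch.Tensor, k: int) -> torch.Tensor:
    """Greedy K-Center: repeatedly add the node farthest from the chosen set."""
    chosen = [0]  # start from an arbitrary node (index 0 for determinism)
    # Distance of every node to its nearest chosen center.
    dist = torch.cdist(features, features[chosen]).squeeze(1)
    for _ in range(k - 1):
        nxt = int(torch.argmax(dist))          # farthest remaining node
        chosen.append(nxt)
        new_d = torch.cdist(features, features[nxt].unsqueeze(0)).squeeze(1)
        dist = torch.minimum(dist, new_d)      # update nearest-center distances
    return torch.tensor(chosen)


# Toy multi-label setting: 100 nodes, 16-dim features, 5 non-exclusive labels.
x = torch.randn(100, 16)
y = (torch.rand(100, 5) > 0.7).float()         # a node may carry several labels

idx = k_center_init(x, k=10)                   # indices seeding the condensed nodes
logits = torch.randn(10, 5, requires_grad=True)  # stand-in for a GNN's output

# BCE-with-logits treats every class independently, unlike softmax cross-entropy.
loss = F.binary_cross_entropy_with_logits(logits, y[idx])
loss.backward()
```

The per-class independence of the BCE term is the key difference from the softmax cross-entropy typically used in single-label condensation.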

About the speaker:

Liangliang Zhang is a Ph.D. student in Computer Science at Rensselaer Polytechnic Institute, advised by Prof. Yao Ma. Her research focuses on graph neural networks, multi-label graph learning, and efficient algorithms for large-scale graph data.


For more information, please visit our website: Math Frontier Seminar Website.

Date:
Location: Amos Eaton 216
Speaker: Liangliang (Lia) Zhang from Computer Science (RPI)