Abstract
Dynamic signed directed networks are common in daily life, so representation learning on such networks is critical for many downstream tasks. However, most existing works focus on only the temporal or only the signed information in the graph, and incorporating both into representation learning raises further challenges. First, the sign of an edge between two nodes can change over time (e.g., from positive to negative), which introduces conflicts into the learning process. Second, the balance theory and status theory that prior models rely on are no longer suitable for learning signed semantics in dynamic signed networks. To tackle these challenges, we propose a novel clip-based Dynamic Signed Directed Graph Neural Networks model (DySDGNN). DySDGNN has three components: an Attention-based TLSTM layer (A-TLSTM) that captures the different kinds of information within each clip, a Masked Temporal Self-Attention layer (MT-SA) that aggregates node embeddings across clips, and a loss function based on the Signed Triad Transition Matrix (STTM) that measures the impact of different triads. We conduct extensive experiments on three real-world datasets; the results demonstrate the superiority of DySDGNN over state-of-the-art baselines on the downstream task of link sign prediction.
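The abstract only names the MT-SA component, so the sketch below is a minimal, illustrative implementation of masked temporal self-attention over per-clip node embeddings, not the paper's actual layer. The class name, tensor shapes, and the choice of a causal (future-clip) mask are assumptions made for illustration.

```python
# A minimal PyTorch sketch of masked temporal self-attention over per-clip node
# embeddings, in the spirit of the MT-SA layer described above. All shapes,
# names, and the causal-mask choice are assumptions, not details from the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MaskedTemporalSelfAttention(nn.Module):
    def __init__(self, dim: int):
        super().__init__()
        self.q_proj = nn.Linear(dim, dim)
        self.k_proj = nn.Linear(dim, dim)
        self.v_proj = nn.Linear(dim, dim)
        self.scale = dim ** -0.5

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (num_nodes, num_clips, dim) -- one embedding per node per clip,
        # e.g. the outputs of a per-clip encoder such as A-TLSTM.
        n, t, d = x.shape
        q, k, v = self.q_proj(x), self.k_proj(x), self.v_proj(x)
        scores = torch.matmul(q, k.transpose(-2, -1)) * self.scale  # (n, t, t)
        # Causal mask: clip i may only attend to clips j <= i, so a node's
        # representation at clip i never uses information from future clips.
        mask = torch.triu(torch.ones(t, t, dtype=torch.bool, device=x.device),
                          diagonal=1)
        scores = scores.masked_fill(mask, float("-inf"))
        attn = F.softmax(scores, dim=-1)
        return torch.matmul(attn, v)  # (num_nodes, num_clips, dim)


if __name__ == "__main__":
    mtsa = MaskedTemporalSelfAttention(dim=64)
    clip_embeddings = torch.randn(100, 8, 64)  # 100 nodes, 8 clips
    print(mtsa(clip_embeddings).shape)  # torch.Size([100, 8, 64])
```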
Copyright information
© 2024 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
About this paper
Cite this paper
Liu, Z., Wang, Y. (2024). DySDGNN: Representation Learning in Dynamic Signed Directed Networks. In: Onizuka, M., et al. Database Systems for Advanced Applications. DASFAA 2024. Lecture Notes in Computer Science, vol 14855. Springer, Singapore. https://doi.org/10.1007/978-981-97-5572-1_19
Publisher Name: Springer, Singapore
Print ISBN: 978-981-97-5571-4
Online ISBN: 978-981-97-5572-1