Please use this identifier to cite or link to this item: http://elar.urfu.ru/handle/10995/130549
Title: Convolutional Neural Network Outperforms Graph Neural Network on the Spatially Variant Graph Data
Authors: Boronina, A.
Maksimenko, V.
Hramov, A. E.
Publication date: 2023
Publisher: MDPI
Citation: Boronina, A, Maksimenko, V & Hramov, AE 2023, 'Convolutional Neural Network Outperforms Graph Neural Network on the Spatially Variant Graph Data', Mathematics, vol. 11, no. 11, 2515. https://doi.org/10.3390/math11112515
Boronina, A., Maksimenko, V., & Hramov, A. E. (2023). Convolutional Neural Network Outperforms Graph Neural Network on the Spatially Variant Graph Data. Mathematics, 11(11), [2515]. https://doi.org/10.3390/math11112515
Abstract: Applying machine learning algorithms to graph-structured data has garnered significant attention in recent years due to the prevalence of inherent graph structures in real-life datasets. However, the direct application of traditional deep learning algorithms, such as Convolutional Neural Networks (CNNs), is limited as they are designed for regular Euclidean data like 2D grids and 1D sequences. In contrast, graph-structured data are in a non-Euclidean form. Graph Neural Networks (GNNs) are specifically designed to handle non-Euclidean data and make predictions based on connectivity rather than spatial structure. Real-life graph data can be broadly categorized into two types: spatially-invariant graphs, where the link structure between nodes is independent of their spatial positions, and spatially-variant graphs, where node positions provide additional information about the graph’s properties. However, there is limited understanding of the effect of spatial variance on the performance of Graph Neural Networks. In this study, we aim to address this issue by comparing the performance of GNNs and CNNs on spatially-variant and spatially-invariant graph data. Spatially-variant graphs, when represented as adjacency matrices, can exhibit a Euclidean-like spatial structure. Based on this distinction, we hypothesize that CNNs may outperform GNNs when working with spatially-variant graphs, while GNNs may excel on spatially-invariant graphs. To test this hypothesis, we compared the performance of CNNs and GNNs under two scenarios: (i) graphs in the training and test sets had the same connectivity pattern and spatial structure, and (ii) graphs in the training and test sets had the same connectivity pattern but different spatial structures. Our results confirmed that the presence of spatial structure in a graph allows for the effective use of CNNs, which may even outperform GNNs. Thus, our study contributes to the understanding of the effect of spatial graph structure on the performance of machine learning methods and allows for the selection of an appropriate algorithm based on the spatial properties of a real-life graph dataset. © 2023 by the authors.
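Illustration (not from the paper): the abstract contrasts two ways of presenting the same graph to a learner, namely feeding the adjacency matrix to a CNN as a Euclidean-like image versus feeding the bare connectivity to a GNN. The sketch below shows this setup under the assumption that PyTorch and PyTorch Geometric are available; the class names (AdjacencyCNN, GraphGCN), the toy block-structured graph, and all hyperparameters are illustrative and are not taken from the study.

# A minimal sketch (not the authors' implementation), assuming PyTorch and
# PyTorch Geometric are installed. It contrasts the two views of the same
# graph described in the abstract: a CNN reads the adjacency matrix as a
# one-channel image, while a GNN reads only the edge list.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch_geometric.nn import GCNConv, global_mean_pool


class AdjacencyCNN(nn.Module):
    """Graph classifier that treats the N x N adjacency matrix as an image."""

    def __init__(self, n_classes: int = 2):
        super().__init__()
        self.conv1 = nn.Conv2d(1, 8, kernel_size=3, padding=1)
        self.conv2 = nn.Conv2d(8, 16, kernel_size=3, padding=1)
        self.pool = nn.AdaptiveAvgPool2d(4)
        self.fc = nn.Linear(16 * 4 * 4, n_classes)

    def forward(self, adj):  # adj: (batch, 1, N, N)
        h = F.relu(self.conv1(adj))
        h = F.relu(self.conv2(h))
        return self.fc(self.pool(h).flatten(1))


class GraphGCN(nn.Module):
    """Graph classifier that sees connectivity only (no node coordinates)."""

    def __init__(self, in_dim: int = 1, n_classes: int = 2):
        super().__init__()
        self.gc1 = GCNConv(in_dim, 16)
        self.gc2 = GCNConv(16, 16)
        self.fc = nn.Linear(16, n_classes)

    def forward(self, x, edge_index, batch):
        h = F.relu(self.gc1(x, edge_index))
        h = F.relu(self.gc2(h, edge_index))
        return self.fc(global_mean_pool(h, batch))  # graph-level readout


torch.manual_seed(0)
N = 32
# Toy "spatially-variant" graph: two communities aligned with the node
# ordering, so the adjacency matrix shows a visible block pattern.
prob = torch.full((N, N), 0.05)
prob[: N // 2, : N // 2] = 0.4
prob[N // 2 :, N // 2 :] = 0.4
adj = (torch.rand(N, N) < prob).float()
adj = torch.triu(adj, diagonal=1)
adj = adj + adj.t()  # symmetric, no self-loops

# CNN view: adjacency matrix as a one-channel image of shape (1, 1, N, N).
cnn_logits = AdjacencyCNN()(adj.unsqueeze(0).unsqueeze(0))

# GNN view: edge list plus trivial node features; a single graph in the batch.
edge_index = adj.nonzero(as_tuple=False).t().contiguous()  # shape (2, num_edges)
node_feats = torch.ones(N, 1)
batch = torch.zeros(N, dtype=torch.long)
gnn_logits = GraphGCN()(node_feats, edge_index, batch)

print(cnn_logits.shape, gnn_logits.shape)  # both (1, n_classes)

Running the script only prints the output shapes of both models on the same toy graph; training loops, the authors' datasets, and the evaluation protocol from the paper are omitted.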
Keywords: ADJACENCY MATRIX
CLASSIFICATION
CLUSTERING
CONVOLUTIONAL NEURAL NETWORK (CNN)
GRAPH NEURAL NETWORK (GNN)
GRAPH STRUCTURES
MODULARITY
SEGREGATION
SPATIAL INVARIANCE
URI: http://elar.urfu.ru/handle/10995/130549
Access rights: info:eu-repo/semantics/openAccess
cc-by
License text: https://creativecommons.org/licenses/by/4.0/
SCOPUS ID: 85161461992
WOS ID: 001006288200001
PURE ID: 40606100
ISSN: 2227-7390
DOI: 10.3390/math11112515
Funding information: Ministry of Education and Science of the Russian Federation, Minobrnauka: NSH-589.2022.1.2
The research funding from the Ministry of Science and Higher Education of the Russian Federation (Ural Federal University Program of Development within the Priority-2030 Program) is gratefully acknowledged. A.E.H. also gratefully acknowledges support from the President Program for Leading Scientific School Support (grant NSH-589.2022.1.2).
Appears in collections: Scholarly publications of UrFU researchers indexed in SCOPUS and WoS CC

Files in this item:
File: 2-s2.0-85161461992.pdf (3.22 MB, Adobe PDF)


License: Creative Commons (CC BY 4.0)