Signal processing on higher-order networks: Livin’ on the edge... and beyond
Keywords
1. Introduction
2. Signal processing on graphs: a selective overview
2.1. Central tenets of discrete signal processing
2.2. Graphs, incidence matrices, and the graph Laplacian
2.3. Graph signal processing
Fig. 1. Graph signal and its Fourier decomposition. A Graph signal defined on the nodes of the graph. B Eigenvector and eigenvalue pairs of the graph Laplacian. We visualize each eigenvector as a graph signal and order them from low to high graph frequencies, corresponding to a decrease in “smoothness”. Decomposing the node signal into this basis yields the Fourier coefficients indicated at the bottom of each eigenvector representation.
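The graph Fourier transform described in the caption can be sketched in a few lines of numpy. This is a minimal illustration on an assumed toy path graph, not the example from the figure: the Laplacian eigenvectors serve as the Fourier basis, and projecting a node signal onto them gives its Fourier coefficients.

```python
import numpy as np

# Toy graph: a 4-node path. L = D - A is the combinatorial graph Laplacian.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A

# Eigendecomposition: eigenvalues act as graph frequencies,
# eigenvectors as the graph Fourier basis (ordered low to high).
eigvals, eigvecs = np.linalg.eigh(L)

# A node signal and its graph Fourier transform (projection onto the basis).
x = np.array([1.0, 2.0, 2.5, 3.0])
x_hat = eigvecs.T @ x

# The inverse transform reconstructs the signal exactly.
assert np.allclose(eigvecs @ x_hat, x)
```

For a connected graph the smallest eigenvalue is zero and its eigenvector is constant, matching the "smoothest" signal in the figure.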
2.4. Graph signal processing: illustrative problems and applications
2.4.1. Fourier analysis: node embeddings and Laplacian eigenmaps
2.4.2. Signal smoothing and denoising
2.4.3. Graph signal interpolation
2.4.4. Graph neural networks
3. Modeling higher-order interactions with simplicial complexes
3.1. Background on simplicial complexes
Example 1
Example 2
3.2. The Hodge Laplacian as a shift operator for simplicial complexes
Example 3
Fig. 2. Signals on simplicial complexes of different order. A: Structure of the simplicial complex used as a running example in the text. Arrows represent the chosen reference orientation. Shaded areas correspond to the 2-simplices. B: Signal on 0-simplices (nodes). C: Signal on 1-simplices (edges). D: Signal on 2-simplices (triangles).
Fig. 3. Hodge decomposition of the edge flow in the example from Fig. 2. Any edge flow (left) can be decomposed into a harmonic flow, a gradient flow and a curl flow.
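The Hodge decomposition in the caption can be computed directly from the incidence matrices. The sketch below uses an assumed small complex (not the one from the figure): five edges oriented from lower to higher node index, with one filled triangle, so that the gradient part lies in $\mathrm{im}(\mathbf{B}_1^\top)$, the curl part in $\mathrm{im}(\mathbf{B}_2)$, and the harmonic remainder in $\ker(\mathbf{L}_1)$.

```python
import numpy as np

# Node-to-edge incidence B1 and edge-to-triangle incidence B2 for a small
# simplicial complex: edges (0,1),(0,2),(1,2),(1,3),(2,3) and one filled
# triangle (0,1,2). Edges are oriented from lower to higher node index.
B1 = np.array([[-1, -1,  0,  0,  0],
               [ 1,  0, -1, -1,  0],
               [ 0,  1,  1,  0, -1],
               [ 0,  0,  0,  1,  1]], dtype=float)
B2 = np.array([[1], [-1], [1], [0], [0]], dtype=float)

f = np.array([2.0, -1.0, 3.0, 0.5, 1.5])  # an arbitrary edge flow

# Hodge decomposition: f = gradient + curl + harmonic component.
f_grad = B1.T @ np.linalg.lstsq(B1.T, f, rcond=None)[0]  # projection on im(B1^T)
f_curl = B2 @ np.linalg.lstsq(B2, f, rcond=None)[0]      # projection on im(B2)
f_harm = f - f_grad - f_curl                             # remainder in ker(L1)

L1 = B1.T @ B1 + B2 @ B2.T  # Hodge 1-Laplacian
assert np.allclose(L1 @ f_harm, 0)     # harmonic: divergence-free and curl-free
assert np.isclose(f_grad @ f_curl, 0)  # the components are mutually orthogonal
```

Since $\mathbf{B}_1\mathbf{B}_2 = 0$, the two image spaces are orthogonal and the three components sum back to the original flow.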
Lemma 4
- Consider any eigenvector $\mathbf{v}$ of the graph Laplacian $\mathbf{L}_0 = \mathbf{B}_1\mathbf{B}_1^\top$ associated with a nonzero eigenvalue $\lambda$. Then $\mathbf{B}_1^\top\mathbf{v}$ is an eigenvector of the Hodge Laplacian $\mathbf{L}_1 = \mathbf{B}_1^\top\mathbf{B}_1 + \mathbf{B}_2\mathbf{B}_2^\top$ with the same eigenvalue $\lambda$. Moreover, the collection of all such eigenvectors spans the space of all gradient flows.
- Consider any eigenvector $\mathbf{w}$ of the “2-simplex coupling matrix” $\mathbf{B}_2^\top\mathbf{B}_2$ associated with a nonzero eigenvalue $\mu$. Then $\mathbf{B}_2\mathbf{w}$ is an eigenvector of $\mathbf{L}_1$ with the same eigenvalue $\mu$. Moreover, the collection of all such eigenvectors spans the space of all curl flows.
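The lemma can be verified numerically. The key identity is $\mathbf{B}_1\mathbf{B}_2 = 0$, so lifting an eigenvector of $\mathbf{L}_0$ through $\mathbf{B}_1^\top$ (or of $\mathbf{B}_2^\top\mathbf{B}_2$ through $\mathbf{B}_2$) lands in an eigenspace of $\mathbf{L}_1$ with the same eigenvalue. The complex below is an assumed toy example, not the paper's running one.

```python
import numpy as np

# Incidence matrices of a small complex: edges (0,1),(0,2),(1,2),(1,3),(2,3)
# oriented from lower to higher node index, one filled triangle (0,1,2).
B1 = np.array([[-1, -1,  0,  0,  0],
               [ 1,  0, -1, -1,  0],
               [ 0,  1,  1,  0, -1],
               [ 0,  0,  0,  1,  1]], dtype=float)
B2 = np.array([[1], [-1], [1], [0], [0]], dtype=float)

L0 = B1 @ B1.T              # graph Laplacian
L1 = B1.T @ B1 + B2 @ B2.T  # Hodge 1-Laplacian

# Lift a nonzero-eigenvalue eigenvector of L0 into the edge space.
lam, V = np.linalg.eigh(L0)
v = V[:, -1]                # eigenvector with the largest (nonzero) eigenvalue
lifted = B1.T @ v
assert np.allclose(L1 @ lifted, lam[-1] * lifted)  # same eigenvalue, now of L1

# Likewise for the 2-simplex coupling matrix B2^T B2.
mu, W = np.linalg.eigh(B2.T @ B2)
w = B2 @ W[:, -1]
assert np.allclose(L1 @ w, mu[-1] * w)
```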
4. Signal processing and learning on simplicial complexes
4.1. Fourier analysis: edge-flow and trajectory embeddings
Example 5
Fig. 4. Embedding of trajectories defined on a simplicial complex. A Five trajectories defined on a simplicial complex containing two obstacles, shown in orange. The simplicial complex is constructed by creating a triangular lattice from a random set of points and then introducing two “holes” in this lattice. All triangles in the lattice are assumed to correspond to 2-simplices. B The projection of the trajectories displayed in A into the two-dimensional harmonic space of the simplicial complex. Note that trajectories that move around the obstacles in topologically similar ways have similar embeddings [49].
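The embedding in panel B amounts to representing each trajectory as an edge flow and projecting it onto a basis of the harmonic space $\ker(\mathbf{L}_1)$. A minimal sketch on an assumed toy complex with a single unfilled cycle (playing the role of one obstacle, so the harmonic space is one-dimensional rather than two):

```python
import numpy as np

# Small complex: edges (0,1),(0,2),(1,2),(1,3),(2,3), one filled triangle
# (0,1,2); the cycle 1-2-3 is left unfilled, creating a "hole".
B1 = np.array([[-1, -1,  0,  0,  0],
               [ 1,  0, -1, -1,  0],
               [ 0,  1,  1,  0, -1],
               [ 0,  0,  0,  1,  1]], dtype=float)
B2 = np.array([[1], [-1], [1], [0], [0]], dtype=float)
L1 = B1.T @ B1 + B2 @ B2.T

# Harmonic basis: eigenvectors of L1 with eigenvalue zero.
eigvals, eigvecs = np.linalg.eigh(L1)
harmonic = eigvecs[:, np.isclose(eigvals, 0)]

# A trajectory 0 -> 2 -> 3 -> 1 written as an edge flow (+1 along an edge's
# reference orientation, -1 against it), then projected onto the harmonic basis.
traj = np.array([0.0, 1.0, 0.0, -1.0, 1.0])
embedding = harmonic.T @ traj
```

A trajectory that does not wind around the hole would project to (approximately) zero, which is what separates topologically distinct trajectories in the figure.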
4.2. Flow smoothing and denoising
Example 6
Fig. 5. Flow smoothing on a graph. A An undirected graph with a pre-defined, oriented flow. B The observed flow is a noisy version of the true flow, distorted by a Gaussian white noise vector. C We denoise the flow by applying a Laplacian filter based on the line graph. This filter performs worse than the edge-space filters in D and E, which account for flow conservation. D Denoised flow obtained by applying the filter based on the edge Laplacian. E Denoised flow obtained by applying the filter based on the Hodge Laplacian. The estimation error is lower than in the edge Laplacian case, as the filter accounts for the filled faces in the graph.
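The Hodge Laplacian filter of panel E can be sketched as a standard low-pass regularizer $(\mathbf{I} + \alpha\mathbf{L}_1)^{-1}$, applied here to an assumed toy complex rather than the graph in the figure. The harmonic (smooth) component of the flow passes through unchanged, while noise in the gradient and curl subspaces is attenuated.

```python
import numpy as np

# Incidence matrices of a small complex (one filled triangle (0,1,2)).
B1 = np.array([[-1, -1,  0,  0,  0],
               [ 1,  0, -1, -1,  0],
               [ 0,  1,  1,  0, -1],
               [ 0,  0,  0,  1,  1]], dtype=float)
B2 = np.array([[1], [-1], [1], [0], [0]], dtype=float)
L1 = B1.T @ B1 + B2 @ B2.T

# Ground-truth flow: the harmonic eigenvector of L1 (eigenvalue 0), i.e.,
# a maximally "smooth" cyclic flow.
eigvals, eigvecs = np.linalg.eigh(L1)
f_true = eigvecs[:, 0]

# Observe a noisy version and denoise with the low-pass filter
# H = (I + alpha * L1)^{-1}, which shrinks the high Hodge frequencies.
rng = np.random.default_rng(0)
f_noisy = f_true + 0.3 * rng.standard_normal(5)
alpha = 2.0
f_hat = np.linalg.solve(np.eye(5) + alpha * L1, f_noisy)

# The filtered estimate is closer to the true flow than the raw observation.
assert np.linalg.norm(f_hat - f_true) < np.linalg.norm(f_noisy - f_true)
```

Replacing $\mathbf{L}_1$ with the edge Laplacian $\mathbf{B}_1^\top\mathbf{B}_1$ reproduces the filter of panel D, which ignores the filled faces.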
4.3. Interpolation and semi-supervised learning
Example 7
Fig. 6. Semi-supervised learning for edge flows. A Synthetic flow with 50% of the edges labeled. Labeled edges are colored by the value of their flow; the remaining edges, in grey, are inferred by the procedure explained in the text. B Edge flow obtained by applying the semi-supervised algorithm in (17). C Numerical value of the inferred signal.
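A simplified variant of this interpolation idea (in the spirit of [55], but without the regularization term of the full algorithm) fills in the unlabeled edge flows by minimizing the divergence $\|\mathbf{B}_1\mathbf{f}\|^2$ while keeping the labeled entries fixed, which reduces to a small least-squares problem. The complex and labels below are assumed for illustration.

```python
import numpy as np

# Node-to-edge incidence of a small graph with edges
# (0,1),(0,2),(1,2),(1,3),(2,3), oriented from lower to higher index.
B1 = np.array([[-1, -1,  0,  0,  0],
               [ 1,  0, -1, -1,  0],
               [ 0,  1,  1,  0, -1],
               [ 0,  0,  0,  1,  1]], dtype=float)

# Flows on edges 0, 2, 4 are observed; edges 1 and 3 are unknown.
labeled = np.array([0, 2, 4])
unlabeled = np.array([1, 3])
f = np.zeros(5)
f[labeled] = [2.0, 3.0, 1.5]

# Minimize the divergence ||B1 f||^2 over the unknown entries, with the
# labeled values held fixed: a least-squares problem in f[unlabeled].
A = B1[:, unlabeled]
b = -B1[:, labeled] @ f[labeled]
f[unlabeled] = np.linalg.lstsq(A, b, rcond=None)[0]
```

The solution promotes approximate flow conservation at every node, which is the inductive bias that distinguishes edge-flow interpolation from node-signal interpolation.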
Remark 8
4.4. Beyond linear filters: simplicial neural networks and Hodge theory
- Permutation equivariance. Although the nodes are given labels and an ordering for notational convenience, graph neural networks do not depend on the chosen labeling of the nodes. That is, if the nodes and the corresponding input labels are permuted in some way, the output of the graph neural network, modulo said permutation, does not change.
- Locality. Graph neural networks in their most basic form operate locally on the graph structure. Typically, at each layer a node’s representation is affected only by its own state and the states of its immediate neighbors. Forcing operations to be local is how the underlying graph structure regularizes the functional form of the graph neural network.
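Both properties can be checked on a one-layer message-passing sketch (a generic GCN-style propagation, not any specific architecture from the text): permuting the nodes permutes the output identically, and each output row depends only on a node's own state and its neighbors'.

```python
import numpy as np

def propagate(A, X, W):
    # One local message-passing layer: each node aggregates its own state
    # and the states of its immediate neighbors, then applies a nonlinearity.
    return np.tanh((A + np.eye(len(A))) @ X @ W)

A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
X = np.array([[1.0], [2.0], [3.0]])
W = np.array([[0.5]])

# Permute the node labels: the output permutes the same way (equivariance).
P = np.eye(3)[[2, 0, 1]]  # a permutation matrix
out = propagate(A, X, W)
out_perm = propagate(P @ A @ P.T, P @ X, W)
assert np.allclose(out_perm, P @ out)
```

The identity behind the assertion is $\sigma((PAP^\top + I)PXW) = P\,\sigma((A+I)XW)$ for any elementwise nonlinearity $\sigma$.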
4.4.1. Simplicial graph neural networks
- Orientation equivariance. If the arbitrarily chosen reference orientation of the simplices in the complex is changed, the output of the neural network architecture remains the same, modulo said change in orientation.
- Simplicial locality. At each layer of an architecture with simplicial locality, information exchange only occurs between adjacent levels of the underlying simplicial complex, i.e., the output of a layer restricted to the $k$-simplices depends only on the input of that layer restricted to the simplices of orders $k-1$, $k$, and $k+1$.
- Extended simplicial locality. For an architecture with extended simplicial locality, the output restricted to the $k$-simplices may depend on the input restricted to simplices at all levels, not just those of orders adjacent to $k$.
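Orientation equivariance can be illustrated with a simple edge-space filter layer. The sketch below is a generic construction under assumed incidence matrices, not a specific published architecture: flipping an edge's reference orientation negates the corresponding incidence entries and signal value, and with an odd nonlinearity the output flips sign consistently.

```python
import numpy as np

def layer(B1, B2, f, w0, w1):
    # A simple simplicial filter layer: a degree-1 polynomial in the Hodge
    # Laplacian followed by an odd nonlinearity (tanh), which is what makes
    # the layer orientation-equivariant.
    L1 = B1.T @ B1 + B2 @ B2.T
    return np.tanh(w0 * f + w1 * (L1 @ f))

B1 = np.array([[-1, -1,  0,  0,  0],
               [ 1,  0, -1, -1,  0],
               [ 0,  1,  1,  0, -1],
               [ 0,  0,  0,  1,  1]], dtype=float)
B2 = np.array([[1], [-1], [1], [0], [0]], dtype=float)
f = np.array([2.0, -1.0, 3.0, 0.5, 1.5])

# Flip the reference orientation of edge 2: negate its incidence entries
# and its signal value via a diagonal +/-1 matrix.
D = np.diag([1.0, 1.0, -1.0, 1.0, 1.0])
out = layer(B1, B2, f, 0.7, -0.3)
out_flip = layer(B1 @ D, D @ B2, D @ f, 0.7, -0.3)
assert np.allclose(out_flip, D @ out)
```

The check works because the flipped Laplacian is $D\mathbf{L}_1 D$ and $\tanh(Dz) = D\tanh(z)$ for any diagonal $\pm 1$ matrix $D$.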
5. Modeling higher-order interactions via hypergraphs
Example 9
- (1)Heterogeneous hypergraphs refer to hypergraphs containing different types of vertices and/or different types of hyperedges [81], [82], [83], [84] and may thus be seen as a generalization of multilayer and multiplex networks. For example, in a GPS network [85], a hyperedge can have three types of vertices (user, location, activity). Another example is online social networks such as Twitter, in which we can have different types of vertices including users, tweets, usertags, hashtags and groups as well as multiple types of hyperedges such as ‘users release tweets containing hashtags or not’, ‘users join groups’, and ‘users assign usertags to themselves’ [86].
- (2)Edge-dependent vertex weights are introduced into hypergraphs in [73], [75], [76] to reflect the different contributions (e.g., importance or influence) of vertices within the same hyperedge. More precisely, for each hyperedge $e$, a function $\gamma_e$ is defined that assigns positive weights to the vertices in this hyperedge. For instance, in the co-authorship network of Example 9, the different levels of contribution of the authors of a paper can be encoded as edge-dependent vertex weights. If $\gamma_e(v) = \gamma_{e'}(v)$ for every vertex $v$ and every pair of hyperedges $e$ and $e'$ containing $v$, then we say that the vertex weights are edge-independent. Such hypergraphs are also called vertex-weighted hypergraphs [87]. Moreover, if $\gamma_e(v) = 1$ for all vertices $v$ and incident hyperedges $e$, the vertex weights are trivial and we recover the homogeneous hypergraph model.
- (3)In order to leverage the fact that different subsets of vertices in one hyperedge may have different structural importance, the concept of an inhomogeneous hyperedge is proposed in [88]. Each inhomogeneous hyperedge $e$ is associated with a function $w_e : 2^{e} \to \mathbb{R}_{\geq 0}$ that assigns non-negative costs to different cuts of the hyperedge, where $2^{e}$ denotes the power set of $e$. The weight $w_e(S)$ indicates the cost of partitioning the hyperedge into the two subsets $S$ and $e \setminus S$. The hypergraph is called a submodular hypergraph when each $w_e$ satisfies submodularity constraints [89].
5.1. Matrix-based hypergraph representations
Fig. 7. Different transformations on an example hypergraph. A The original hypergraph. B The dual hypergraph. C The clique expansion. D The star expansion. E The line graph. F The line expansion.
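Two of the transformations in the figure, the clique expansion and the star expansion, are straightforward to compute from the vertex-by-hyperedge incidence matrix. The hypergraph below is an assumed toy example, not the one depicted.

```python
import numpy as np

# Hypergraph with 4 vertices and 2 hyperedges, as a vertex-by-hyperedge
# incidence matrix: e0 = {0, 1, 2}, e1 = {2, 3}.
H = np.array([[1, 0],
              [1, 0],
              [1, 1],
              [0, 1]], dtype=float)

# Clique expansion: connect every pair of vertices that share a hyperedge.
clique = ((H @ H.T) > 0).astype(float)
np.fill_diagonal(clique, 0)

# Star expansion: a bipartite graph on the vertices plus one auxiliary
# node per hyperedge, with an edge between a vertex and each hyperedge
# node it belongs to.
n, m = H.shape
star = np.zeros((n + m, n + m))
star[:n, n:] = H
star[n:, :n] = H.T
```

Note that the clique expansion is lossy: distinct hypergraphs can share the same clique expansion, while the star expansion retains the full incidence structure.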
5.2. Tensor-based hypergraph representations
Example 10
Example 11
Fig. 8. Tensor-based shift operator on a hypergraph. The output of the shift operator at a vertex is determined by a weighted sum over the hyperedges incident to that vertex, where the summands correspond to the products of the vertex signals within the respective hyperedges, excluding the vertex itself. [Figure adapted from Fig. 10(a) in [112]].
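The operation described in the caption can be sketched directly, here with unit hyperedge weights on an assumed toy hypergraph (the general operator in [112] also carries per-hyperedge weights):

```python
import numpy as np

# Hyperedges as vertex lists, and a signal on 4 vertices.
hyperedges = [[0, 1, 2], [2, 3]]
s = np.array([2.0, 3.0, 4.0, 5.0])

def tensor_shift(hyperedges, s):
    # Output at vertex v: sum over the hyperedges incident to v of the
    # product of the signal values at the *other* vertices in that
    # hyperedge (unit hyperedge weights assumed).
    out = np.zeros_like(s)
    for e in hyperedges:
        for v in e:
            others = [u for u in e if u != v]
            out[v] += np.prod(s[others])
    return out

shifted = tensor_shift(hyperedges, s)  # array([12., 8., 11., 4.])
```

For vertex 2, which lies in both hyperedges, the output is $3 \cdot 2 + 5 = 11$: the multiplicative coupling within each hyperedge is what makes the operator genuinely multilinear rather than a matrix shift.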
5.3. Comparison between matrix-based and tensor-based hypergraph representations
Remark 12
6. Signal processing and learning on hypergraphs
6.1. Fourier analysis, node and hyperedge embeddings
6.2. Signal smoothing and denoising
6.3. Signal interpolation on hypergraphs
6.4. Hypergraph neural networks
7. Discussion
Declaration of Competing Interest
Acknowledgements
References
- [1]Networks, Oxford University Press (2018)
- [2]Networks, Crowds, and Markets, volume 8, Cambridge University Press (2010)
- [3]Graph theory methods: applications in brain networksDialog. Clin. Neurosci., 20 (2) (2018), pp. 111-121
- [4]Brain network efficiency is influenced by the pathologic source of corticobasal syndromeNeurology, 89 (13) (2017), pp. 1373-1381
- [5]Applications of graph theory and network science to transit network designTransp. Rev., 31 (4) (2011), pp. 495-519
- [6]Network analysis in the social sciencesScience, 323 (892) (2009)
- [7]Discrete signal processing on graphsIEEE Trans. Signal Process., 61 (7) (2013), pp. 1644-1656
- [8]The emerging field of signal processing on graphs: extending high-dimensional data analysis to networks and other irregular domainsIEEE Signal Process. Mag., 30 (7) (2013), pp. 83-98
- [9]Graph signal processing: overview, challenges and applicationsProc. IEEE, 106 (5) (2018), pp. 808-828
- [10]Optimal graph-filter design and applications to distributed linear network operatorsIEEE Trans. Signal Process., 65 (15) (2017), pp. 4117-4131
- [11]Spectral Graph TheoryAmerican Mathematical Society (1997)
- [12]Two’s company, three (or more) is a simplex: Algebraic-topological tools for understanding higher-order structure in neural dataJ. Comput. Neurosci., 41 (2016), pp. 1-14
- [13]Hypergraphs and cellular networksPLoS Comput. Biol., 5 (2009)
- [14]Social groups, social media, and higher-dimensional social structures: a simplicial model of social aggregation for computational communication researchCommun. Q., 61 (2013), pp. 35-68
- [15]Algebraic TopologyCambridge University Press (2002)
- [16]HypergraphsElsevier (1989)
- [17]Extremal Set SystemsMIT Press, Cambridge, MA, USA (1996), pp. 1293-1329
- [18]Topological Signal Processing81, Springer (2014)
- [19]Signal processing on directed graphs: the role of edge directionality when processing and learning from network dataIEEE Signal Process. Mag., 37 (6) (2020), pp. 99-116
- [20]Graph signal processing for directed graphs based on the hermitian LaplacianU. Brefeld, E. Fromont, A. Hotho, A. Knobbe, M. Maathuis, C. Robardet (Eds.), Machine Learning and Knowledge Discovery in Databases, Springer International Publishing, Cham (2020), pp. 447-463
- [21]Discrete-Time Signal Processing(third ed.), Prentice Hall Press, USA (2009)
- [22]Sampling of graph signals with successive local aggregationsIEEE Trans. Signal Process., 64 (7) (2016), pp. 1832-1843
- [23]Discrete signal processing on graphs: Sampling theoryIEEE Trans. Signal Process., 63 (24) (2015), pp. 6510-6523
- [24]Efficient sampling set selection for bandlimited graph signals using graph spectral proxiesIEEE Trans. Signal Process., 64 (14) (2016), pp. 3775-3789
- [25]Blind identification of graph filtersIEEE Trans. Signal Process., 65 (5) (2017), pp. 1146-1159
- [26]Estimating network processes via blind identification of multiple graph filtersIEEE Trans. Signal Process., 68 (2020), pp. 3049-3063
- [27]Learning Laplacian matrix in smooth graph signal representationsIEEE Trans. Signal Process., 64 (23) (2016), pp. 6160-6173
- [28]Network topology inference from spectral templatesIEEE Trans. Signal Inf. Process. Netw., 3 (3) (2017), pp. 467-483
- [29]Kernel-based structural equation models for topology identification of directed networksIEEE Trans. Signal Process., 65 (10) (2017), pp. 2503-2516
- [30]Connecting the dots: Identifying network structure via graph signal processingIEEE Signal Process. Mag., 36 (3) (2019), pp. 16-43
- [31]A unified view of diffusion maps and signal processing on graphsSamp. Theory and Appl. (SampTA), IEEE (2017), pp. 308-312
- [32]Laplacian eigenmaps for dimensionality reduction and data representationNeural Comput., 15 (6) (2003), pp. 1373-1396
- [33]Diffusion mapsAppl. Comput. Harmonic Anal., 21 (1) (2006), pp. 5-30
- [34]W.L. Hamilton, R. Ying, J. Leskovec, Representation learning on graphs: methods and applications, arXiv Preprint (2017). 1709.05584
- [35]Signal denoising on graphs via graph filteringIEEE Global Conf. Signal and Info. Process. (GlobalSIP) (2014), pp. 872-876
- [36]A regularization framework for learning from graph dataICML Workshop on Statistical Relational Learning and Its Connections to Other Fields (2004), pp. 132-137
- [37]Reconstruction of graph signals through percolation from seeding nodesIEEE Trans. Signal Process., 64 (16) (2016), pp. 4363-4378
- [38]Cluster kernels for semi-supervised learningAdv. Neural Info. Process. Syst. (NeurIPS), NIPS’02, MIT Press, Cambridge, MA, USA (2002), pp. 601-608
- [39]Semi-supervised learning using Gaussian fields and harmonic functionsIntl. Conf. Mach. Learn. (ICML), AAAI Press (2003), pp. 912-919
- [40]Geometric deep learning: going beyond Euclidean dataIEEE Signal Process. Mag., 34 (4) (2017), pp. 18-42
- [41]A comprehensive survey on graph neural networksIEEE Trans. Neural Netw. Learn. Syst. (2020)
- [42]Deep neural networks for learning graph representations, AAAI Conf. on Artif. Intell. (AAAI), 16 (2016), pp. 1145-1152
- [43]Structural deep network embeddingACM Intl. Conf. on Know. Disc. and Data Mining (SIGKDD) (2016), pp. 1225-1234
- [44]Variational graph auto-encodersNeurIPS Bayesian Deep Learning Workshop (2016)
- [45]Semi-supervised classification with graph convolutional networksIntl. Conf. Learn. Repres. (ICLR) (2017)
- [46]Convolutional neural networks on graphs with fast localized spectral filteringAdv. Neural Inf. Process. Syst., 29 (2016), pp. 3844-3852
- [47]Convolutional neural network architectures for signals supported on graphsIEEE Trans. Signal Process., 67 (4) (2018), pp. 1034-1049
- [48]Hodge Laplacians on graphsSIAM Rev., 62 (3) (2020), pp. 685-715
- [49]Random walks on simplicial complexes and the normalized Hodge 1-LaplacianSIAM Rev., 62 (2020), pp. 353-391
- [50]Discrete Calculus: Applied Analysis on Graphs for Computational ScienceSpringer-Verlag (2010)
- [51]TopologyPrentice Hall (2000)
- [52]Harmonische funktionen and randwertaufgaben in einem komplexCommentarii Math. Helvetici, 17 (1944), pp. 240-255
- [53]Simplicial closure and higher-order link predictionProc. Natl. Acad. Sci., 115 (48) (2018), pp. E11221-E11230
- [54]Flow smoothing and denoising: graph signal processing in the edge-spaceIEEE Global Conf. Signal and Info. Process. (GlobalSIP) (2018), pp. 735-739
- [55]Graph-based semi-supervised & active learning for edge flowsACM Intl. Conf. on Know. Disc. and Data Mining (SIGKDD) (2019), pp. 761-771
- [56]Topological signal processing over simplicial complexes, IEEE Trans. Signal Process. (2020)
- [57]A tutorial on spectral clusteringStat. Comp., 17 (2007), pp. 395-416
- [58]A notion of harmonic clustering in simplicial complexesIEEE Intl. Conf. on Mach. Learn. and its Appl. (ICMLA), IEEE (2019), pp. 1083-1090
- [59]Topological signatures for fast mobility analysisACM Intl. Conf. on Adv. in Geo. Inf. Syst. (SIGSPATIAL) (2018), pp. 159-168
- [60]Statistical ranking and combinatorial Hodge theoryMath. Prog., 127 (1) (2011), pp. 203-244
- [61]Learning from signals defined over simplicial complexesIEEE Data Sci. Wrksp. (DSW) (2018), pp. 51-55
- [62]M. Yang, E. Isufi, M.T. Schaub, G. Leus, Finite impulse response filters for simplicial complexes, arXiv preprint arXiv:2103.12587(2021).
- [63]Topological signal processing: making sense of data building on multiway relationsIEEE Signal Process. Mag., 37 (6) (2020), pp. 174-183
- [64]Stability properties of graph neural networksIEEE Trans. Signal Process., 68 (2020), pp. 5680-5695
- [65]Simplicial neural networksNeurIPS Workshop on Topological Data Analysis and Beyond (2020)
- [66]Simplicial 2-complex convolutional neural networksNeurIPS Workshop on Topological Data Analysis and Beyond (2020)
- [67]N. Glaze, T.M. Roddenberry, S. Segarra, Principled simplicial neural networks for trajectory prediction, arXiv preprint arXiv:2102.10058(2021).
- [68]C. Bodnar, F. Frasca, Y.G. Wang, N. Otter, G. Montúfar, P. Lió, M. Bronstein, Weisfeiler and Lehman go topological: message passing simplicial networks, arXiv:2103.03212 (2021).
- [69]HodgeNet: graph neural networks for edge dataAsilomar Conf. Signals, Systems, and Computers (2019), pp. 220-224
- [70]L. Neuhäuser, M.T. Schaub, A. Mellor, R. Lambiotte, Opinion dynamics with multi-body interactions, arXiv Preprint(2020a). 2004.00901
- [71]Multibody interactions and nonlinear consensus dynamics on networked systemsPhys. Rev. E, 101 (3) (2020), p. 032310
- [72]Understanding importance of collaborations in co-authorship networks: a supportiveness analysis approachSIAM Intl. Conf. on Data Mining (2009), pp. 1112-1123
- [73]Random walks on hypergraphs with edge-dependent vertex weightsIntl. Conf. Mach. Learn. (ICML) (2019)
- [74]Anomaly detection using scan statistics on time series hypergraphsLink Analysis, Counterterrorism and Security (LACTS) Conference (2009), p. 9
- [75]Hypergraph random walks, Laplacians, and clusteringACM Conf. on Inf. and Knowl. Management (CIKM) (2020), pp. 495-504
- [76]Y. Zhu, B. Li, S. Segarra, Co-clustering vertices and hyperedges via spectral hypergraph partitioning, arXiv preprint arXiv:2102.10169(2021).
- [77]The human disease networkProc. Natl. Acad. Sci., 104 (21) (2007), pp. 8685-8690
- [78]Hypernetwork science via high-order hypergraph walksEPJ Data Sci., 9 (1) (2020), p. 16
- [79]E-tail product return prediction via hypergraph-based local graph cutACM Intl. Conf. on Know. Disc. and Data Mining (SIGKDD) (2018), pp. 519-527
- [80]Directed hypergraphs and applicationsDiscrete Applied Mathematics, 42 (2–3) (1993), pp. 177-201
- [81]Heterogeneous hyper-network embeddingIEEE Intl. Conf. on Data Mining (ICDM) (2018), pp. 875-880
- [82]Structural deep embedding for hyper-networksAAAI Conf. on Artif. Intell. (AAAI) (2018)
- [83]Hyper-SAGNN: a self-attention based graph neural network for hypergraphsIntl. Conf. Learn. Repres. (ICLR) (2020)
- [84]Heterogeneous hypergraph embedding for graph classificationACM Intl. Conf. on Web Search and Data Mining (WSDM) (2021)
- [85]Collaborative filtering meets mobile recommendation: a user-centered approachAAAI Conf. on Artif. Intell. (AAAI) (2010), pp. 236-241
- [86]Link prediction in social networks based on hypergraphProceedings of the 22nd International Conference on World Wide Web (2013), pp. 41-42
- [87]Vertex-weighted hypergraph learning for multi-view object classificationIntl. Joint Conf. on Artif. Intell. (IJCAI) (2017), pp. 2779-2785
- [88]Inhomogeneous hypergraph clustering with applicationsAdv. Neural Info. Process. Syst. (NeurIPS) (2017), pp. 2308-2318
- [89]Submodular hypergraphs: p-Laplacians, Cheeger inequalities and spectral clusteringIntl. Conf. Mach. Learn. (ICML) (2018)
- [90]Tensor decompositions and applicationsSIAM Rev., 51 (3) (2009), pp. 455-500
- [91]Tensor decompositions for signal processing applications: from two-way to multiway component analysisIEEE Signal Process. Mag., 32 (2) (2015), pp. 145-163
- [92]Tensor decomposition for signal processing and machine learningIEEE Trans. Signal Process., 65 (13) (2017), pp. 3551-3582
- [93]Higher order learning with graphsIntl. Conf. Mach. Learn. (ICML) (2006), pp. 17-24
- [94]Learning with hypergraphs: clustering, classification, and embeddingAdv. Neural Info. Process. Syst. (NeurIPS), 19 (2006), pp. 1601-1608
- [95]Spectra of regular graphs and hypergraphs and orthogonal polynomialsEur. J. Combin., 17 (5) (1996), pp. 461-477
- [96]Spectra, Euclidean representations and clusterings of hypergraphsDiscret. Math., 117 (1–3) (1993), pp. 19-39
- [97]On the Laplacian eigenvalues and metric parameters of hypergraphsLinear and Multilinear Algebra, 50 (1) (2002), pp. 1-14
- [98]On the Laplacian spectrum and walk-regular hypergraphsLinear Multilinear Algebra, 51 (3) (2003), pp. 285-297
- [99]Clustering categorical data: an approach based on dynamical systemsVLDB J., 8 (3–4) (2000), pp. 222-236
- [100]S. Bandyopadhyay, K. Das, M.N. Murty, Line hypergraph convolution network: applying graph convolution for hypergraphs, arXiv Preprint (2020). 2002.03392
- [101]C. Yang, R. Wang, S. Yao, T. Abdelzaher, Hypergraph learning with line expansion, arXiv Preprint (2020). 2005.04843
- [102]Laplacians and the Cheeger inequality for directed graphsAnn. Combin., 9 (1) (2005), pp. 1-19
- [103]Alignment and integration of complex networks by hypergraph-based spectral clusteringPhys. Rev. E, 86 (5) (2012), p. 056111
- [104]Uniform hypergraph partitioning: provable tensor methods and sampling techniquesJ. Mach. Learn., 18 (1) (2017), pp. 1638-1678
- [105]Spectra of uniform hypergraphsLinear Algebra Appl., 436 (9) (2012), pp. 3268-3292
- [106]The Laplacian of a uniform hypergraphJ. Combin. Optim., 29 (2) (2015), pp. 331-366
- [107]Spectra of general hypergraphsLinear Algebra Appl., 518 (2017), pp. 14-30
- [108]X. Ouvrard, J.-M. L. Goff, S. Marchand-Maillet, Adjacency and tensor representation in general hypergraphs part 1: e-adjacency tensor uniformisation using homogeneous polynomials, arXiv Preprint (2017). 1712.08189
- [109]Spectral hypergraph theory of the adjacency hypermatrix and matroidsLinear Algebra Appl., 465 (2015), pp. 176-187
- [110]Tensor spectral clustering for partitioning higher-order network structuresSIAM Intl. Conf. on Data Mining, SIAM (2015), pp. 118-126
- [111]Three hypergraph eigenvector centralitiesSIAM J. Math. Data Sci., 1 (2) (2019), pp. 293-312
- [112]Introducing hypergraph signal processing: theoretical foundation and practical applicationsIEEE Int. Things J., 7 (1) (2019), pp. 639-660
- [113]Eigenvalues of a real supersymmetric tensorJ. Symbol. Comput., 40 (6) (2005), pp. 1302-1324
- [114]Autoregressive moving average graph filteringIEEE Trans. Signal Process., 65 (2) (2017), pp. 274-288
- [115]Tensor rank is NP-completeInternational Colloquium on Automata, Languages, and Programming, Springer (1989), pp. 451-460
- [116]Hypergraph Markov operators, eigenvalues and approximation algorithmsProceedings of the forty-seventh annual ACM symposium on Theory of computing (2015), pp. 713-722
- [117]Spectral properties of hypergraph laplacian and approximation algorithmsJ. ACM, 65 (3) (2018), pp. 1-48
- [118]Generalizing the hypergraph Laplacian via a diffusion process with mediatorsTheor. Comput. Sci., 806 (2020), pp. 416-428
- [119]Cheeger inequalities for submodular transformationsProceedings of the Thirtieth Annual ACM-SIAM Symposium on Discrete Algorithms (2019), pp. 2582-2601
- [120]CP-ORTHO: an orthogonal tensor factorization framework for spatio-temporal dataACM Intl. Conf. on Adv. in Geo. Inf. Syst. (SIGSPATIAL) (2017), pp. 1-4
- [121]A. Sharma, S. Joty, H. Kharkwal, J. Srivastava, Hyperedge2vec: distributed representations for hyperedges, 2018, (http://mesh.cs.umn.edu/papers/hyp2vec.pdf).
- [122]Symmetric tensors and symmetric tensor rankSIAM J. Matrix Anal. Appl., 30 (3) (2008), pp. 1254-1279
- [123]Numerical optimization for symmetric tensor decompositionMath. Prog., 151 (1) (2015), pp. 225-248
- [124]The total variation on hypergraphs-learning on hypergraphs revisitedAdv. Neural Info. Process. Syst. (NeurIPS) (2013), pp. 2427-2435
- [125]Efficient minimization of decomposable submodular functionsAdv. Neural Info. Process. Syst. (NeurIPS) (2010), pp. 2208-2216
- [126]Reflection methods for user-friendly submodular optimizationAdv. Neural Info. Process. Syst. (NeurIPS) (2013), pp. 1313-1321
- [127]On the convergence rate of decomposable submodular function minimizationAdv. Neural Info. Process. Syst. (NeurIPS) (2014), pp. 640-648
- [128]Random coordinate descent methods for minimizing decomposable submodular functionsIntl. Conf. Mach. Learn. (ICML) (2015), pp. 787-795
- [129]Decomposable submodular function minimization discrete and continuousAdv. Neural Info. Process. Syst. (NeurIPS) (2017), pp. 2874-2884
- [130]Revisiting decomposable submodular function minimization with incidence relationsAdv. Neural Info. Process. Syst. (NeurIPS) (2018), pp. 2242-2252
- [131]Quadratic decomposable submodular function minimization: theory and practiceJ. Mach. Learn., 21 (106) (2020), pp. 1-49
- [132]K. Fujii, T. Soma, Y. Yoshida, Polynomial-time algorithms for submodular laplacian systems, arXiv preprint arXiv:1803.10923(2018).
- [133]Hypergraph neural networksAAAI Conf. on Artif. Intell. (AAAI), 33 (2019), pp. 3558-3565
- [134]HyperGCN: a new method for training graph convolutional networks on hypergraphsAdv. Neural Info. Process. Syst. (NeurIPS) (2019), pp. 1511-1522
- [135]D. Arya, D.K. Gupta, S. Rudinac, M. Worring, HyperSAGE: Generalizing inductive representation learning on hypergraphs, arXiv Preprint (2020). 2010.04558
- [136]Inductive representation learning on large graphsAdv. Neural Info. Process. Syst. (NeurIPS) (2017), pp. 1024-1034
- [137]Dynamic hypergraph neural networks, Intl. Joint Conf. on Artif. Intell. (IJCAI) (2019), pp. 2635-2641
- [138]S. Bai, F. Zhang, P.H. Torr, Hypergraph convolution and hypergraph attention, arXiv Preprint (2019). 1901.08150
- [139]Powerset convolutional neural networksAdv. Neural Info. Process. Syst. (NeurIPS) (2019), pp. 929-940