Pattern vectors from algebraic graph theory

Wilson, R. C.; Hancock, E. R.; Luo, B. (2005)
  • Subject:
    ACM: Computer Applications / Computers in Other Systems
    arXiv: Computer Science :: Databases

Graph structures have proven computationally cumbersome for pattern analysis. The reason for this is that, before graphs can be converted to pattern vectors, correspondences must be established between the nodes of structures which are potentially of different size. To overcome this problem, in this paper, we turn to the spectral decomposition of the Laplacian matrix. We show how the elements of the spectral matrix for the Laplacian can be used to construct symmetric polynomials that are permutation invariants. The coefficients of these polynomials can be used as graph features which can be encoded in a vectorial manner. We extend this representation to graphs in which there are unary attributes on the nodes and binary attributes on the edges by using the spectral decomposition of a Hermitian property matrix that can be viewed as a complex analogue of the Laplacian. To embed the graphs in a pattern space, we explore whether the vectors of invariants can be embedded in a low-dimensional space using a number of alternative strategies, including principal components analysis (PCA), multidimensional scaling (MDS), and locality preserving projection (LPP). Experimentally, we demonstrate that the embeddings result in well-defined graph clusters. Our experiments with the spectral representation involve both synthetic and real-world data. The experiments with synthetic data demonstrate that the distances between spectral feature vectors can be used to discriminate between graphs on the basis of their structure. The real-world experiments show that the method can be used to locate clusters of graphs.
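
As a concrete illustration of the construction summarized above, the sketch below computes a permutation-invariant pattern vector for an unattributed graph: it forms the Laplacian, scales each eigenvector by the square root of its eigenvalue to obtain the spectral matrix, and takes the elementary symmetric polynomials of each column's entries via Newton's identities. This is a minimal sketch, assuming graphs are supplied as symmetric NumPy adjacency matrices; the function names and the signed-log compression of the coefficients are illustrative choices rather than the paper's exact formulation, the eigenvector sign ambiguity is left unresolved, and the attributed (Hermitian property matrix) variant is not shown.

```python
import numpy as np

def laplacian(adj):
    """Graph Laplacian L = D - A for a symmetric (0/1 or weighted) adjacency matrix."""
    return np.diag(adj.sum(axis=1)) - adj

def spectral_matrix(adj):
    """Column i is sqrt(lambda_i) * phi_i, built from the Laplacian eigendecomposition."""
    vals, vecs = np.linalg.eigh(laplacian(adj))
    vals = np.clip(vals, 0.0, None)   # guard against tiny negative round-off
    return vecs * np.sqrt(vals)

def elementary_symmetric(x):
    """Elementary symmetric polynomials e_1..e_n of the entries of x,
    computed from the power sums p_k via Newton's identities."""
    n = x.size
    p = np.array([np.sum(x ** k) for k in range(1, n + 1)])
    e = np.zeros(n + 1)
    e[0] = 1.0
    for k in range(1, n + 1):
        e[k] = sum((-1) ** (i - 1) * e[k - i] * p[i - 1] for i in range(1, k + 1)) / k
    return e[1:]

def pattern_vector(adj):
    """Permutation-invariant feature vector: the symmetric-polynomial
    coefficients of every column of the spectral matrix, stacked."""
    Phi = spectral_matrix(adj)
    feats = np.concatenate([elementary_symmetric(Phi[:, i]) for i in range(Phi.shape[1])])
    # A signed log compresses the large dynamic range of the coefficients.
    return np.sign(feats) * np.log1p(np.abs(feats))

# Example: a 4-node path and a 4-node cycle yield distinct pattern vectors.
path  = np.array([[0,1,0,0],[1,0,1,0],[0,1,0,1],[0,0,1,0]], dtype=float)
cycle = np.array([[0,1,0,1],[1,0,1,0],[0,1,0,1],[1,0,1,0]], dtype=float)
print(np.linalg.norm(pattern_vector(path) - pattern_vector(cycle)))
```

For a corpus of graphs, the resulting vectors can be stacked into a data matrix and projected to two or three dimensions with PCA, MDS, or LPP to visualize clusters, for example using scikit-learn's PCA or MDS implementations.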