# GraRep: Learning Graph Representations with Global Structural Information

```bibtex
@article{Cao2015GraRepLG,
  title   = {GraRep: Learning Graph Representations with Global Structural Information},
  author  = {Shaosheng Cao and Wei Lu and Qiongkai Xu},
  journal = {Proceedings of the 24th ACM International on Conference on Information and Knowledge Management},
  year    = {2015}
}
```

#### 1,015 Citations

Representation Learning of Reconstructed Graphs Using Random Walk Graph Convolutional Network

- Computer Science
- ArXiv
- 2021

Combining high-order local structural information is believed to exploit the potential of the network more effectively, which can greatly improve the learning efficiency of graph neural networks and promote the development of new learning models.

Representation Learning of Graphs Using Graph Convolutional Multilayer Networks Based on Motifs

- Computer Science
- Neurocomputing
- 2021

This work proposes mGCMN, a novel framework which utilizes node feature information and the higher-order local structure of the graph to effectively generate node embeddings for previously unseen data.

Learning Structural Node Representations on Directed Graphs

- Computer Science
- COMPLEX NETWORKS
- 2018

Although struc2vec++ is in most cases outperformed by the competing algorithm, experiments in a variety of scenarios demonstrate that it is much more memory-efficient and can better capture structural roles in the presence of noise.

SERL: Semantic-Path Biased Representation Learning of Heterogeneous Information Network

- Computer Science
- KSEM
- 2018

The SERL method formalizes how to fuse different semantic paths during the random-walk procedure when exploring the neighborhood of the corresponding node, and then leverages a heterogeneous skip-gram model to perform node embedding.

Learning Graph Representation: A Comparative Study

- Computer Science
- 2018 International Arab Conference on Information Technology (ACIT)
- 2018

This paper summarizes recent techniques and methods for graph representation learning and compares them in terms of methodology and technique, highlighting the need for such a comparison of existing methods.

GraphSAD: Learning Graph Representations

- 2020

Graph Neural Networks (GNNs) learn effective node/graph representations by aggregating the attributes of neighboring nodes, which commonly derives a single representation mixing the information of…

A Time-Aware Inductive Representation Learning Strategy for Heterogeneous Graphs

- 2019

Graphs are versatile data structures that have permeated a large number of application fields, such as biochemistry, knowledge graphs, and social networks. As a result, different graph representation…

A Structural Graph Representation Learning Framework

- Computer Science
- WSDM
- 2020

This work formulates higher-order network representation learning and describes a general framework called HONE for learning such structural node embeddings from networks via the subgraph patterns (network motifs, graphlet orbits/positions) in a node's neighborhood.

Walklets: Multiscale Graph Embeddings for Interpretable Network Classification

- Computer Science, Physics
- ArXiv
- 2016

These representations clearly encode multiscale vertex relationships in a continuous vector space suitable for multi-label classification problems, outperform new methods based on neural matrix factorization, and can scale to graphs with millions of vertices and edges.

Deep Neural Networks for Learning Graph Representations

- Computer Science
- AAAI
- 2016

A novel model for learning graph representations, which generates a low-dimensional vector representation for each vertex by capturing the graph's structural information directly, and which outperforms other state-of-the-art models on such tasks.

#### References

Showing 1-10 of 35 references

LINE: Large-scale Information Network Embedding

- Computer Science
- WWW
- 2015

A novel network embedding method called LINE, which is suitable for arbitrary types of information networks (undirected, directed, and/or weighted) and optimizes a carefully designed objective function that preserves both the local and the global network structure.

DeepWalk: online learning of social representations

- Computer Science
- KDD
- 2014

DeepWalk is an online learning algorithm which builds useful incremental results and is trivially parallelizable, which makes it suitable for a broad class of real-world applications such as network classification and anomaly detection.
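As a minimal sketch of the core idea behind this line of work (truncated random walks treated as "sentences" that are later fed to a skip-gram model), the walk-generation step could look like the following; the adjacency-dict representation and the function name are illustrative assumptions, not DeepWalk's actual implementation:

```python
import random

def random_walks(adj, num_walks, walk_length, seed=0):
    """Generate truncated random walks over a graph.

    adj: dict mapping each node to a list of its neighbors.
    Returns num_walks walks per node, each of at most walk_length nodes.
    """
    rng = random.Random(seed)
    walks = []
    nodes = list(adj)
    for _ in range(num_walks):
        rng.shuffle(nodes)  # randomize the order of start nodes each pass
        for start in nodes:
            walk = [start]
            while len(walk) < walk_length:
                neighbors = adj[walk[-1]]
                if not neighbors:  # dead end: stop this walk early
                    break
                walk.append(rng.choice(neighbors))
            walks.append(walk)
    return walks

# Toy graph: a 4-cycle 0-1-2-3-0
adj = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
walks = random_walks(adj, num_walks=2, walk_length=5)
```

In the full pipeline these walks would then be passed to a word2vec-style trainer, with nodes playing the role of words.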

Distributed large-scale natural graph factorization

- Computer Science
- WWW '13
- 2013

This work proposes a novel factorization technique that relies on partitioning a graph so as to minimize the number of neighboring vertices, rather than edges, across partitions; the decomposition is based on a streaming algorithm.

Relational learning via latent social dimensions

- Computer Science
- KDD
- 2009

This work proposes to extract latent social dimensions based on network information and then utilize them as features for discriminative learning; it outperforms representative relational-learning methods based on collective inference, especially when few labeled data are available.

Learning Deep Representations for Graph Clustering

- Computer Science
- AAAI
- 2014

This work proposes a simple method which first learns a nonlinear embedding of the original graph with a stacked autoencoder and then runs the k-means algorithm on the embedding to obtain the clustering result, significantly outperforming conventional spectral clustering.

ArnetMiner: extraction and mining of academic social networks

- Computer Science
- KDD
- 2008

The architecture and main features of the ArnetMiner system, which aims at extracting and mining academic social networks, are described, and a unified modeling approach for simultaneously modeling the topical aspects of papers, authors, and publication venues is proposed.

GloVe: Global Vectors for Word Representation

- Computer Science
- EMNLP
- 2014

A new global log-bilinear regression model that combines the advantages of the two major model families in the literature (global matrix factorization and local context-window methods) and produces a vector space with meaningful substructure.
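For reference, the global log-bilinear objective this snippet alludes to is, in the GloVe paper's notation, a weighted least-squares fit to log co-occurrence counts:

```latex
J = \sum_{i,j=1}^{V} f(X_{ij})\,\bigl(w_i^{\top}\tilde{w}_j + b_i + \tilde{b}_j - \log X_{ij}\bigr)^{2},
\qquad
f(x) =
\begin{cases}
(x/x_{\max})^{\alpha} & \text{if } x < x_{\max},\\
1 & \text{otherwise,}
\end{cases}
```

where $X_{ij}$ is the word-context co-occurrence count, $w_i$ and $\tilde{w}_j$ are word and context vectors, and $b_i, \tilde{b}_j$ are bias terms.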

Visualizing Data using t-SNE

- Mathematics
- 2008

We present a new technique called "t-SNE" that visualizes high-dimensional data by giving each datapoint a location in a two- or three-dimensional map. The technique is a variation of Stochastic…

Distributed Representations of Words and Phrases and their Compositionality

- Computer Science, Mathematics
- NIPS
- 2013

This paper presents a simple method for finding phrases in text, shows that learning good vector representations for millions of phrases is possible, and describes a simple alternative to the hierarchical softmax called negative sampling.

Neural Word Embedding as Implicit Matrix Factorization

- Computer Science, Mathematics
- NIPS
- 2014

It is shown that using a sparse Shifted Positive PMI word-context matrix to represent words improves results on two word-similarity tasks and one of two analogy tasks, and it is conjectured that this stems from the weighted nature of SGNS's factorization.
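The Shifted Positive PMI construction mentioned here is simple enough to sketch directly; the following is a minimal illustrative version over a list of (word, context) pairs, with SPPMI(w, c) = max(PMI(w, c) - log k, 0), where k is the negative-sampling shift. The function name and input format are assumptions for the example, not the paper's code:

```python
import math
from collections import Counter

def sppmi(pairs, shift_k=1):
    """Build a sparse Shifted Positive PMI matrix from (word, context) pairs.

    Returns a dict mapping (word, context) to max(PMI - log(shift_k), 0),
    keeping only strictly positive entries (hence 'sparse').
    """
    pair_counts = Counter(pairs)
    w_counts = Counter(w for w, _ in pairs)
    c_counts = Counter(c for _, c in pairs)
    total = len(pairs)
    mat = {}
    for (w, c), n in pair_counts.items():
        # PMI(w, c) = log( P(w, c) / (P(w) * P(c)) )
        pmi = math.log(n * total / (w_counts[w] * c_counts[c]))
        val = max(pmi - math.log(shift_k), 0.0)
        if val > 0:
            mat[(w, c)] = val
    return mat

pairs = [("cat", "sat"), ("cat", "sat"), ("dog", "ran"), ("cat", "ran")]
m = sppmi(pairs, shift_k=1)
```

Factorizing this sparse matrix (e.g. with truncated SVD) then yields word embeddings; the same PMI-of-a-co-occurrence-matrix view underlies GraRep's treatment of k-step transition probabilities.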