
We review the evolution from **PageRank** to **Random Walk with Restart (RWR)** and its application to neural networks, focusing on five research papers spanning the original PageRank report (1998) to 2025. Collectively, the sources address methods for learning on graphs, particularly through **Random Walk Neural Networks (RWNNs)** and related random walk algorithms.

The primary source introduces RWNNs, whose architecture uses a random walk to generate a machine-readable record that is then processed by a deep neural network. It demonstrates that these models can achieve **universal approximation of graph functions** and avoid issues such as over-smoothing found in Message Passing Neural Networks (MPNNs). The same source explores **anonymization** and **named neighbors** as techniques for recording walks, and reports experiments on graph isomorphism and transductive classification using language models such as DeBERTa and Llama 3.

The remaining sources provide contextual support: they describe **RWR** parameters and evaluation criteria such as **Relative Accuracy** and **Relative Score** for related graph applications and datasets, connecting these methods back to established graph algorithms such as PageRank. A small sketch of the restart mechanism that links PageRank to RWR follows below.
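The following is a minimal, illustrative sketch of the RWR iteration shared by these methods: at each step the walker restarts at a seed node with some probability and otherwise follows a random edge. The function name, the restart probability of 0.15, and the toy adjacency matrix are assumptions for illustration, not values taken from the papers; replacing the single-seed restart vector with a uniform one recovers standard PageRank.

```python
import numpy as np

def random_walk_with_restart(A, seed, restart_prob=0.15, tol=1e-9, max_iter=1000):
    """Approximate the stationary distribution of a walk that restarts at `seed`
    with probability `restart_prob` and otherwise follows a random out-edge.
    With a uniform restart vector instead of a single seed, this reduces to PageRank."""
    n = A.shape[0]
    # Column-normalize the adjacency matrix so column j distributes j's probability mass.
    col_sums = A.sum(axis=0)
    col_sums[col_sums == 0] = 1.0            # guard against dangling (zero-degree) nodes
    W = A / col_sums
    e = np.zeros(n)
    e[seed] = 1.0                             # restart (personalization) vector
    r = np.full(n, 1.0 / n)                   # start from the uniform distribution
    for _ in range(max_iter):
        r_next = (1 - restart_prob) * (W @ r) + restart_prob * e
        if np.abs(r_next - r).sum() < tol:    # stop once the iteration has converged
            return r_next
        r = r_next
    return r

# Toy usage on a 4-node path graph (illustrative data only).
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
print(random_walk_with_restart(A, seed=0))
```

The resulting vector scores every node by its proximity to the seed, which is the quantity the RWR-based sources use as a relevance or exploration measure.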
Sources:
- 2025: Revisiting Random Walks for Learning on Graphs. https://proceedings.iclr.cc/paper_files/paper/2025/file/cd51b67dcb19db4e9f0022f500076b00-Paper-Conference.pdf
- October 3, 2022: Universal Multilayer Network Exploration by Random Walk with Restart. https://arxiv.org/pdf/2107.04565
- 2020: Random Walk Graph Neural Networks. https://proceedings.neurips.cc/paper/2020/file/ba95d78a7c942571185308775a97a3a0-Paper.pdf
- 2006: Fast Random Walk with Restart and Its Applications. https://www.cs.cmu.edu/~htong/pdf/ICDM06_tong.pdf
- January 29, 1998: The PageRank Citation Ranking: Bringing Order to the Web. https://www.cis.upenn.edu/~mkearns/teaching/NetworkedLife/pagerank.pdf