AI: post transformers
mcgrof
316 episodes
2 days ago
The transformer architecture revolutionized the world of neural networks. It was a springboard for what we know today as modern artificial intelligence. This podcast focuses on reviewing state-of-the-art research papers, from the transformer onward.
Technology
Random Walk Methods for Graph Learning and Networks
AI: post transformers
14 minutes 57 seconds
1 week ago

We review the evolution from PageRank to Random Walk with Restart (RWR) and its application to neural networks, focusing on five research papers dating from the original PageRank report (1998) to 2025. Collectively, they cover methods for learning on graphs, particularly through **Random Walk Neural Networks (RWNNs)** and related random walk algorithms. The primary source introduces RWNNs, detailing an architecture in which a random walk produces a machine-readable record that is then processed by a deep neural network, and demonstrating that these models can achieve **universal approximation of graph functions** while avoiding issues such as over-smoothing found in Message Passing Neural Networks (MPNNs). It also explores techniques such as **anonymization** and **named neighbors** for recording walks, and includes experimental results on graph isomorphism and transductive classification using language models like DeBERTa and Llama 3. The remaining sources provide brief contextual support, covering **Random Walk with Restart (RWR)** parameters and evaluation criteria such as **Relative Accuracy** and **Relative Score** for related graph applications and datasets, and connecting this line of work back to established graph algorithms such as PageRank.
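
To make the mechanics concrete, here is a minimal, self-contained Python sketch, not drawn from any of the papers, of two ideas discussed in the episode: a random walk with restart over a toy adjacency-list graph, and the anonymization of a walk into the kind of machine-readable record an RWNN can feed to a downstream neural network. The graph, the function names (random_walk_with_restart, anonymize_walk), and the parameter choices are illustrative assumptions, not the papers' actual implementations.

```python
import random
from collections import Counter

# Toy undirected graph as an adjacency list (hypothetical example data).
GRAPH = {
    "a": ["b", "c"],
    "b": ["a", "c", "d"],
    "c": ["a", "b", "d"],
    "d": ["b", "c", "e"],
    "e": ["d"],
}

def random_walk_with_restart(graph, seed, steps=10_000, restart_prob=0.15, rng=None):
    """Simulate an RWR from `seed`: at each step, jump back to the seed with
    probability `restart_prob`, otherwise move to a uniformly random neighbor.
    Visit frequencies approximate RWR proximity scores; PageRank is the variant
    where restarts go to a node chosen at random instead of a fixed seed."""
    rng = rng or random.Random(0)
    visits = Counter()
    node = seed
    for _ in range(steps):
        if rng.random() < restart_prob:
            node = seed
        else:
            node = rng.choice(graph[node])
        visits[node] += 1
    total = sum(visits.values())
    return {n: count / total for n, count in visits.items()}

def anonymize_walk(walk):
    """Replace node identities with the index of their first appearance,
    so the record describes the walk's structure rather than raw node IDs."""
    first_seen, record = {}, []
    for node in walk:
        if node not in first_seen:
            first_seen[node] = len(first_seen)
        record.append(first_seen[node])
    return record

if __name__ == "__main__":
    scores = random_walk_with_restart(GRAPH, seed="a")
    print("approximate RWR scores from 'a':", scores)

    rng = random.Random(1)
    walk = ["a"]
    for _ in range(8):  # a short plain random walk to record
        walk.append(rng.choice(GRAPH[walk[-1]]))
    print("walk:      ", walk)
    print("anonymized:", anonymize_walk(walk))
```

The visit frequencies concentrate on nodes near the seed, and walks over differently labeled but structurally identical neighborhoods yield the same anonymized record, which is the property that makes such records useful as neural network inputs.
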


Sources:

2025: Revisiting Random Walks for Learning on Graphs
https://proceedings.iclr.cc/paper_files/paper/2025/file/cd51b67dcb19db4e9f0022f500076b00-Paper-Conference.pdf

October 3, 2022: Universal Multilayer Network Exploration by Random Walk with Restart
https://arxiv.org/pdf/2107.04565

2020: Random Walk Graph Neural Networks
https://proceedings.neurips.cc/paper/2020/file/ba95d78a7c942571185308775a97a3a0-Paper.pdf

2006: Fast Random Walk with Restart and Its Applications
https://www.cs.cmu.edu/~htong/pdf/ICDM06_tong.pdf

January 29, 1998: The PageRank Citation Ranking: Bringing Order to the Web
https://www.cis.upenn.edu/~mkearns/teaching/NetworkedLife/pagerank.pdf
