https://is1-ssl.mzstatic.com/image/thumb/Podcasts211/v4/aa/6e/c5/aa6ec535-4bf6-6000-c0c7-d14efada8206/mza_9225735319382486240.jpg/600x600bb.jpg
The AI Concepts Podcast
Sheetal ’Shay’ Dhar
49 episodes
1 week ago
Technology, Education, Courses, Science
Module 2: Attention Is All You Need (The Concept)
The AI Concepts Podcast
11 minutes
1 week ago
Shay breaks down the 2017 paper "Attention Is All You Need" and introduces the transformer: a non-recurrent architecture that uses self-attention to process entire sequences in parallel. The episode explains positional encoding, how self-attention builds context-aware token representations, and the three key advantages over RNNs (parallelization, a global receptive field, and precise signal mixing), covers the quadratic computational trade-off, and teases a follow-up episode on the math behind attention.
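The episode's core idea can be sketched in a few lines. Below is a minimal NumPy illustration of scaled dot-product self-attention, the building block the episode describes: every token's query is compared against every token's key, and the resulting weights mix the value vectors into context-aware representations. All names and dimensions here are illustrative, not taken from the episode.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    # Project the same sequence into queries, keys, and values,
    # then mix the values by softmax-normalized query-key similarity.
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = Q.shape[-1]
    # The (seq_len, seq_len) score matrix is where the quadratic
    # cost in sequence length comes from.
    scores = Q @ K.T / np.sqrt(d_k)
    weights = softmax(scores, axis=-1)  # each row sums to 1
    return weights @ V, weights

rng = np.random.default_rng(0)
seq_len, d_model = 4, 8
X = rng.normal(size=(seq_len, d_model))   # hypothetical token embeddings
Wq = rng.normal(size=(d_model, d_model))
Wk = rng.normal(size=(d_model, d_model))
Wv = rng.normal(size=(d_model, d_model))

out, weights = self_attention(X, Wq, Wk, Wv)
```

Because every position attends to every other position in one matrix multiply, the whole sequence is processed in parallel and each token can draw on the full context (the "global receptive field" the episode mentions), rather than passing information step by step as an RNN does.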