The AI Concepts Podcast
Sheetal ’Shay’ Dhar
49 episodes · updated 1 week ago
Categories: Technology, Education, Courses, Science
All content for The AI Concepts Podcast is the property of Sheetal ’Shay’ Dhar and is served directly from their servers with no modification, redirects, or rehosting. The podcast is not affiliated with or endorsed by Podjoint in any way.
Module 2: The MLP Layer - Where Transformers Store Knowledge
7 minutes · 1 week ago
Shay explains where a transformer actually stores knowledge: not in attention, but in the MLP (feed-forward) layer. The episode frames the transformer block as a two-step loop: attention moves information between tokens, then the MLP transforms each token’s representation independently to inject learned knowledge.
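A minimal sketch of that two-step loop, assuming a standard pre-norm transformer block in PyTorch. The dimensions, layer choices, and class name here are illustrative, not taken from the episode:

```python
import torch
import torch.nn as nn

class TransformerBlock(nn.Module):
    """One pre-norm transformer block: attention mixes information
    across tokens, then the MLP transforms each token independently."""
    def __init__(self, d_model=512, n_heads=8, d_ff=2048):
        super().__init__()
        self.norm1 = nn.LayerNorm(d_model)
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.norm2 = nn.LayerNorm(d_model)
        # The MLP (feed-forward) layer is applied position-wise: each
        # token's vector is transformed on its own, with no mixing
        # across positions. This is the part of the block the episode
        # identifies as where learned knowledge is stored.
        self.mlp = nn.Sequential(
            nn.Linear(d_model, d_ff),
            nn.GELU(),
            nn.Linear(d_ff, d_model),
        )

    def forward(self, x):  # x: (batch, seq_len, d_model)
        # Step 1: attention moves information between tokens.
        h = self.norm1(x)
        attn_out, _ = self.attn(h, h, h, need_weights=False)
        x = x + attn_out
        # Step 2: the MLP transforms each token's representation independently.
        x = x + self.mlp(self.norm2(x))
        return x

# Usage: a batch of 2 sequences, 16 tokens each.
block = TransformerBlock()
out = block(torch.randn(2, 16, 512))
print(out.shape)  # torch.Size([2, 16, 512])
```

Note that in `forward`, only the attention step lets tokens see each other; the MLP sees one position at a time, which is why the two steps split cleanly into "move information" and "inject knowledge".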