Simple Science Deep Dive
Nguyen K. Tram, Ph.D.
28 episodes
6 days ago
Cut through the jargon and get to the heart of groundbreaking research. Simple Science Deep Dive translates complex studies into stories you can understand. *Disclaimer: The content of this podcast was generated by NotebookLM and has been reviewed for accuracy by Dr. Tram.*
Science
Training Neural Networks to Use Time Like the Brain Does
Simple Science Deep Dive
14 minutes
2 months ago

Featured paper: [**Efficient event-based delay learning in spiking neural networks**](https://doi.org/10.1038/s41467-025-65394-8)

What if AI could learn to use time the way your brain does, with a fraction of the energy? In this episode, we explore groundbreaking research that's revolutionizing spiking neural networks by teaching them to master synaptic delays. Discover how this brain-inspired approach uses sparse, event-driven spikes instead of constant data streams, slashing energy consumption while processing temporal information. We dive into the breakthrough EventProp algorithm, which calculates exact gradients for both connection weights and delays, running 26 times faster than previous methods while using half the memory. Learn why adding learnable delays transforms small networks into powerhouses, achieving state-of-the-art accuracy with five times fewer parameters on speech recognition tasks. Join us as we unpack how this event-based training is paving the way for neuromorphic hardware that thinks like the brain but runs on just 20 watts of power. Perfect for anyone fascinated by the future of energy-efficient AI that truly understands the language of time.

*Disclaimer: This content was generated by NotebookLM. Dr. Tram doesn't know anything about this topic and is learning about it.*
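The episode's central idea, treating per-synapse delays as trainable parameters alongside weights in an event-driven spiking network, can be illustrated with a toy sketch. This is not the paper's EventProp implementation; the leaky integrate-and-fire model, function name, and parameter values below are illustrative assumptions only.

```python
import math

def simulate_lif(spike_trains, delays, weights, tau=5.0, threshold=1.0):
    """Toy event-driven leaky integrate-and-fire (LIF) neuron.

    spike_trains[i] lists input spike times on synapse i; each spike is
    felt at time spike + delays[i]. Making `delays` trainable next to
    `weights` is the idea the episode describes (illustrative sketch,
    not the paper's method).
    """
    # Event-based view: only (arrival time, weight) events exist,
    # with no dense data stream to process between spikes.
    arrivals = sorted(
        (t + delays[i], weights[i])
        for i, train in enumerate(spike_trains)
        for t in train
    )
    v, t_prev, out_spikes = 0.0, 0.0, []
    for t_arr, w in arrivals:
        v *= math.exp(-(t_arr - t_prev) / tau)  # membrane leak between events
        v += w                                  # instantaneous jump on arrival
        t_prev = t_arr
        if v >= threshold:                      # fire, then reset
            out_spikes.append(t_arr)
            v = 0.0
    return out_spikes
```

With zero delays, two weak inputs at times 0.0 and 3.0 arrive too far apart to cross threshold and the neuron stays silent; learning a delay of 3.0 on the first synapse aligns the arrivals and the neuron fires. That alignment of spike timing is the kind of temporal computation learnable delays unlock.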
