A friendly, intuitive tour of Hidden Markov Models (HMMs). Using the relatable 'full trash bin means he's home' metaphor, we explore how to infer unseen states from noisy observations, learn the model parameters with Baum–Welch, and decode the most likely state sequence with the Viterbi algorithm. You’ll see how forward–backward smoothing combines evidence from past and future, and how these ideas power real-world AI—from speech recognition to gene finding and beyond. Note: This podcast was ...
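The episode stays at the intuition level, so as a companion here is a minimal sketch of the two decoding ideas it mentions, Viterbi and forward–backward smoothing, applied to the trash-bin example. The state names, observation symbols, and all probabilities below are illustrative assumptions, not values from the episode.

```python
# Minimal HMM sketch for the "full trash bin means he's home" example.
# All numbers below are made-up illustrative values, not from the episode.
import numpy as np

states = ["home", "away"]                # hidden states we never observe directly
obs_symbols = ["bin_full", "bin_empty"]  # what we actually see each day

pi = np.array([0.6, 0.4])                # initial state probabilities P(state_0)
A = np.array([[0.7, 0.3],                # transition matrix P(state_t | state_{t-1})
              [0.4, 0.6]])
B = np.array([[0.8, 0.2],                # emission matrix P(observation | state)
              [0.3, 0.7]])

obs = [0, 0, 1, 0]                       # observed days: full, full, empty, full


def viterbi(obs, pi, A, B):
    """Decode the single most likely hidden-state sequence."""
    T, N = len(obs), len(pi)
    delta = np.zeros((T, N))             # best path probability ending in each state
    psi = np.zeros((T, N), dtype=int)    # back-pointers to the best predecessor
    delta[0] = pi * B[:, obs[0]]
    for t in range(1, T):
        for j in range(N):
            scores = delta[t - 1] * A[:, j]
            psi[t, j] = np.argmax(scores)
            delta[t, j] = scores[psi[t, j]] * B[j, obs[t]]
    path = [int(np.argmax(delta[-1]))]
    for t in range(T - 1, 0, -1):        # follow back-pointers to recover the path
        path.append(int(psi[t, path[-1]]))
    return list(reversed(path))


def forward_backward(obs, pi, A, B):
    """Smoothed posteriors P(state_t | all observations), past and future combined."""
    T, N = len(obs), len(pi)
    alpha = np.zeros((T, N))             # forward pass: evidence from the past
    beta = np.zeros((T, N))              # backward pass: evidence from the future
    alpha[0] = pi * B[:, obs[0]]
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
    beta[-1] = 1.0
    for t in range(T - 2, -1, -1):
        beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])
    gamma = alpha * beta                 # combine, then normalize per time step
    return gamma / gamma.sum(axis=1, keepdims=True)


print([states[s] for s in viterbi(obs, pi, A, B)])
print(forward_backward(obs, pi, A, B))
```

Baum–Welch, the learning step the episode describes, would iterate this same forward–backward pass to re-estimate pi, A, and B from the posteriors until they stop changing; longer sequences would also call for log-space or scaled arithmetic to avoid underflow.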
The Imperial Jade: Nephrite and the Global History of a Stone
Intellectually Curious
5 minutes
1 week ago
A global tour of nephrite jade, the “imperial gem” prized above gold. We explore its buttery mutton-fat luster and legendary toughness, why interlocking tremolite/actinolite fibers make it famously hard to break, and how this one mineral connected civilizations from ancient China and the Silk Road to the maritime jade routes of Southeast Asia and New Zealand. From tools and ornaments to currency and sacred heirlooms, nephrite has shaped culture, trade, and ritual across millennia. Note: This...