In this episode I discuss Mustafa Suleyman’s book The Coming Wave, and explore why rapid diffusion, opaque decision-making, and runaway technological acceleration pose serious risks to public trust and global stability. I break down the core dynamics of containment, including Goodhart’s law, gatekeeping, fat-tailed risks, feedback lag, irreversibility, and the alignment tax. I then relate these dynamics to Suleyman’s proposed ten-point plan for navigating this "narrow path" of AI containment....
In this episode, I discuss Chris Miller’s book Chip War, and use it to argue that the real lesson of the semiconductor race isn’t just about who controls the technology, but who possesses the tacit knowledge—the deeply embedded, experience-based know-how—that makes technology work, and why that makes global dependency far harder to escape than it seems.