Discovery Engines – with Nabil
Nabil Laoudji, YesAnd Labs LLC
13 episodes
1 week ago
Featuring the people and platforms accelerating scientific discovery. Hosted by Nabil Laoudji. More at www.discoveryengines.co
Science
Building Safe AI - MIT AI Alignment's Riya Tyagi and Gatlen Culp
Discovery Engines – with Nabil
1 hour 6 minutes 32 seconds
2 months ago

Riya and Gatlen are board members of MIT AI Alignment, an organization focused on reducing risks from advanced artificial intelligence. Join us as we map the AI safety landscape, define key risks, examine generational perspectives, and explore how we can work together to build a safer future for humanity.

Hosted by Nabil Laoudji. See below for episode links, chapters, and our socials. 🙏🙌

Episode Links:

  • MIT AI Alignment
  • Cambridge Boston Alignment Initiative (CBAI)
  • Riya's LinkedIn
  • Gatlen's LinkedIn
  • Gatlen's Projects
  • Slaughterbots x Future of Life Institute
  • Track II Diplomacy
  • Tegmark AI Safety Group
  • "Nexus" by Yuval Noah Harari
  • RAND Institute
  • BlueDot Impact
  • "If Anyone Builds It, Everyone Dies" by Eliezer Yudkowsky and Nate Soares

Chapters:

  • (00:00) - Preview & Intro
  • (01:48) - What Is MAIA?
  • (02:47) - Why AI Safety?
  • (09:26) - Trends in AI Safety Interest
  • (12:46) - AI Safety Techniques: MechInterp & Beyond
  • (17:15) - Model Situational Awareness
  • (20:58) - Hybrid, Mixture of Experts Models
  • (24:40) - Decomposing a Model & Parallels with Human Brains
  • (29:00) - Private Capital for Safety Research
  • (32:23) - Frontier Lab Mentorship Programs
  • (34:14) - Policy Perspectives & China Competition
  • (36:53) - Ways In Which AI Might Threaten Us
  • (39:31) - Track 2 Diplomacy & International Collaboration Examples
  • (43:13) - Slaughterbots & Dangerous Capability Demos
  • (46:54) - AI-Driven Unemployment
  • (52:06) - Generational Attitudes Towards AI
  • (57:44) - How to Get Involved - Non-Technical
  • (01:01:00) - Learning Resources: BlueDot Impact, etc.
  • (01:01:46) - Importance of Communicators & Artists
  • (01:03:50) - How to Support MAIA

Connect With Us:

  • Our Newsletter
  • Twitter
  • LinkedIn
  • Bluesky
  • Feedback or questions

Platforms:

  • YouTube
  • Apple Podcasts
  • Spotify Podcasts