For Humanity: An AI Safety Podcast

Episode #26 - “Pause AI Or We All Die”: Holly Elmore Interview

112 min • May 1, 2024

Please Donate Here To Help Promote This Show

https://www.paypal.com/paypalme/forhumanitypodcast

In episode #26, host John Sherman and Pause AI US Founder Holly Elmore talk about AI risk. They discuss how AI surprised everyone by advancing so fast, what it’s like for employees at OpenAI working on safety, and why it’s so hard for people to imagine what they can’t imagine.

This podcast is not journalism. But it’s not opinion either. This show simply strings together the existing facts and underscores the unthinkable yet probable outcome: the end of all life on Earth.

For Humanity: An AI Safety Podcast is the accessible AI safety podcast for all humans, no tech background required. Our show focuses solely on the threat of human extinction from AI.

Peabody Award-winning former journalist John Sherman explores the shocking worst-case scenario of artificial intelligence: human extinction. The makers of AI openly admit that their work could kill all humans, in as little as 2 years. This podcast is solely about the threat of human extinction from AGI. We’ll meet the heroes and villains, explore the issues and ideas, and learn what you can do to help save humanity.

Resources:

Azeem Azhar + Connor Leahy Podcast

Debating the existential risk of AI, with Connor Leahy

Best Account on Twitter: AI Notkilleveryoneism Memes 

https://twitter.com/AISafetyMemes

JOIN THE FIGHT, help Pause AI!!!!

Pause AI

Join the Pause AI Weekly Discord Thursdays at 3pm EST

https://discord.com/invite/pVMWjddaW7

22-Word Statement from the Center for AI Safety

Statement on AI Risk | CAIS

https://www.safe.ai/work/statement-on-ai-risk
