There are lots of great charitable giving opportunities out there right now.
The first time that I served as a recommender in the Survival and Flourishing Fund (SFF) was back in 2021. I wrote in detail about my experiences then. At the time, I did not see many great opportunities, and was able to give out as much money as I found good places to put it.
How the world has changed in three years.
I recently had the opportunity to be an SFF recommender for the second time. This time I found an embarrassment of riches. Application quality was consistently higher, there were more than twice as many applications, and essentially all applicant organizations were looking to scale their operations and spending.
That means the focus of this post is different. In 2021, my primary goal was to share my perspective on [...]
---
Outline:
(01:39) A Word of Warning
(02:44) Use Your Personal Theory of Impact
(04:13) Use Your Local Knowledge
(05:10) Unconditional Grants to Worthy Individuals Are Great
(06:55) Do Not Think Only On the Margin, and Also Use Decision Theory
(07:48) And the Nominees Are
(10:55) Organizations That Are Literally Me
(11:10) Balsa Research
(12:56) Don’t Worry About the Vase
(14:19) Organizations Focusing On Non-Technical AI Research and Education
(14:37) The Scenario Project
(15:48) Lightcone Infrastructure
(17:20) Effective Institutions Project (EIP)
(18:06) Artificial Intelligence Policy Institute (AIPI)
(19:10) Psychosecurity Ethics at EURAIO
(20:07) Palisade Research
(21:07) AI Safety Info (Robert Miles)
(21:51) Intelligence Rising
(22:32) Convergence Analysis
(23:29) Longview Philanthropy
(24:27) Organizations Focusing Primarily On AI Policy and Diplomacy
(25:06) Center for AI Safety and the CAIS Action Fund
(26:00) MIRI
(26:59) Foundation for American Innovation (FAI)
(28:58) Center for AI Policy (CAIP)
(29:58) Encode Justice
(30:57) The Future Society
(31:42) Safer AI
(32:26) Institute for AI Policy and Strategy (IAPS)
(33:13) AI Standards Lab
(34:05) Safe AI Forum
(34:40) CLTR at Founders Pledge
(35:54) Pause AI and Pause AI Global
(36:57) Existential Risk Observatory
(37:37) Simon Institute for Longterm Governance
(38:21) Legal Advocacy for Safe Science and Technology
(39:17) Organizations Doing ML Alignment Research
(40:16) Model Evaluation and Threat Research (METR)
(41:28) Alignment Research Center (ARC)
(42:02) Apollo Research
(42:53) Cybersecurity Lab at University of Louisville
(43:44) Timaeus
(44:39) Simplex
(45:08) FAR AI
(45:41) Alignment in Complex Systems Research Group
(46:23) Apart Research
(47:06) Transluce
(48:00) Atlas Computing
(48:45) Organizations Doing Math, Decision Theory and Agent Foundations
(50:05) Orthogonal
(50:47) Topos Institute
(51:37) Eisenstat Research
(52:13) ALTER (Affiliate Learning-Theoretic Employment and Resources) Project
(53:00) Mathematical Metaphysics Institute
(54:06) Focal at CMU
(55:15) Organizations Doing Cool Other Stuff Including Tech
(55:26) MSEP Project at Science and Technology Futures
(56:26) ALLFED
(57:51) Good Ancestor Foundation
(59:10) Charter Cities Institute
(59:50) German Primate Center (DPZ) – Leibniz Institute for Primate Research
(01:01:08) Carbon Copies for Independent Minds
(01:01:44) Organizations Focused Primarily on Bio Risk
(01:01:50) Secure DNA
(01:02:46) Blueprint Biosecurity
(01:03:35) Pour Domain
(01:04:17) Organizations That Then Regrant to Fund Other Organizations
(01:05:14) SFF Itself (!)
(01:06:10) Manifund
(01:08:02) AI Risk Mitigation Fund
(01:08:39) Long Term Future Fund
(01:10:16) Foresight
(01:11:08) Centre for Enabling Effective Altruism Learning and Research (CEELAR)
(01:11:43) Organizations That Are Essentially Talent Funnels
(01:13:40) AI Safety Camp
(01:14:23) Center for Law and AI Risk
(01:15:22) Speculative Technologies
(01:16:19) Talos Network
(01:17:11) MATS Research
(01:17:48) Epistea
(01:18:52) Emergent Ventures (Special Bonus Organization, was not part of SFF)
(01:20:32) AI Safety Cape Town
(01:21:08) Impact Academy Limited
(01:21:47) Principles of Intelligent Behavior in Biological and Social Systems (PIBBSS)
(01:22:34) Tarbell Fellowship at PPF
(01:23:32) Catalyze Impact
(01:24:32) Akrose
(01:25:14) CeSIA within EffiSciences
(01:25:59) Stanford Existential Risk Initiative (SERI)
---
First published:
November 29th, 2024
Source:
https://www.lesswrong.com/posts/9n87is5QsCozxr9fp/the-big-nonprofits-post
Narrated by TYPE III AUDIO.