Win a Scott Aaronson Speculation Grant!
Exciting news, everyone! Jaan Tallinn, whom many of you might recognize as a co-creator of Skype, tech enthusiast, and philanthropist, has graciously invited me, along with a bunch of other nerds, to join the new Speculation Grants program of the Survival and Flourishing Fund (SFF). In plain language, that means that Jaan is giving me $200,000 to distribute to charitable organizations in any way I see fit—though ideally, my choices will have something to do with the survival and flourishing of our planet and civilization.
(If all goes well, this blog post will actually lead to a lot more than just $200,000 in donations, because it will inspire applications to SFF that can then be funded by other “Speculators” or by SFF’s usual process.)
Thinking about how to handle the responsibility of this amazing and unexpected gift, I decided that I couldn’t possibly improve on what Scott Alexander did with his personal grants program on Astral Codex Ten. Thus: I hereby invite the readers of Shtetl-Optimized to pitch registered charities (which might or might not be their own)—especially charities that are relatively small, unknown, and unappreciated, yet that would resonate strongly with someone who thinks the way I do. Feel free to renominate (i.e., bring back to my attention) charities that were mentioned when I asked a similar question after winning $250,000 from the ACM Prize in Computing.
If you’re interested, there’s a two-step process this time:
Step 1 is to make your pitch to me, either by a comment on this post or by email, depending on whether you’d prefer the pitch to be public or private. Let’s set a deadline for this step of Thursday, January 27, 2022 (i.e., one week from now). Your pitch can be extremely short—even a single paragraph—although I might ask you follow-up questions. After January 27, I’ll take one of two actions in response: I’ll either
(a) commit a specified portion of my $200,000 to your charity, if the charity formally applies to SFF, and if the charity isn’t excluded for some unexpected reason (5 sexual harassment lawsuits against its founders or whatever), and if one of my fellow “Speculators” doesn’t fund your charity before I do … or else I’ll
(b) not commit, in which case your charity can still apply for funding from SFF! One of the other Speculators might fund it, or it might be funded by the “ordinary” SFF process.
Step 2, which cannot be skipped, is then to have your charity submit a formal application to SFF. The application form isn’t too bad. But if the charity isn’t your own, it would help enormously if you at least knew someone at the charity, so you could encourage them to apply to SFF. Again, Step 2 can be taken regardless of the outcome of Step 1.
The one big rule is that anything you suggest has to be a registered, tax-exempt charity in either the US or the UK. I won’t be distributing funds myself, but only advising SFF how to do so, and this is SFF’s rule, not mine. So alas, no political advocacy groups and no individuals. Donating to groups outside the US and UK is apparently possible but difficult.
While I’m not putting any restrictions on the scope, let me list a few examples of areas of interest to me.
- Advanced math and science education at the precollege level: gifted programs, summer camps, online resources, or anything, really, that aims to ensure that the next Ramanujan or von Neumann isn’t lost to the world.
- Conservation of endangered species.
- Undervalued approaches to dealing with the climate catastrophe (including new approaches to nuclear energy, geoengineering, and carbon capture and storage … or even, e.g., studies of the effects of rising CO2 on cognition and how to mitigate them).
- Undervalued approaches to preventing or mitigating future pandemics—basically, anything dirt-cheap that we wish had been done before covid.
- Almost anything that Scott Alexander might have funded if he’d had more money.
- Anything that would enrage the SneerClubbers or those who attack me on Twitter, by doing stuff that even they would have to acknowledge makes the world better, but that does so via people, organizations, and means that they despise.
Two examples of areas that I don’t plan to focus on are:
- AI-risk and other “strongly rationalist-flavored” organizations (these are already well covered by others at SFF, so I don’t expect to have a comparative advantage), and
- quantum computing research (this is already funded by a zillion government agencies, companies, and venture capitalists).
Anyway, thanks so much to Jaan and to SFF for giving me this incredible opportunity, and I look forward to seeing what y’all come up with!