AI Safety Regranting

We partner with regrantors: experts in the field of AI safety, each given an independent budget. Regrantors recommend grants based on their personal expertise; Manifund reviews these recommendations and distributes the funds.

Why regranting?

  • Hidden opportunities: Regrantors can tap into their personal networks, giving to places that donors and grantmaking organizations might miss. Rather than wait for an application, regrantors can reach out to grantees to initiate new projects.
  • Fast: A single regrantor, rather than a committee, is responsible for each budget. The regranting model requires less overhead than traditional grantmaking, so we can make grants in days, not months.
  • Flexible: Regrantors can give to projects that are not yet registered as charities, or to individuals; Manifund acts as the fiscal sponsor, complying with 501c3 requirements and allowing tax benefits for donors.
  • Trust-based: Fields like AI safety are new, speculative, and opaque, making it harder for donors to know where to direct their money. Regranting lets donors outsource these decisions to individuals with deep expertise.

Our regranting program is inspired by the success of programs like the Future Fund's regrants, SFF's speculation grants, and Fast Grants.

Example regrants

Neel Nanda gave $11K:
I think that understanding, detecting and potentially mitigating chain of thought unfaithfulness is a very important problem, especially with the rise of o1 models... I think Arthur is fairly good at supervising projects, and that under him Jett and Ivan have a decent shot of making progress, and that enabling this to start a month earlier is clearly a good idea.

Leopold Aschenbrenner:
I think Epoch has done truly outstanding work on core trends in AI progress in the past few years. I'm also excited by their recent foray into benchmarking in the form of FrontierMath... Better benchmarks that help us forecast time to AGI (and especially time to relevant capabilities, such as automated AI research) and do so in a highly credible and scientific way are very valuable for informing policymakers and catalyzing important policy efforts.

Adam Gleave gave $50K:
I've generally been impressed by how well Timaeus have executed. They've in short order assembled a strong team who are collaborating & working well together, producing substantial research and outreach outputs. They have a distinctive research vision, and I think deserve some credit for popularizing studying the evolution of networks throughout training from an interpretability perspective with e.g. EleutherAI's interpretability team now pursuing their own "development interpretability" flavored research.

How does regranting on Manifund work?

  1. A donor adds money to their Manifund account (which constitutes a tax-deductible donation to our 501c3 nonprofit).
  2. The donor can then allocate the money between regrantors of their choice (or they can give directly to projects).
  3. Regrantors choose which opportunities to spend their budgets on, whether projects posted on Manifund through our open call or projects they learn about elsewhere, and write up an explanation for each grant they make.
  4. We review the grant to make sure it is legitimate, legal, and aligned with our mission.
  5. If we approve the grant, the money is transferred to the grantee's Manifund account, at which point they can request a withdrawal and we send them their funds.

FAQ

Who can see the information about grants?

Currently all grant information is made public. This includes the identity of the regrantor and grant recipient, the project description, the grant size, and the regrantor’s writeup.

We strongly believe in transparency, as it allows for meaningful public feedback, accountability for decisions, and the establishment of regrantor track records. We recognize that not all grants are suited for publishing; for now, we recommend such grants be made through other funders, such as the Long Term Future Fund, the Survival and Flourishing Fund, or Open Philanthropy.

What kinds of projects are eligible for regranting?

We have no official cause-area restrictions on grants, though most of our regrantors are focused on mitigating global catastrophic risk, specifically on AI safety.

We support regrants to registered charities and individuals. For-profit organizations may also be eligible, pending due diligence. As a US-registered 501c3, we do not permit donations to political campaigns.

We look over all grants before fulfilling withdrawal requests to make sure they meet these requirements. We reserve the right to veto grants for any reason, though we expect to often defer to our regrantors’ judgement.

Can regrantors send money to themselves?

In certain circumstances, we permit regrantors to donate to their own projects, though we hold such grants to a more rigorous bar before fulfilling withdrawal requests.

Can I contribute funds to the regrantor budgets?

Yes! We're looking for contributions to our AI Safety regrantor budgets. Get in touch with Austin (austin@manifund.org) if you're interested in contributing.