Grant - not funded ($0 raised)

Project summary

This is a meta-level AI Safety project with potential for outsized impact.

The goal is to secure new sources of funding for AI Safety nonprofits through connections with non-EA grant programs and High Net Worth Individuals (HNWIs).

AI Safety work is funding-constrained. Many organisations rely on a small cluster of income sources, such as the Long Term Future Fund and Open Philanthropy. We need to diversify our funding sources to increase resilience and reduce bottlenecks.

What are this project's goals? How will you achieve them?


Phase 1: 100% complete

  • Research non-EA funds and map opportunity space

  • Connect with grant experts beyond EA

Phase 2: 70% complete

  • Create a high-quality, searchable database of opportunities, including dates, timelines, amounts, and key contacts

Phase 3: in progress

  • Individual consultancy with founders

  • Providing a list of opportunities which fit their needs and time frames

  • Support to secure funds including grant writing, marketing, outreach, accounting, legal

Phase 4: Expanding the pool of High Net Worth Individuals funding AI Safety

  • Generating leads through research, networking

  • Working with current funders & experts

  • Strategic relationship building to secure new top donors

How will this funding be used?


$5k: Searchable funding database of nonprofit grants and government opportunities, plus free grantwriting for three AI Safety organisations at risk of closure.

$7.5k: Pays for access to HNWI donor lists, so that I can include prospective high-net-worth donors in the main funding database. I'll also send out monthly updates on key deadlines for the next year.

$10k: Grant templates to reduce application times, free grantwriting for five AI Safety organisations, and a searchable webapp so you can easily find the right grant (a sketch of the underlying data model follows this list).

$50k: Sets up an HNWI outreach program in London and the Bay Area, including an international donor funding circle to bring high-value AI Safety donors together.

$120k: HNWI outreach program runs for 12 months, giving the highest chance of success.
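For illustration only, here is a minimal sketch of what one record in the searchable database mentioned above might look like. The field names and the query helper are my assumptions about a plausible schema, not a finalised design:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class FundingOpportunity:
    """One opportunity in the searchable funding database (hypothetical fields)."""
    funder: str             # e.g. a research council, foundation, or government program
    programme: str          # name of the specific grant programme
    amount_min_usd: int     # typical lower bound of awards
    amount_max_usd: int     # typical upper bound of awards
    deadline: date          # next application deadline
    decision_timeline: str  # e.g. "decisions 8-12 weeks after the deadline"
    key_contact: str        # programme officer or equivalent
    notes: str              # fit with AI Safety, eligibility quirks, etc.

def closing_soon(opportunities: list[FundingOpportunity],
                 today: date, days: int = 90) -> list[FundingOpportunity]:
    """Return opportunities with deadlines within `days` days, largest awards first."""
    upcoming = [o for o in opportunities if 0 <= (o.deadline - today).days <= days]
    return sorted(upcoming, key=lambda o: o.amount_max_usd, reverse=True)
```

A webapp front end would then simply filter and sort records like these by deadline, amount, and fit, and the monthly deadline updates could be generated from the same data.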

Who is on your team? What's your track record on similar projects?

Angie Normandale - Oxford's all-time top alumni fundraiser; 10 years' experience; lawyer, project manager, and founder with a six-figure seed round; part-time CompSci MSc; previously did this work at PIBBSS

Advisors:

Chris Akin - COO, Apollo Research

Professor Mike X Cohen - seasoned principal investigator with experience managing large academic grants

Will Portnof - Bay Area philanthropic consultant


Organisations that have expressed interest:

Apart Labs

Pause AI

Far AI

LISA

Athena

PIBBSS

Apollo

MATS

Epoch

Impact Academy

BlueDot

& others.

What are the most likely causes and outcomes if this project fails?

Research suggests that US-based nonprofits spend between 10% and 30% of their annual budgets on fundraising.

The likely alternative is paying external consultants to fundraise.

In August 2024 I met with grant consultants from the US, UK, and Australia to look at funding for PIBBSS.

Consultants charged up to $6k per organization per month, or up to 15% of the fundraise in commission. (For a hypothetical organisation raising $500k a year, that works out to roughly $72k in retainers or $75k in commission.)

Despite their fundraising expertise, they struggled to grasp our niche and value-add. The AI Safety space appears unusual compared to other research fields and requires an inside view.

Alternatively, someone else in EA might step forward to do this work. Nobody has volunteered so far, but there's certainly a market for it!

What other funding are you or your project getting?


None applied for. This Manifund grant should seed the setup costs for a self-sustaining project. The lifespan of the work depends on the success rate and the wider field.

Please contact Angie for further questions about the project: g.normandale@gmail.com


Neel Nanda

4 months ago

Interesting project! To go well, it seems like the main project person needs to be good at a few things:

  1. Having a good network in the grants/funding space, across a fairly diverse range of funders (given your reply to Ryan)

    1. Plausibly, being the kind of person who could build a good network fast and has some existing connections, would suffice?

    2. Or maybe a bunch of these funders don't care as much about personal connections, and have open applications, and you can just collect info about those?

  2. Being good at translation: understanding the context of funders in a range of fields, their language/culture/what they look for, and understanding the AI Safety orgs and being able to sell them effectively

Do you have much evidence that shows you're good at those two things? (Also feel free to push back against this model, or point out other key skills I am missing!)


Neel Nanda

4 months ago

@NeelNanda Oh, I'd also love to hear more about the story behind "Oxford all time top alumni fundraiser", what does that actually mean, and how?


Angie Normandale

4 months ago


@NeelNanda Hi Neel, thanks for commenting!

Re Oxford - it was my first job, cold-calling alumni to raise money for the university. Cold calling isn't for everyone, but I loved it. According to the Development Officer, I was the most successful fundraiser they had ever had, in her considerable experience. She didn't tell me how much money I raised, but I was invited back to work several times and she encouraged me to pursue a career in fundraising. I'm sure she would give me a reference. I interned at Giving What We Can in grant writing, but decided to go into law, as the work was too slow-paced for me.

I am still in touch with our alumni team, and regularly attend events with the entrepreneurs network, Oxford Bay Area Angels, etc. One of the first things I'd do to get more HNWIs for AI Safety is give the current development officer a call.

Where this experience diverges is the timeframe. According to the research, HNWIs tend to need more time and personal engagement, as they want to make specific decisions with their money. They also prefer advice from each other. This is where donor circles come in - Manifund and Founders Pledge being examples. One strategy would be to run an international donor circle specifically focused on AI Safety, by invite only.


Angie Normandale

4 months ago

Team strengths
I'm leveraging expert advisors with decades of experience in their specific areas:
- AI Safety research: Chris from Apollo
- Understanding the funding environment: Will Portnof
- Academic grant expertise: Mike X Cohen

My relevant experience:

Prior to working in AI, I opened and headed the first regional office of a marketing company. I hired, trained, and managed a team of salespeople and junior managers, created and executed strategy roadmaps, and had the new office revenue-generating within six months.

My big achievement was forming 160 nonprofit partnerships, which had a 4x impact on sales. The approach was totally new to our firm, so I had to work hard to pitch it to the CEO. It meant reframing our work in nonprofit language (service users vs customers, impact vs revenue) and also changing how we approached our clients.

I then spent a few years running a start-up and working in VC. I have strong global networks, experience pitching, and an understanding of what VCs look for. I'm currently using this to connect Athena with women-in-tech networks.

I also spent five years as a public lawyer. This involved a lot of translation: explaining complex legislation to my clients and understanding the needs of different stakeholders in order to negotiate a solution. I developed advocacy skills from speaking in parliament and at the Royal Courts of Justice.


Angie Normandale

4 months ago

Understanding the two funding sources
It might help to get some surface area on the funding sources I'm looking at:

Grants outside EA
From my experience at PIBBSS, I found a lot of non-EA grantmakers interested in AI Safety. They have open applications and are less swayed by networking. My strategy is to do the legwork and adapt our applications to their funding criteria.

Some key differences outside EA:

1. Neglectedness is bad.
Outside EA, neglectedness can signal a lack of interest from the broader field, and thus lower-impact work.

2. Admin is essential.
An accounting record is just as important as a publishing record. Grantmakers want to see a very specific breakdown of where the money will be spent, and they want proof.

3. Ethics requires sacrifice.
Ethical work means diversity and inclusion and engaging with the general public. These funders trade some certainty about value for money in exchange for a culture of community and sacrifice.

My plan is to showcase our strengths and develop where needed. I work with fiscal sponsors on accounting, and emphasise fellowships as public engagement.

High Net Worth Individuals
This is a higher-risk, higher-reward strategy.

It will take longer to build these targeted connections - months rather than weeks for a grant application. It also involves more uncertain bets, e.g. the set-up costs for a donor circle.

If the community wanted to fund this work, I would set up an accountability structure e.g. fiscal sponsorship.


Ryan Kidd

4 months ago

  1. How will you add value to donor outreach compared to organizations with a similar mission, such as Effective Giving, Founders Pledge, Future of Life Institute, Giving What We Can, Longview Philanthropy, etc.?

  2. If you could, in what ways would you improve distributed funding projects like Manifund AI Safety Regranting, ACX Grants, GiveWiki, etc.?


Angie Normandale

4 months ago

@RyanKidd Hi Ryan, thanks for your questions! 

Concern about diversifying funding sources has recently been raised by Open Phil, AIM, the Meta Coordination Forum, and Rethink Priorities, so there's clearly a general need for this. The aim is to add value by taking a different approach to outreach:

  1. Representing organizations rather than donors

Rather than analyzing nonprofits, this project puts the AI Safety orgs’ needs first and looks for the donors best placed to meet them.

  2. Meeting funders with a wider variety of interests

EA donor outreach tends to follow an ideas-first approach, persuading donors to subscribe to a particular set of premises and conclusions about doing the most good. It takes a lot of work to get people on board, but you end up with a small number of extremely committed donors. 

This is the opposite approach. We look for opportunities outside the EA/longtermist/rationalist sphere and find the points of convergence with AI Safety. Meeting donors where they're at in terms of interests is much easier than persuading them to subscribe to an entirely new worldview. This also gives us access to large government grants and research funds, which tend to be more reliable than individual donors.

  3. Explicitly focusing on infrastructure and communication

From initial scoping, many grants outside of EA are far more general about the cause area, and instead make judgements based on the credibility of the applicant: their financial history, auditability, experience, reputation in the field, and ability to plan and manage large grants. Organizations will need to adapt their strategies to communicate effectively with these different audiences and interests, so that's where I can hopefully add a lot of value.

  4. Diversifying research opportunities

Just as ACX and Manifund’s AI safety regranters have their own research interests, diversifying funders should foster work with a wider variety of research tastes. 

From preliminary discussions, here’s some examples of what this work could look like:

  • Seeking women-in-tech funding to support Athena 2.0

  • Doing the legwork to help researchers get stipends for short-term projects, such as the MATS extension, Arena, Apart, etc., supporting applications to many small funds and academic institutions.

  • Supporting a team to move to a mixed commercial model, thus massively increasing their funding pot for research (Lakera’s model) 

  • Applications to academic grants which are outside the technical expertise of Open Phil, e.g. bioinformatics research which could inform AI Safety, or research about vulnerable users of technology, e.g. children and the elderly

  • Seeking out art, theater, journalism, and education grants for public awareness campaigns around AI Safety, for Pause AI and others

  • Helping LISA to build towards recognised public research institution status in the UK, providing access to millions in UK government grants for scientific research

  • Looking at government funding to build out international partnerships e.g. US-India AI Safety and governance programs 

Regarding how I'd improve Manifund, ACX regranting, GiveWiki, etc., it would be great to widen the pool of both donors and interests. With the caveat that I'm a lawyer and love bureaucracy, I do think the lack of bureaucracy perpetuates uneven outcomes. There needs to be trust between donors and recipients. Without a formal application structure, we rely on building trust through other fora. This becomes self-reinforcing and makes it difficult for international recipients and those outside the inner circle. So I'd like to see more space for international organizations to connect with potential funders.

I'm also very keen to see more women in the alignment space; there's been great work on this front recently, and I'm looking forward to supporting initiatives like Athena in the future.

I hope this answers your questions, feel free to ask more or reach out!