Thanks for your comment and question.
Our strategy to increase distribution of the newsletter:
- Improve the quality of the newsletter so that its organic growth rate increases
- Contact members of the safety teams at major AI labs and academic centers (supernodes) and ask them to share it on their internal communication channels
- Share it on the AI Alignment Slack (#events), and potentially other popular events-related channels
- Nudge maintainers of onboarding resources for new AI safety researchers to include it as a recommended resource
This is tentative and open to feedback. We've grown to ~1K subscribers so far through organic growth alone, but the resource would likely benefit at least 2-3x more people. We would pursue proactive growth in 2025.
@orpheus
Comments
Orpheus Lummis
about 1 month ago
I believe PIBBSS is a solid project – recruiting senior researchers into AI safety research, developing rigorous theoretical foundations backed by empirical work, and consistently producing valuable research outputs.
Orpheus Lummis
about 1 month ago
Liron's interviews are useful contributions to the public discourse and are on a good trajectory. They have the potential to engage broader audiences. Additionally, I've personally benefited from them.
Orpheus Lummis
about 1 month ago
Progress at the intersection of Live Machinery and governance processes / public administration could have significant potential upsides.
I've met Murray in person and have been impressed by their insight into law, governance, and history, their collaborativeness and care, and their grasp of the potential of leveraging AI technology in this space.
Orpheus Lummis
2 months ago
Redesigning the machinery (i.e., interfaces, infrastructure, methodology) underlying formal theories to take advantage of AI assistance seems worth exploring 👍
Transactions
| For | Date | Type | Amount (USD) |
| --- | --- | --- | --- |
| Live Governance | 4 days ago | project donation | 20 |
| PIBBSS - General Programs funding or specific funding | about 1 month ago | project donation | 20 |
| Doom Debates - Podcast & debate show to help AI x-risk discourse go mainstream | about 1 month ago | project donation | 20 |
| 10th edition of AI Safety Camp | about 1 month ago | project donation | 20 |
| Lightcone Infrastructure | about 1 month ago | project donation | 20 |
| Manifund Bank | about 1 month ago | deposit | +100 |
| [AI Safety Workshop @ EA Hotel] Autostructures | about 2 months ago | project donation | 150 |
| Fatebook and Quantified Intuitions | 4 months ago | project donation | 50 |
| CEEALAR | 4 months ago | project donation | 100 |
| AI Animals and Digital Minds 2025 | 4 months ago | project donation | 100 |
| MATS Program | 4 months ago | project donation | 50 |
| Lightcone Infrastructure | 4 months ago | project donation | 50 |
| Manifund Bank | 5 months ago | deposit | +500 |