I'm leading the forecasting group at the University of Washington, and I can't express enough how helpful OPTIC (and in particular Saul) has been in setting everything up! I've been working with Saul for the past few months on creating the forecasting curriculum and planning out every aspect of the UW club. So far, I think we've seen incredible success for the beginning of a long-term project; we've signed up multiple universities in the US and even one in Argentina!
Like Misha, I would use any funds granted purely to run the club: registration fees, food, prizes, etc. Misha's quote ("I've observed food and prizes to be significant factors for college students deciding what events to attend :)") could not be more accurate; we're all broke!
In addition, though, one of the leading drivers of getting people in the room (and thus achieving Goal 1) is attaching corporate sponsors to the club. Nowadays, especially at UW, college is primarily about securing a solid job ahead of all of your classmates. Corporate sponsorship, or, at a minimum, demonstrated interest, is the largest pull factor we as a club can provide. If funding via regranting is not on the table, we'd also love to connect with any companies/organizations who would be interested in hiring applicants talented at forecasting. If you know of such an organization, reach out to me (jasuch@uw.edu) or Saul!
University Forecasting Clubs [OPTIC]
You can alternatively view this document on Notion here.
Summary
Funding to start & support 3-5 university forecasting clubs for one semester. Mainline request is $8.4k (~$1.6k per club + $500 for instructional content), used for club activities (events, food, books, posters, etc.) and instructional content (a syllabus and workshop styled after an EA introductory fellowship).
This is a subproject of OPTIC, which runs intercollegiate forecasting competitions. OPTIC is providing the lead organizers, materials, support, mentorship, etc for this project.
I’d be happy to chat more about this over a call with any regrantors who’re interested. Book a 25 or 50 minute call here: savvycal.com/saulmunn/optic :)
What are this project's goals and how will you achieve them?
High-Level Goals
OPTIC aims to make forecasting a commonplace collegiate academic activity, like debate clubs, math teams, or hackathons. Our theory of change for university-level forecasting:
Create a larger, more diverse pool of emerging forecasters and superforecasters, which increases the quality of individual & aggregated forecasts.
Normalize the field of forecasting to the general public, strengthening public support, knowledge, and awareness of the practice of rigorous forecasting.
Improve future institutional decision-making to reduce the scale and likelihood of future catastrophes.
Our goal is to add an intellectually energetic field of college students to the existing forecasting community through participation in engaging and enjoyable club activities. Workshops & sessions will refine their forecasting skills & improve their ability to contribute to the forecasting community and to the future decision-making of key institutions. Currently, students say they compete for their school’s debate team or MUN team: we want them to say they compete for their school’s forecasting team.
So far, OPTIC has been focused on organizing intercollegiate forecasting competitions, but we began to realize how effective university forecasting clubs would be after running our pilot competition. Many competitors indicated some level of interest in starting a club, and we expect a more decentralized version of forecasting outreach at universities — i.e. clubs — could build the collegiate forecasting community in parallel with intercollegiate competitions.
Linch’s post also includes a lot of detail about promising ways that forecasting (generally) can improve the long-term future; we contribute to some (but not all) of them.
Detailed Goals
All goals are for EOY 2023 and are listed in order of importance.
Ideal (>75th percentile outcomes; above expectations)
Have created ≥6 effective and promising forecasting clubs
effective:
generally, we want the goal of forecasting clubs to be “promote and practice forecasting and prediction markets among undergrads.” Effectiveness is evaluated based on achievement of that (nebulous) goal; more specifically, we’d want clubs to have gathered an average of >10 students/club to improve their engagement and interest in the forecasting community. Some examples:
took students from “what is forecasting” to “I’d attend an OPTIC competition and feel comfortable competing”
took students from “I read Superforecasting and every once in a while go on Manifold/metaculus/polymarket/etc” to “I meet with my forecasting club weekly and we spend an hour or two chatting about forecasting and going through Metaculus/Manifold/Polymarket/etc”
promising:
80% likelihood that they’ll have grown in effectiveness from Jan 1, 2024 to Jan 1, 2025
note that “have grown in effectiveness” includes “has continued to exist”
Have a substantially better understanding of how to create & encourage effective and promising forecasting clubs in a way that’s highly actionable for future clubs
“substantially” is fairly subjective; I'm not sure how to make this goal more specific. I think part of the point is that we’re not entirely sure what data we’ll get and how it’ll be actionable; part of the goal is to understand the goal better.
some examples of uncertainties that we would want answers to:
which club activities lead to more engagement and better forecasting performance
what kinds of support for forecasting clubs are most cost-effective
what are the biggest pain points between someone being interested in starting a club and then actually starting a club
the goal is that future years will have many more forecasting clubs being formed — we want this year’s cohort to enable future clubs to be more effective and more promising.
ex: have collected feedback forms from all club presidents and their students about instructional materials, promotional content, vibes, etc.
Have contributed strongly to the field of forecasting
a more nebulous goal — examples might include students publishing research papers on forecasting, being hired by forecasting organizations, or even simply many individuals contributing local knowledge to forecasting aggregation platforms to improve public judgement
Median (25th to 75th percentile outcomes; expected)
Each of the following are defined similarly to the section above.
Have created 2-5 effective and promising forecasting clubs
Have a better understanding of how to create & encourage effective and promising forecasting clubs in a way that’s actionable for future clubs
the difference from the “Ideal” outcome here is subjective
Have contributed to the field of forecasting
Failure (<25th percentile outcome; below expectations)
Generally, if any of the three bullet points in the Median section above don’t happen, we’d consider that a failure and beneath a 25th percentile outcome.
See the “premortem” section for more details.
How will this funding be used?
The Guesstimate model is here! A breakdown of the model is below (an illustrative roll-up of the tier arithmetic follows the “Minimum” tier), but for specific details and numbers, look to the model.
Mainline: $8,400
Includes funding for:
5 clubs at ~$1.6k per club ($7.9k total). Main club expenses include:
events (about 1/3);
outreach (about 1/2);
software (about 1/6), and
a 15% general buffer
$500 to Jack Such for building instructional content
Ideal: $26,200 [SEE NOTE]
NOTE: for us to actually end up using $26,200, things will have to have gone fabulously well (e.g. clubs got 30-50 members first semester, ran events with 100-200 people, all of those who’ve mentioned “low interest” in starting a club actually end up starting one, etc). This is not likely. If regrantors decide to fund us at this level, the most likely scenario is that we would use somewhere around the mainline amount and give the rest back; we would only use the rest in the unlikely event that club activities go fabulously well and there is sufficient need for it.
Includes funding for 7 clubs at ~$3.6k per club ($25.2k total). Main club expenses include:
events (about 3/8);
outreach (about 1/2);
software (about 1/8), and
a 20% general buffer
$1,000 to Jack Such for building instructional content
Minimum: $1,700
Includes funding for:
3 clubs at ~$600 per club ($1.7k total). Main club expenses include:
events (about 2/3);
outreach (about 1/3), and
a 10% general buffer
$0 to Jack Such for building instructional content.*
*Jack mentioned some willingness to complete the instructional content for free, but I would much prefer to give fair financial compensation.
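To make the tier arithmetic above easier to check, here's a minimal back-of-the-envelope sketch in Python. The per-club base figures and the way the buffer is applied are rounded assumptions reverse-engineered from the totals quoted above, not the Guesstimate model itself; refer to the model for the actual numbers.

```python
# Rough roll-up of the three funding tiers described above.
# Per-club base budgets are assumptions chosen to roughly reproduce the
# quoted totals; the authoritative numbers live in the Guesstimate model.

def tier_total(n_clubs: int, base_per_club: float, buffer: float, instructional: float) -> float:
    """Total request = clubs * (base budget + general buffer) + instructional content."""
    per_club = base_per_club * (1 + buffer)
    return n_clubs * per_club + instructional

tiers = {
    "Mainline": tier_total(n_clubs=5, base_per_club=1_375, buffer=0.15, instructional=500),   # ~$8.4k
    "Ideal":    tier_total(n_clubs=7, base_per_club=3_000, buffer=0.20, instructional=1_000), # ~$26.2k
    "Minimum":  tier_total(n_clubs=3, base_per_club=520,   buffer=0.10, instructional=0),     # ~$1.7k
}

for name, total in tiers.items():
    print(f"{name}: ~${total:,.0f}")
```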
Additional Notes on Funding Usage
I chatted with ~80% of those who’re likely to be starting forecasting clubs at their universities. When asked, the three pain points they brought up most often were:
funding
content (including instructional content and outreach/promotional content)
and, to a lesser extent, organizational & leadership experience (read: insufficient support/mentorship in that area)
To resolve (1), we’re submitting this funding proposal and distributing funds to clubs. We’re also supporting them financially in other ways, like finding them sponsors — see the section on “Other sources of funding” at the end.
To resolve (2):
we asked Jack Such to help us build instructional content, including a syllabus and a workshop. It’s roughly along the lines of the EA intro fellowship, but with less content overall, substantially less reading, and substantially more interactive activities — a lot of forecasting can be grasped most effectively by “just playing around with Quantified Intuitions/Manifold/Guesstimate/etc.”
OPTIC (primarily Jingyi Wang) is preparing some basic outreach/promotional materials (posters, templated emails/slack messages, graphics, etc.). We’re not requesting funding for her wages for this, since Jingyi’s being paid for her time working at OPTIC generally, and most of the content simply needs to be repackaged, not made from scratch.
To resolve (3), we’re doing a combination of personally mentoring club founders, matching them with experienced cofounders, and finding them relevant mentors in the university EA space.
Who is on your team and what's your track record on similar projects?
Who is on your team?
Saul Munn (me)
Tom Shlomi and Jingyi Wang (co-organizers of OPTIC) are also helping out, but I’m likely to spend much more time on forecasting clubs than them, because (a) they’re focusing their effort more on the competitions side of OPTIC than on the clubs side, and (b) I’m probably going to spend more time on OPTIC overall than they are.
Jack Such is building instructional content for forecasting clubs to use
Juan Gil is mentoring me
In a different sense, though, most of the “work” that will lead to a better world will be done by forecasting club presidents, not by me — I’m not actually starting any forecasting clubs myself.
What’s your track record on similar projects?
My track record:
Founded OPTIC
Organized the pilot competition for OPTIC (original funding application, postmortem)
Currently organizing OPTIC competitions this fall
Currently organizing Manifest for Manifold
Organized for Brandeis EA for one semester (taking a semester off to focus on OPTIC & Manifest)
Coordinated the BEA Reading Group and helped to facilitate the BEA Introductory Fellowship.
Feel free to reach out to Joseph Pendleton (BEA president) as a reference
Co-organized a 250-person festival for Purim (major Jewish holiday), as well as numerous smaller (15-50+ person) events at my synagogue
Co-president of my high school Jewish Student Union, co-organized monthly 50-200+ person events for two years
Helped out at a few high school debate tournaments of similar size and scope.
Tom Shlomi and Jingyi Wang’s track records are less relevant, but feel free to check out this section for more (and slightly out-of-date) details.
Jack Such is a rising senior studying Business Development at University of Washington in Seattle. He has experience with club-running & organizing from his time as part of the exec team at Model UN in high school, and experience with event-running from his time as a contractor at BTC, Inc. to help run B22, their 50,000-attendee conference.
What are the most likely causes and outcomes if this project fails? (premortem)
club presidents lose interest/motivation
note that this is not a systemic failure — this can affect each club independently. (obviously, there are factors that would affect this systemically, like if I were consistently rude to them; but independent factors seem more likely to cause club presidents to lose interest/motivation, like not having a co-founder, or disengaging with the forecasting community, or being too stressed out in their daily lives, etc.)
outcome:
forecasting clubs are likely to be substantially less effective and/or less promising
we’re significantly less likely to get high quality information on how to start effective and promising clubs in the future
I (Saul) deprioritize forecasting clubs relative to forecasting competitions, Manifest, and/or other (future?) projects
outcome:
I would return any unused funds.
if you trust my ability to prioritize between impactful projects, this is good. I should deprioritize projects that are less impactful, prioritize projects that are more impactful, and return whatever resources I can from the deprioritized projects.
if I deprioritize clubs, here are some other outcomes:
forecasting clubs are more likely to be less centralized/more varied; less like the current “every EA club is the same” and more like the current “every debate club is different”
forecasting clubs are likely to be less effective and/or less promising
we’re significantly less likely to get high quality information on how to start effective and promising clubs in the future
club presidents have insufficient support
support could include mentorship, instructional content, outreach material, etc
outcome:
clubs would run lower-quality sessions, workshops, etc
some clubs might not survive
something in the theory of impact is wrong
if one of the following is incorrect or ends up failing, we won’t have ended up creating positive impact (failure):
forecasting clubs improve the university-level forecasting space
the university-level forecasting space improves the overall forecasting space
the overall forecasting space improves the world
more insidiously, we could be wrong about some aspect of our theory of impact without our or others’ knowing. If this is the case, we might end up seeming to “succeed,” without having, in expectation, effectively improved the world. This could be especially bad if we request more funding in the future for university forecasting clubs, drawing even more resources from more impactful causes.
What other funding are you or your project getting?
TLDR: Although we do have other funding options, they’re either heavily uncertain or non-ideal in various ways. If it turns out that we don’t need the money, we’ll return it.
Sponsorships
We’ve gotten interest from sponsors (quantitative finance firms), but we’re currently unsure of the answers to the following questions:
How much of a problem is value misalignment? I.e., would sponsors prevent clubs/their participants from doing good?
Would the money that sponsors provide make marginal additional dollars (e.g. from Manifund) significantly less valuable?
We do not expect value misalignment to be a significant problem, but we’ll have more information in the coming weeks as we talk in more depth with potential sponsors. It’s important to note that sponsorship would also provide significant positive signaling — “join my club” means a lot less as a pitch than “join the club that [big quantitative finance firm] sponsors.” Being sponsored provides significant legitimacy to a university club in the eyes of most university students.
We do expect that, to some extent, funding from sponsors would make marginal additional dollars from Manifund less valuable. Assuming we get funding from both sponsorships and Manifund, we would give the equivalent amount back to Manifund: [funding received from sponsors] = [amount of funding we’d return to Manifund].
Universities
Traditionally, clubs get funding from their universities; this is the case for a lot of clubs at a lot of universities. However, actually getting and using this funding is often a nightmare of bureaucracy, including: ridiculously long waiting periods, low (if any) mentorship/support, intensely earmarked funds, and significantly less funding than is actually effective. This is especially true for new clubs, who might not have established relationships or local knowledge of the red tape that surrounds club funding. This is why many university EA groups get funding from outside their university.
However, we’re still encouraging club presidents to apply for funding for their club; at best, they’ll receive money that could supplement (or replace) funding from Manifund, and at worst, they’ll have wasted a few hours of organizer effort without any funding to show.
OPTIC Competition Funding
This is a subproject of OPTIC, an organization I started with Tom & Jingyi. OPTIC supports intercollegiate forecasting, which has mainly happened through organizing intercollegiate competitions. The budgeting for that is currently separate, but if need be, we might be able to pull some money from the competitions toward clubs. I’m heavily against this, though, because (a) funding for the competitions was meant to fund the competitions, not forecasting clubs, and I’m generally against mixing funds; (b) they access the chain of impact for forecasting somewhat differently, and I could imagine a grantmaker being interested in funding one but not the other; and (c) pulling money that was meant for competitions would make the competitions worse.
In terms of (b), this is especially true because collegiate forecasting clubs have a theory of change substantially different from collegiate forecasting competitions. Obviously, both chains to impact rely on “[↑ forecasting] → [↑ moral good],” but each project (competitions & clubs) has a different way of accessing, and a different mechanism for, [↑ forecasting].
Jack Such
about 1 year ago
Joel Becker
about 1 year ago
I'm feeling a little skeptical of your theory of change, especially:
Create a larger, more diverse pool of emerging forecasters and superforecasters, which increases the quality of individual & aggregated forecasts.
Two reasons for this:
It doesn't seem like having a larger pool of forecasters is an important bottleneck for use-cases I am aware of. "Regulatory approval" and "acceptance inside prestige institutions" feel like better candidates.
I would guess that university forecasting clubs are a less beneficial means of creating top forecasters than "jobs listing forecasting skill as desired qualification," "excellent public examples of forecasting to emulate" (e.g. Misha's AI bio report). Not sure about cost-effectiveness, though.
I've spent extremely little time reflecting on this, so apologies if the above is confused or otherwise sloppy. Interested in your thoughts!
Joel Becker
about 1 year ago
Oh, and I really appreciate you laying out possible outcomes segmented by percentiles. (I might steal this for my own applications as a future grantee!) I would've slightly preferred you to talk about >95th rather than >75th percentile, but no big deal.
Saul Munn
about 1 year ago
@joel_bkr Hey Joel! Thanks so much for your comments — really appreciate your thoughts. I've answered each of your two reasons for your skepticism below.
TLDR: regulatory approval is overrated as a bottleneck for (most) forecasting use-cases, and creating a large pool of forecasters is largely instrumental to acceptance inside influential decision-making institutions. On potentially more effective ways of creating top forecasters (e.g. job postings, public examples), I think it's important to note that universities offer a fairly unique opportunity to immerse students in a shared experience, culture, etc. Meeting for an hour a week with friends is a totally different level of commitment than learning forecasting because it was listed on a job posting. (Also, Misha himself has chatted with us and is very bullish on university-level forecasting!)
The above TLDR probably covers about 60-70% of the content below. Happy to talk more about these and/or answer any other concerns you might have, either in writing or in a chat! :)
(1) LARGER POOL OF (SUPER)FORECASTERS NOT A BOTTLENECK
It doesn't seem like having a larger pool of forecasters is an important bottleneck for use-cases I am aware of. "Regulatory approval" and "acceptance inside prestige institutions" feel like better candidates.
I think this actually might not be true, or at least not how I'm understanding it.
Re: regulatory approval, this is only the limiting factor for real-money prediction markets. Although it's hugely important in that particular use-case, I think it's important to note that many use-cases of forecasting besides real-money prediction markets do not rely on regulatory approval. Metaculus (and even Manifold!) are great examples of platforms which provide incredible value, and Open Philanthropy's explicit use of forecasting in their grant-making is a great example of decision-makers using forecasting — neither of these required regulatory approval, and regulatory approval would not have improved their impact. If (e.g.) Metaculus had 10x the superforecasters, we would probably have substantially more forecasts on more topics, leading to more accurate forecasts on a wider variety of important areas.
IMHO, real-money prediction markets are unlikely to be the source of the majority of impact from the field of forecasting. Much more likely, it'd come from people identifying talented forecasters and using those individuals in key situations. This is a hotter take, but I do think it's a crux — real-money prediction markets would be great, but in terms of impact, I'd far prefer a widespread Metaculus to a widespread Kalshi.

Re: "acceptance inside prestige institutions," I'm not entirely sure what you mean.
If you mean "acceptance from influential decision-makers/key decision-making bodies, like politicians, big NGOs, etc," that makes sense, and I agree!
Note that:
current students = future key decision-makers
acceptance from influential decision-makers would be significantly easier if forecasting was a generally accepted way of doing things (see our 2nd goal under our theory of change)
in order for forecasting to be desirable to influential decision-makers, it needs to first work well — one of the best ways for forecasting to work better is for more people to be doing it. This is true both for classic "wisdom of the crowds" reasons (see the minimal simulation sketched below) and because if 10x as many people are doing forecasting, we'll likely discover 10x as many superforecasters, forecasts will be much more accurate, we'll have forecasts on a wider range of topics, etc.
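As a quick illustration of the "wisdom of the crowds" point, here's a minimal simulation sketch (not part of the original proposal, and deliberately simplified: it assumes independent, unbiased forecasters with a known true probability). It shows how the error of an averaged forecast shrinks as the pool of forecasters grows.

```python
# Minimal sketch: averaging many noisy, independent probability estimates
# tends to land closer to the truth than any single estimate.
# Assumptions (illustrative only): a hypothetical true probability and Gaussian noise.

import random
import statistics

random.seed(0)

TRUE_P = 0.7      # hypothetical true probability of some event
NOISE_SD = 0.15   # spread of individual forecasters' errors

def one_estimate() -> float:
    """A single forecaster's noisy estimate, clamped to (0, 1)."""
    return min(0.99, max(0.01, random.gauss(TRUE_P, NOISE_SD)))

def mean_abs_error(pool_size: int, trials: int = 2000) -> float:
    """Average |pooled forecast - truth| over many simulated questions."""
    errors = []
    for _ in range(trials):
        pooled = statistics.mean(one_estimate() for _ in range(pool_size))
        errors.append(abs(pooled - TRUE_P))
    return statistics.mean(errors)

for n in (1, 10, 100):
    print(f"{n:>3} forecasters: mean abs. error ≈ {mean_abs_error(n):.3f}")
```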
If, however, you literally mean "acceptance inside prestige institutions," I don't quite agree — I don't think that's a bottleneck to impactful forecasting. Regardless, I do still think college forecasting clubs solve for this — universities are some of the most prestigious institutions in the US, and high-quality clubs (with associated professors, speakers, etc.) at those institutions are themselves a form of acceptance.
Another few comments:
Michael Story (Swift Centre) wrote an essay on where forecasting is & isn't useful. Would recommend!
I'm in a bit of a unique position to do something here, compared to almost all of the rest of the forecasting community — this sort of thing pretty much only works when it's student-led.
University forecasting seems substantially cheaper to implement than regulatory approval or broad acceptance at prestigious institutions.
This proposal doesn't trade off with regulatory approval or acceptance from prestigious institutions, except in competing for funding. To my knowledge, although there are efforts that could improve the regulatory regime of prediction markets or the institutional acceptance of forecasting, they require social & legal capital, not money (or at least, not money on the order of $8.4k).
(2) OTHER WAYS OF IDENTIFYING TOP FORECASTERS
I would guess that university forecasting clubs are a less beneficial means of creating top forecasters than "jobs listing forecasting skill as desired qualification," "excellent public examples of forecasting to emulate" (e.g. Misha's AI bio report). Not sure about cost-effectiveness, though.
There are a lot of potentially effective approaches, and I think a lot of them should be tried. The space of possible strategies is huge, and we ought to start trying low-cost stuff and seeing what works and what doesn't.

University clubs are in a pretty unique position: they have the opportunity to significantly influence someone's life. Students often structure their friend groups around, and spend a lot of time in, university clubs — this isn't the case with the other means you mentioned.
This is one of the main reasons that university EA groups have been so incredibly successful at building the EA community. I recently chatted with Jessica McCurdy (who runs & started UGAP), and her perspective was (paraphrasing) that university groups are a good fit if you want students to learn & explore things collaboratively, to make significant changes to their lives, and to group together in a way that allows them to signal-boost a particular idea. All of these apply to forecasting, in a similar way to EA. She was "very excited" about the idea of university forecasting clubs!
Again, I'm in a fairly unique position — my comparative advantage is that I can start clubs, while others can try other strategies (that they might have a comparative advantage in).
E.g. Rethink Priorities might be able to make job listings with forecasting skills as a desired qualification, but I probably can't; on the other hand, I can start university forecasting clubs, but they probably can't.
Forecasting clubs are very measurable, compared to some of the strategies you mentioned. 3 forecasting clubs with about 10 active members each (a roughly median outcome) would mean about 30 new forecasters per year, and probably about 50-100 people who've "heard of it" (friends of friends, those who dropped after half a semester, etc.). How many people have gotten into forecasting through job listings, or public examples of forecasting? It seems pretty hard to say.
Also, Misha Yagudin himself has chatted with us and is very bullish on university-level forecasting!
Just to reiterate from above: thank you for commenting & for your thoughts! I'm happy to talk more about these and/or answer any other concerns you might have, either in writing or in a chat! :)
Misha G
about 1 year ago
I am currently co-leading MIT's Forecasting Group, and I have been working with Saul the last few weeks to help prepare for the semester. I have been running Effective Altruism MIT since Spring '22, and after organizing occasional forecasting workshops with guest speakers and hosting OPTIC's first in-person tournament, we decided to officially found our forecasting student group.
I have found working with Saul/OPTIC very helpful in getting the club started, thinking through our theory of change, and planning our introductory and subsequent meetings. We would mainly use group funding on prizes for local forecasting competitions and food at events. I've observed food and prizes to be significant factors for college students deciding what events to attend :)
I am happy to meet to discuss our group's plans: https://bit.ly/misha-chat
Saul Munn
over 1 year ago
I’d be happy to chat more about this over a call with any regrantors who’re interested. Book a 25 or 50 minute call here: savvycal.com/saulmunn/optic :)