
Help launch the Technical Alignment Research Accelerator (TARA)!

Active grant
$14,901 raised of $14,901 funding goal
Fully funded and not currently accepting donations.

Summary

This is a grant proposal for $14,901.25 USD in funding to help launch the inaugural Technical Alignment Research Accelerator (TARA). TARA will be an AI Safety Australia & New Zealand (AIS ANZ) led initiative. Our mission is to grow and support a large, ambitious, and influential local community focused on preventing the most harmful impacts of AI. We do this by providing community members with information, education, networks and advice. I (Yanni Kyriacos), Co-Founder / Director of AIS ANZ, will lead the project.

The aim is to run TARA every year from February to May, capturing interest generated by the annual AIS ANZ Careers Conference in November.

TARA will be a hands-on course based on the Alignment Research Engineer Accelerator (ARENA) curriculum. It will run for 13 weeks (from February to May 2025), and participants are expected to commit 10-15 hours per week to study, which includes meeting in person with a group in a major ANZ city every Saturday for the duration of the course. While we considered running TARA as an intensive in-person bootcamp like ARENA, the costs of flying and housing participants would increase the budget 5-10x. However, if this first iteration proves successful, future versions of TARA may adopt the traditional bootcamp structure with appropriate funding.

The funding will primarily cover the cost of hiring Teaching Assistants ($5,915.00), Google Colab compute credits ($2,437.50), food for Saturday in-person sessions ($3,168.75) and potential room hire ($3,380.00). While we’re hoping to secure free venues through university partnerships, we're budgeting for room hire as a contingency. Any unused venue funds will be returned.
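For reference, the four line items sum exactly to the requested total: $5,915.00 (Teaching Assistants) + $2,437.50 (compute) + $3,168.75 (food) + $3,380.00 (venue contingency) = $14,901.25.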

The main objective of TARA is to grow the ANZ AI safety community (and movement) by taking motivated and proficient technical talent and upskilling them in safety-relevant Machine Learning techniques. It is also important to acknowledge the broader context in which this will occur: it looks increasingly likely that Australia will launch an AI Safety Institute (AISI), and analogous AISIs overseas appear to preferentially hire technical talent. Staffing an Australian AISI with safety-conscious technical talent would help Australia leverage its strategic alliance with the United States to promote robust global AI safety governance.

I will have hands-on support from TARA’s technical organising committee: Ruben Castaing, Lily Stelling, Matt Fisher and Nelson Gardner-Challis. Chris Leong and Ryan Kidd are AIS ANZ advisors and will provide strategic support.

Project Goals (& KPIs)

  1. Develop Technical Talent

    1. Accept 25 participants in inaugural cohort

    2. Achieve 80%+ completion rate

    3. Maintain 80%+ weekly attendance rate

    4. Have >25% of final projects reach publication quality

    5. Achieve >50% transition rate into AI safety roles within 12 months

  2. Build Program Quality

    1. Achieve >4/5 participant satisfaction score

    2. Achieve >4/5 TA satisfaction score

  3. Demonstrate Institutional Credibility

    1. Generate 3+ documented case studies from successful projects

    2. Create repeatable curriculum model for future cohorts

Why run TARA when ARENA already exists?

ARENA is very selective and out of reach for most ANZ talent, as it requires relocating overseas and taking extended time off work. While ARENA maintains high standards, this means significant local talent goes undeveloped. By offering a part-time, locally-based program, TARA can develop this talent pool while participants maintain their careers.

Australia has a strong track record of producing world-class AI safety talent, including Jan Leike (Anthropic), Buck Shlegeris (Anthropic), Dr Ramana Kumar (Prev. Senior Research Scientist on AGI Safety at Google Deepmind) and others who have gone on to lead critical work at major AI safety organizations. TARA aims to build on this legacy while making technical AI safety more accessible to local talent.

Moreover, establishing a rigorous technical AI safety program demonstrates to the Australian government that the local AI safety community is serious, organized, and capable of producing high-caliber technical talent. This credibility will be particularly valuable as Australia likely develops its AI Safety Institute and looks to the local community for expertise and staffing.

While initially based on ARENA's proven curriculum, TARA will have the flexibility to expand its technical content in future iterations as AI safety research evolves.

Target audience

  1. Software engineers and ML practitioners transitioning into AI safety

  2. Postgraduate computer science / mathematics students from institutions such as The Australian National University, The University of Sydney, The University of Melbourne and The University of Queensland. 

  3. Technical professionals who can't commit to full-time programs like ARENA

  4. Local technical talent likely interested in working at the future AISI

In May 2024, AIS ANZ conducted interviews (and a follow-up survey) with 21 people considered high-prospect technical talent in ANZ. TARA aims to address five out of their top ten barriers to impact (highlighted in blue).

A Likert scale was used to measure the intensity of each barrier:

  • Not at all (0)

  • Slightly (1)

  • Somewhat (2)

  • Moderately (3)

  • Very (4)

  • Extremely (5)

Entry Requirements

  1. Strong Python programming background

  2. Basic understanding of deep learning concepts

  3. Working knowledge of linear algebra, probability and statistics

  4. Ability to commit to 10-15 hours per week for 13 weeks

  5. Ability to attend weekly in-person sessions

  6. Located in or able to travel to major ANZ cities 

High level timeline

  • 2024

    • End of November: Funding approved, begin TA recruitment

    • December: Secure locations in Sydney, Melbourne and Brisbane

    • December: Complete TA recruitment

    • December: Participant applications open

  • 2025

    • Mid January: Applications close

    • Late January: Participant selection and notification

    • Early February: Onboarding and setup

    • Mid February: Program launch

    • February-April: Core curriculum

    • May: Final projects and presentations

    • June: Program evaluation and planning for next cohort

Which cities will it run in?

TARA will likely initially operate in Australia's major east coast hubs (Sydney, Melbourne, and Brisbane), where approximately 50% of the region's technical talent is concentrated. Each city will have a dedicated location for in-person sessions.

Additional satellite locations in other Australian and New Zealand cities may be established based on applicant strength and geographic clustering. Even a small group of two exceptional applicants in a city like Adelaide, Perth, Wellington, or Auckland could warrant establishing a local cohort. This flexible approach ensures we don't miss out on strong talent while maintaining program quality.

TARA Teaching Structure

The TA role is critical to TARA's success and will be our first priority once funding is secured. We plan to support up to six learning groups of up to six participants each, with flexibility in TA arrangements based on candidate availability.

Staffing Models (in order of preference):

  1. Primary Model: Single Lead TA

    1. One experienced TA oversees all groups for the full program duration

    2. Ensures consistency in teaching

    3. Optimal for program coordination

    4. While challenging to staff, this is our preferred approach

  2. Modular Model: Rotating TAs

    1. Different TAs cover specific course modules

    2. Leverages specialised expertise for different topics

    3. Provides flexibility for TAs with limited availability

TARA Group Structure

  • Maximum 6 participants per learning group

  • Maximum 6 learning groups total

  • Smaller satellite locations (2-3 participants) will be merged virtually

  • Each group receives about 1.5 hours of dedicated TA time per week

  • Mixed in-person and virtual support based on geographic distribution

Weekly Learning Structure

Each week is structured to maximize learning while accommodating participants' work commitments:

  1. Independent Learning (3-4 hours)

    1. Self-paced study of core concepts

    2. Preparation exercises for Saturday sessions

  2. Saturday In-Person Sessions (6-8 hours)

    1. Intensive paired programming sessions

    2. Hands-on implementation of concepts

    3. Direct access to Teaching Assistants

    4. Group discussion and problem-solving

  3. Weekly Support

    1. Ongoing support via Slack throughout the week

    2. Optional peer study groups

    3. Weekly homework review and feedback

Course Timeline

Venue Details

While we’re hoping to secure free venues through university partnerships, we're budgeting for room hire as a contingency. Any unused venue funds will be returned.

Budget Breakdown (USD)

Prior Experience/Track Record

Since founding AI Safety Australia & New Zealand, we have rapidly established ourselves as an important AI safety organisation in the region, demonstrating strong execution and community building. Over the past six months, we accomplished the following:

  • Major Events & Programs

    • Sold out our inaugural AI Safety Careers Conference 4.5 weeks before the event date, featuring 15 speakers across technical and policy tracks, with 90 RSVPs

    • Successfully launched and sustained monthly meetups across five major cities (Sydney, Melbourne, Brisbane, Canberra, Wellington), engaging 50+ community members weekly

    • Partnered with the Good Ancestors Project to facilitate community input into government AI safety inquiries

  • Community Building & Professional Development

    • Built and led a team of 10 dedicated volunteers who help run our growing organisation

    • Established a career planning program

    • Created and facilitated online networking events

    • Built robust community channels with 550+ Facebook group members, 340+ newsletter subscribers, and 110+ LinkedIn followers

  • Strategic Initiatives

    • Submitted formal recommendations to government AI safety inquiries

    • Incubated multiple successful projects including a Governance Newsletter and specialised working groups

I (Yanni Kyriacos) also have direct experience designing and running structured educational programs. While serving as Associate Strategy Director at Edge Marketing Agency, I developed and led an intensive 8-week internship program for six students from diverse backgrounds, including design, strategy, marketing, content, business development, and account management. I personally managed the full lifecycle of the program, from recruitment through to final delivery, including pairing interns with appropriate mentors and securing a real-world client brief from RESULTS Australia. Under my mentorship, the cohort collaborated to deliver meaningful work for an established non-profit organization, demonstrating my ability to guide diverse groups through complex learning experiences while delivering tangible value to stakeholders.

This track record demonstrates our ability to:

  1. Execute large-scale technical programs

  2. Build and maintain engaged communities

  3. Connect with key stakeholders across academia, industry, and government

  4. Identify and develop talent in the AI safety space

Ryan Kidd (donated $14,901)

about 1 month ago

Main points in favor of this grant

  1. ARENA is a great curriculum and Yanni’s adaptation to a weekend course seems well-considered.

  2. I’ve visited several Australian AI safety and EA conferences this year (and for the past 5 years) and I think that the local talent base is quite strong. MATS has accepted ~12 Australian scholars (~4% of all alumni) and they did great!

  3. Talking with several students and professional software engineers in Australia at recent conferences has convinced me that there is value in a weekend version of the ARENA course for people who want to upskill in AI safety research engineering, but cannot take off work and travel to the UK for a month to do ARENA.

  4. I’ve been impressed by Yanni’s operations and marketing ability with regard to the AI Safety ANZ Careers Conference 2024 and AI Safety ANZ’s recent seminar/workshop series.

  5. Australians can get work visas for the UK and USA (E-3) relatively easily, which means they can emigrate to the SF Bay Area and London for AI safety work.

Donor's main reservations

  1. Yanni hasn’t yet found a good TA to run this course, which seems important for this to go well. I expect this to be much easier with funding, however, and I’m confident Yanni can find someone qualified, probably from the pool of MLAB and ARENA alumni.

  2. As with all training programs that empower ML engineers and researchers, there is some concern that alumni will work on AI capabilities rather than safety. Therefore, it’s important to select for value alignment in applicants, as well as technical skill.

Process for deciding amount

I decided to fund the entire project because it’s quite cheap for the scale ($459-596/participant) and I expect I have a stronger and more informed opinion of the Australian talent pool than other funders. Also, I trust Yanni to refund the extra $3419 for room bookings if he is unable to secure free event space.

Conflicts of interest

I am an Australian citizen and have personal and professional ties to Australian AI safety and EA communities. Yanni recently invited me to present at the AI Safety ANZ Careers Conference 2024, although I received no financial compensation. I am an Advisor to AI Safety ANZ, Yanni’s organization, though I receive no financial compensation for this role. This year, I presented at the Australian AI Safety Forum 2024 and I have previously presented at EAGxAustralia 2022 and 2023, for which I received travel and accommodation funding.

Ryan Kidd (donated $14,901)

about 1 month ago

@RyanKidd *Also, I trust Yanni to refund the extra $3419 for room bookings if he is able to secure free event space.