Project summary
I am seeking support to attend UNIDIR’s Global Conference on AI, Security and Ethics 2026 in Geneva as an in-person participant. Although my abstract was not selected for the formal programme, I was explicitly invited to attend the conference in person. I want to use this opportunity to deepen my work on AI governance and safety, expand my network, engage directly with key stakeholders, and contribute to ongoing discussions on how advanced AI should be governed.
I am applying for support to attend UNIDIR’s Global Conference on AI, Security and Ethics 2026 in Geneva as an in-person participant.
I submitted an abstract to the conference. While it was not selected for the formal programme, UNIDIR explicitly welcomed me to attend in person as a participant. I take that invitation seriously, because this is one of the few forums where AI governance, security, ethics, and international coordination are discussed together by researchers, policymakers, practitioners, and institutional actors.
My purpose in attending is not only to listen, but to participate actively in the wider conversation around advanced AI governance and safety. I am working on questions related to runtime governance, control, safety, standards, and the problem of how advanced AI systems can remain governable once they are already live in real environments. This conference is directly relevant to that work.
I want to use the opportunity in three ways. First, to deepen my understanding of how different stakeholders currently think about AI security, international governance, standards, and risk. Second, to expand my network by meeting people from policy, multilateral institutions, research, and practice who are shaping the field. Third, to contribute to the conversation itself through direct exchanges with participants, so that my work is better informed and my perspective can help shape how some of these questions are framed going forward.
The conference matters because AI governance is no longer only a technical or academic issue. It is becoming an institutional, geopolitical, and societal issue. That means it is increasingly important to be in the room, speak with the right people, and build relationships across different stakeholder groups. For me, this would be a high-leverage opportunity to strengthen both the substance of my work and the network around it.
The first goal is to attend UNIDIR’s AISE 2026 conference in person and engage directly with one of the most relevant international discussions on AI security, ethics, and governance.
The second goal is to expand my network among policymakers, researchers, practitioners, and other stakeholders working on AI governance, standards, and safety. I want to use the conference not only as a learning opportunity, but as a place to build lasting relationships that can matter for future collaboration, exchange, and visibility.
The third goal is to strengthen my own work by testing ideas, listening carefully to different perspectives, and participating in discussions that can sharpen my understanding of where the field is moving.
The fourth goal is to contribute to the discussion itself. I do not see participation as passive attendance. I want to speak with people across sectors, exchange views, and help shape how questions of AI governance and safety are understood, especially around operational governance, control, and responsibility.
I will achieve this by attending the full conference in Geneva, identifying in advance the participants and stakeholder groups I most want to engage, participating actively in discussions, and following up with key contacts and consolidating lessons learned after the conference.
This funding would be used to cover part of the cost of attending the conference in Geneva in person.
The grant would go toward airfare, accommodation, local transportation, meals, and directly related travel costs. The request is modest, but it would materially improve my ability to attend, participate fully, and make proper use of the opportunity.
Airfare: $500
Accommodation: $550
Meals: $250
Local transport: $100
Travel-related incidentals and contingency: $100
Total: $1,500
This application is for an individual trip connected to my broader work in AI governance and safety.
My name is Pedro Bentancour Garin. I have an interdisciplinary background spanning engineering, political science, philosophy, and doctoral-level work in Ancient History and Classical Archaeology. Over time, my work has increasingly focused on governance, institutions, ethics, and the long-term control of powerful systems.
I have built a growing presence in the AI governance and safety space through writing, analysis, and direct engagement. I have participated in seminars and discussions connected to the UN and ITU, and I have also been invited to meetings with staff connected to the EU AI Office and U.S. congressional circles. That has given me a meaningful foundation, but I want to deepen and broaden that network further.
Earlier in my career, I founded Treehoo, an early sustainability-focused internet platform that reached users in more than 170 countries and became a finalist at Globe Forum in Stockholm alongside companies such as Tesla.
My academic work has been supported by more than 15 competitive research grants, including funding from the Royal Swedish Academy of Sciences, and research stays at institutions such as Oxford, the Getty Center, the University of Melbourne, and the Vatican.
This conference would help me connect my existing experience and network more directly to the current international AI governance discussion.
The main risk is simply that I am unable to attend in person without financial support.
If that happens, the project does not collapse, but I lose an unusually concentrated opportunity to engage directly with relevant stakeholders in one place: the learning value, the networking value, and the chance to contribute in person to discussions that bear directly on my work.
So the downside is mainly lost leverage. The upside is the opposite: stronger relationships, sharper understanding, greater visibility in the field, and better-informed work going forward.
$0.
This trip has not received external funding to date.