Alignment Ecosystem Development

There are many projects, ready to join or waiting to be founded, that could improve the alignment ecosystem. Let's build some.

Featured

AISafety.com

AISafety.com is being rebuilt, via the AED Discord, into a new homepage for AI safety.

AI Safety Ideas

Databases of research ideas towards alignment, with features for fostering collaboration and connecting teams.

Forum Magnum

The open-source codebase that the AI Alignment Forum, Effective Altruism Forum, and LessWrong run on, supporting alignment discussion.

AI Safety Info

aisafety.info is a single point of access for learning about AGI safety, created by Rob Miles's volunteer team. Volunteers can distill content into our database of questions and answers, or help code the React/Remix UI or Python bot.

EleutherAI

EleutherAI is a grassroots collective of researchers working to open source AI research projects. They have many active alignment channels.

ReadingWhatWeCan

Reading What We Can is a reading list designed to upskill people rapidly on alignment topics.

AI Safety GiveWiki

A platform for retroactive public goods funding, an approach ACX and Yudkowsky have identified as an important philanthropic innovation.

EigenTrust Network

EigenTrust is a mechanism for scaling trust by allowing individuals to leverage their network's combined experience. First, peer vetting of alignment research contributions at scale. Then, the world!
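The core of the original EigenTrust algorithm (Kamvar et al.) is a power iteration over a row-normalized matrix of local trust ratings. A minimal sketch, assuming simple numeric peer ratings and a uniform pre-trust vector (the function name and parameters here are illustrative, not the network's actual API):

```python
import numpy as np

def eigentrust(local_trust, pretrust=None, alpha=0.15, tol=1e-9, max_iter=1000):
    """Compute global trust scores via EigenTrust power iteration.

    local_trust[i][j] >= 0 is how much peer i trusts peer j.
    alpha blends in the pre-trust vector, which also guarantees convergence.
    """
    C = np.asarray(local_trust, dtype=float)
    n = C.shape[0]
    # Row-normalize so each peer's outgoing trust sums to 1.
    row_sums = C.sum(axis=1, keepdims=True)
    row_sums[row_sums == 0] = 1.0  # avoid division by zero for isolated peers
    C = C / row_sums
    p = np.full(n, 1.0 / n) if pretrust is None else np.asarray(pretrust, float)
    t = p.copy()
    for _ in range(max_iter):
        # Each peer's global trust is the trust-weighted opinion of its raters.
        t_next = (1 - alpha) * (C.T @ t) + alpha * p
        if np.linalg.norm(t_next - t, 1) < tol:
            return t_next
        t = t_next
    return t

# Example: peers 1 and 2 both rate peer 0 highly, so peer 0 ends up
# with the largest global trust score.
scores = eigentrust([[0, 1, 1], [4, 0, 1], [4, 1, 0]])
```

Because the normalized matrix is row-stochastic and the pre-trust vector sums to one, the scores form a probability distribution, so they can be read directly as relative trust weights.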

Propose new project

Join our Discord, then start a new thread in #project-ideas to bring it to the attention of our devs! You can also create an event on the Discord to pitch it by voice, or join our weekly calls.


Previously featured

AI Safety Homepage

A homepage for AI safety, linking out to the relevant parts of the ecosystem.