I’d like to get deeper into the AI alignment literature. Where should I look?

The AI Safety Fundamentals course is a great way to get up to speed on alignment. You can apply to take it with a cohort of other students and a mentor, though many applicants are turned away. Alternatively, you can work through the materials independently, ideally after finding a group of others to study them with.

Another great way to explore is reading Rationality: A-Z, which covers skills that are valuable for anyone trying to think clearly about complex issues.

AISafety.info

We’re a global team of specialists and volunteers from various backgrounds who want to ensure that the effects of future AI are beneficial rather than catastrophic.