Resources
In 2023, hundreds of AI experts and executives signed the following statement:
"Mitigating the risk of extinction from AI should be a global priority alongside other societal-scale risks such as pandemics and nuclear war."
AI safety is a growing field focused on preventing harm from AI systems. The resources below can help you learn more about the challenges posed by advanced AI and the ongoing efforts to mitigate these risks.
Learn More
Website: AISafety.com
Video: Could AI wipe out humanity?, 80,000 Hours
Article: AI Safety, Wikipedia
Article: Preventing an AI-related catastrophe, 80,000 Hours
Paper: Is Power-Seeking AI an Existential Risk?, J. Carlsmith