My highly personal skepticism braindump on existential risk from artificial intelligence
This document seeks to outline why I feel uneasy about high existential risk estimates from AGI (e.g., 80% doom by 2070).