Risk of astronomical suffering

Risks of astronomical suffering, also called suffering risks or s-risks, are risks involving much more suffering than all that has occurred on Earth so far.[3][4]
According to some scholars, s-risks warrant serious consideration as they are not extremely unlikely and can arise from unforeseen scenarios. Although they may appear speculative, factors such as technological advancement, power dynamics, and historical precedents indicate that advanced technology could inadvertently result in substantial suffering.[5]
Sources of possible s-risks include advanced artificial intelligence,[6][7] space colonization[8] and the spread of wild animal suffering to other planets.[9]
S-risks can be intentional, driven by factors like tribalism, sadism or a desire for retribution; or incidental, for example arising as a byproduct of economic incentives.[10]
Sources of s-risk
Artificial intelligence
Artificial intelligence is central to s-risk discussions because it may eventually enable powerful actors to control vast technological systems. In a worst-case scenario, AI could be used to create systems of perpetual suffering, such as a totalitarian regime expanding across space.[11] Additionally, s-risks might arise incidentally, such as through AI-driven simulations of conscious beings experiencing suffering, or from economic activities that disregard the well-being of nonhuman or digital minds.[4] Steven Umbrello, an AI ethics researcher, has warned that biological computing may make system design more prone to s-risks.[6]
Space colonization
Space colonization could increase suffering by introducing wild animals to new environments, leading to ecological imbalances. In unfamiliar habitats, animals may struggle to survive, facing hunger, disease, and predation. These challenges, combined with unstable ecosystems, could cause population crashes or explosions, resulting in widespread suffering. The lack of natural predators or proper biodiversity on colonized planets could worsen the situation, mirroring Earth's ecological problems on a larger scale. This raises ethical concerns about the unintended consequences of space colonization, which could propagate immense animal suffering in new, unstable ecosystems.

Phil Torres argues that space colonization poses significant "suffering risks": expansion into space would lead to the creation of diverse species and civilizations with conflicting interests. These differences, combined with advanced weaponry and the vast distances between civilizations, could result in catastrophic and unresolvable conflicts. Strategies such as a "cosmic Leviathan" to impose order, or deterrence policies, are unlikely to succeed because of physical limitations in space and the destructive power of future technologies. Torres therefore concludes that space colonization could create immense suffering and should be delayed or avoided altogether.[12]
Magnus Vinding's "astronomical atrocity problem" questions whether vast amounts of happiness can justify extreme suffering from space colonization. He highlights moral concerns such as diminishing returns on positive goods, the potentially incomparable weight of severe suffering, and the priority of preventing misery. He argues that if colonization is inevitable, it should be led by agents deeply committed to minimizing harm.[13]
Genetic engineering
David Pearce has argued that genetic engineering is a potential s-risk. Pearce argues that while technological mastery over the pleasure-pain axis and solving the hard problem of consciousness could make the eradication of suffering possible, the same technologies could also widen the hedonic range that sentient beings are able to experience. These technologies might make it feasible to create "hyperpain" or "dolorium": states of suffering beyond the human range.[14]
Excessive criminal punishment
S-risk scenarios may also arise intentionally. Such risks escalate in contexts like warfare or terrorism, especially when advanced technology is involved, as conflicts can amplify destructive tendencies like sadism, tribalism, and retributivism. War often intensifies these dynamics, and catastrophic threats may be used to force concessions. Agential s-risks are further aggravated by malevolent traits in powerful individuals, such as narcissism or psychopathy. This is exemplified by totalitarian dictators like Hitler and Stalin, whose actions in the 20th century inflicted widespread suffering.[10]
Mitigation strategies
To mitigate s-risks, efforts focus on researching and understanding the factors that exacerbate them, particularly in emerging technologies and social structures. Targeted strategies include promoting safe AI design, ensuring cooperation among AI developers, and modeling future civilizations to anticipate risks. Broad strategies may advocate for moral norms against large-scale suffering and stable political institutions. According to Anthony DiGiovanni, prioritizing s-risk reduction is essential, as it may be more manageable than other long-term challenges, while avoiding catastrophic outcomes could be easier than achieving an entirely utopian future.[15]
References
- ^ Bostrom, Nick (2013). "Existential Risk Prevention as Global Priority" (PDF). Global Policy. 4 (1): 15–31. doi:10.1111/1758-5899.12002. Archived (PDF) from the original on 2014-07-14. Retrieved 2024-02-12 – via Existential Risk.
- ^ Baumann, Tobias (2017). "S-risk FAQ". Center for Reducing Suffering. Archived from the original on 2023-07-09. Retrieved 2023-09-14.
- ^ Daniel, Max (2017-06-20). "S-risks: Why they are the worst existential risks, and how to prevent them (EAG Boston 2017)". Center on Long-Term Risk. Archived from the original on 2023-10-08. Retrieved 2023-09-14.
- ^ a b Hilton, Benjamin (September 2022). "'S-risks'". 80,000 Hours. Archived from the original on 2024-05-09. Retrieved 2023-09-14.
- ^ Baumann, Tobias (2017). "S-risks: An introduction". centerforreducingsuffering.org. Retrieved 19 October 2024.
- ^ a b Umbrello, Steven; Sorgner, Stefan Lorenz (June 2019). "Nonconscious Cognitive Suffering: Considering Suffering Risks of Embodied Artificial Intelligence". Philosophies. 4 (2): 24. doi:10.3390/philosophies4020024. hdl:2318/1702133.
- ^ Sotala, Kaj; Gloor, Lukas (2017-12-27). "Superintelligence As a Cause or Cure For Risks of Astronomical Suffering". Informatica. 41 (4). ISSN 1854-3871. Archived from the original on 2021-04-16. Retrieved 2021-02-10.
- ^ Torres, Phil (2018-06-01). "Space colonization and suffering risks: Reassessing the "maxipok rule"". Futures. 100: 74–85. doi:10.1016/j.futures.2018.04.008. ISSN 0016-3287. S2CID 149794325. Archived from the original on 2019-04-29. Retrieved 2021-02-10.
- ^ Kovic, Marko (2021-02-01). "Risks of space colonization". Futures. 126: 102638. doi:10.1016/j.futures.2020.102638. ISSN 0016-3287. S2CID 230597480.
- ^ a b c "A Typology of S-Risks". Center for Reducing Suffering. Retrieved 19 October 2024.
- ^ Minardi, Di (16 October 2020). "The grim fate that could be 'worse than extinction'". BBC. Retrieved 2025-01-11.
- ^ Torres, Phil (2018). "Space colonization and suffering risks: Reassessing the "maxipok rule"". Futures. 100: 74–85. doi:10.1016/j.futures.2018.04.008. Retrieved 19 October 2024.
- ^ Vinding, Magnus (2020). Suffering-Focused Ethics: Defense and Implications (PDF). Ratio Ethica. ISBN 9798624910911.
- ^ Pearce, David. "Quora Answers by David Pearce (2015 - 2024) : Transhumanism with a human face".
- ^ "A Beginner's Guide to Reducing S-Risks". Longtermrisk.org. 5 September 2023. Retrieved 25 October 2024.
Further reading
- Baumann, Tobias (2022). Avoiding the Worst: How to Prevent a Moral Catastrophe. Independently published. ISBN 979-8359800037.
- Metzinger, Thomas (2021-02-19). "Artificial Suffering: An Argument for a Global Moratorium on Synthetic Phenomenology". Journal of Artificial Intelligence and Consciousness. 8: 43–66. doi:10.1142/S270507852150003X. ISSN 2705-0785.
- Minardi, Di (2020-10-15). "The grim fate that could be 'worse than extinction'". BBC Future. Retrieved 2021-02-11.
- Baumann, Tobias (2017). "S-risks: An introduction". Center for Reducing Suffering. Retrieved 2021-02-10.
- Althaus, David; Gloor, Lukas (2016-09-14). "Reducing Risks of Astronomical Suffering: A Neglected Priority". Center on Long-Term Risk. Retrieved 2021-02-10.