AI Apocalypse Treaty


Many experts warn that the existential threat from AI is at least as dangerous as a nuclear apocalypse. With this in mind, I propose tying the creation of AGI to the M.A.D. doctrine (mutually assured destruction). The creation of an artificial intelligence above a certain threshold would be considered a nuclear first strike and treated as such. That doctrine has helped keep us from nuclear war for over 70 years. I propose we use the same wording and methodology to ensure a safe future with AI.


Slave2liberty
