Stephen Hawking is really afraid of the future…
Physicist Professor Stephen Hawking has warned that human civilization is entering the most dangerous 100 years in its history and faces possible extinction from threats such as artificial intelligence (AI), human aggression, and even hostile aliens [yes, you read that right].
While Hawking believes technology can ensure mankind’s survival, he also warns that further developing AI could prove a fatal mistake. In a lengthy Q&A session on Reddit, Hawking explained why he considers AI humanity’s biggest existential threat:
If our machines don’t kill us, we might kill ourselves, the cosmologist predicts. During a tour of London’s Science Museum, Hawking warned that a major nuclear war would mean the end of human civilization, and urged people to be more empathetic, calling aggression the human race’s greatest failing and one that threatens to destroy us all.
The Independent quoted the scientist as saying:
Intelligence and aggression have the capacity to destroy us, but what do they mean for humanity’s chances of being destroyed by aliens? For the past few years, Hawking has warned that if an intelligent, more advanced alien civilization exists, it would not be friendly to less technologically advanced humans, and would have no qualms about conquering and colonizing the planet, eventually wiping out the human race. In April 2010, Hawking noted:
During a media event at the Royal Society in London in July 2015, Hawking voiced his fears again:
This article (Stephen Hawking Warns: Robots, Nuclear War & Aliens Will Wipe Out Humanity In Less Than 100 Years) is free and open source. You have permission to republish this article under a Creative Commons license with attribution to the author and AnonHQ.com.