Stephen Hawking sets a deadline for founding space colonies
While delivering a lecture at the Oxford Union debate society, celebrated physicist and author Professor Stephen Hawking issued a dire warning to humankind.
Hawking warned that humanity would not survive another 1,000 years unless it founded colonies beyond Earth.
This isn't the first time that the physicist has painted a bleak picture of the future.
We take a look at his other warnings.
A wary mind: Stephen Hawking's warnings
Hawking admits he has no answer on humanity's survival
In 2006, Professor Stephen Hawking turned to the internet to seek answers to the question of humanity's survival.
Hawking posed an open question to the online community, which read, "In a world that is in chaos politically, socially and environmentally, how can the human race sustain another 100 years?"
Answering his own question, Hawking admitted, "I don't know the answer".
Stephen Hawking warns against making contact with aliens
In 2010, in a series for the Discovery Channel, Professor Stephen Hawking said it was "perfectly rational" to assume that intelligent life exists, considering the vastness of the universe.
However, Hawking said that humanity should do everything possible to avoid contact with aliens.
"If aliens visit us, the outcome would be much as when Columbus landed in America," the physicist said.
Hawking warns against artificial intelligence
In 2014, when asked about an upgrade to the basic AI technology he uses to communicate, Stephen Hawking warned that "the development of full artificial intelligence could spell the end of the human race".
Hawking said that the primitive forms of AI developed so far have been useful to humankind, but cautioned that researchers should be wary of creating AI that surpasses humanity itself.
Stephen Hawking warns of man-made catastrophes
In January 2016, Stephen Hawking warned that a catastrophe on Earth in the next 1,000 years was a "near certainty".
The physicist said that unforeseeable and unstoppable advances in science and technology posed the greatest danger, adding that nuclear war, global warming and genetically engineered viruses could well cause a crippling disaster for the human race.