The annual risk of global catastrophe surpasses 0.2%

There is a one-in-500 chance that the human race will be wiped out by the end of 2017, a leading mathematician has calculated. Nuclear war, global warming and artificial intelligence (AI) have all been discussed as potential threats to humanity, and University of Barcelona statistician Fergus Simpson says there is a 0.2% chance of an apocalypse taking place in any given year of this century.

Simpson based his calculations on the so-called Doomsday Argument, also known as the "Carter catastrophe" after Brandon Carter, the astrophysicist who first proposed it in 1983. The Doomsday Argument is a probability-based theory that uses the number of humans born so far to predict the number still to be born: if our position in the sequence of all humans is typical rather than unusually early, roughly as many births should lie ahead of us as behind. Simpson calculates that around 100 billion people have already lived, so the median estimate is that another 100 billion will be born, putting the race halfway through its lifespan.
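To make that arithmetic concrete, here is a minimal sketch in Python of the classic Gott-style version of the estimate (an illustration of the reasoning, not Simpson's actual code; the 100 billion figure is the article's). Treating our birth rank as uniformly distributed among all humans who will ever be born, the median guess for the total is double the count so far, and confidence intervals follow directly:

def total_births_interval(rank, confidence=0.95):
    # Assume rank/N_total is uniform on (0, 1], so P(rank/N <= x) = x.
    # A two-sided interval at the given confidence cuts off equal tails.
    upper_tail = (1 + confidence) / 2   # value of rank/N at the low-N end
    lower_tail = (1 - confidence) / 2   # value of rank/N at the high-N end
    return rank / upper_tail, rank / lower_tail

r = 100e9                     # ~100 billion humans born so far
median_total = 2 * r          # rank/N = 0.5 at the median
lo, hi = total_births_interval(r)
print(f"median total births:  {median_total:.3g}")        # 2e+11
print(f"95% interval:         {lo:.3g} to {hi:.3g}")      # ~1.03e+11 to 4e+12
print(f"median future births: {median_total - r:.3g}")    # 1e+11

The median, another 100 billion births, is where the "halfway through its lifespan" claim comes from; Simpson's paper layers a more detailed model on top of this basic logic.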

Simpson turned to football to illustrate how hard such rare events are to foresee. "Our key conclusion is that the annual risk of global catastrophe currently exceeds 0.2%," he wrote in a paper posted on arXiv. "In a year when Leicester City FC were crowned Premier League champions, we are reminded that events of this rarity can prove challenging to anticipate, yet they should not be ignored."

According to the Daily Mail, Simpson is particularly concerned by the possibility of nuclear annihilation, especially if countries such as North Korea build fully functioning nuclear missiles.

"When at least eight sovereign states are in possession of nuclear weapons (including one whose leader has executed members of his own family), a head-in-the-sand approach appears both dangerous and irresponsible."

There is some good news, however: Simpson calculates there is an 87% chance the human race will survive to at least 2100.
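As a rough cross-check (my own arithmetic, not the paper's model), compounding a constant 0.2% annual risk over the remaining years of the century lands in the same neighbourhood as that 87% figure; Simpson's fuller calculation evidently treats the risk as something other than a flat rate:

annual_risk = 0.002            # the paper's 0.2% per year
years = 2100 - 2017            # horizon from the article's publication
survival = (1 - annual_risk) ** years
print(f"P(survive to 2100) with a flat rate: {survival:.1%}")  # ~84.7%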

The announcement comes after Professor Stephen Hawking said humans must find a new home within 1,000 years because Earth is doomed. Hawking has cited AI as a potential extinction-level event (ELE) and has said we must be ready to colonise other planets as an insurance policy in case our own becomes uninhabitable.