Hawking argues that this statistical near-certainty of doom over the next few centuries will most likely be self-inflicted. The candidates he singles out are the usual suspects: nuclear war, global warming, and genetically engineered superviruses.
Previously, Hawking has also warned about advances in artificial intelligence turning against humanity - a fear that is shared by other prominent figures, including SpaceX and Tesla founder Elon Musk.
Fortunately, there is hope.
"We are not going to stop making progress, or reverse it, so we have to recognize the dangers and control them," he said. "I'm an optimist, and I believe we can."
It may not sound very optimistic to catalog all the weird and wonderful ways we could be driven extinct, but armed with this knowledge we can perhaps avert the worst cataclysms we might inflict on ourselves. Failing that, Hawking argues, we should focus our efforts on space exploration and on spreading humanity throughout the solar system and, possibly, beyond. The Earth may be doomed, but humanity needn't be.