Stephen Hawking’s final writings have been assembled into a new book to be published this week. It contains a warning about the rise of ‘super-rich superhumans’.

Stephen Hawking just killed God

STEPHEN Hawking has spoken from beyond the grave. There is no God, he says. There is no afterlife. There is only wishful thinking - or taking control of our own future.

Hawking had a lot of time to think. Trapped in his steadily failing body, he let his mind roam the expanse of the universe, the future of humanity - and the existence of an afterlife.

He formed some very strong opinions about all of them.

And when it comes to God, his view was grounded in personal experience.

"There is no God. No one directs the universe," he writes in Brief Answers to the Big Questions - an assembly of papers he was writing in the months leading up to his death.

"For centuries, it was believed that disabled people like me were living under a curse that was inflicted by God," he adds. "I prefer to think that everything can be explained another way, by the laws of nature".

Hawking had questioned the existence of deities before. But this time he was more resolute. He wrote shortly before he died that he had come to the 'profound realisation' that there was no such thing as a supreme being or an afterlife.

"We are each free to believe what we want, and it's my view that the simplest explanation is that there is no God," he says.

"No one created the universe and no one directs our fate. This leads me to a profound realisation: there is probably no heaven and afterlife either."

At the launch of his book, Hawking's computer-generated voice read aloud extracts.

His daughter, Lucy Hawking, said: "It was very emotional. I turned away because I had tears forming in my eyes.

"I feel sometimes like he's still here because we talk about him and we hear his voice and we see images of him, and then we have the reminder that he's left us."

Hawking himself remained pragmatic to the end.

"I think belief in the afterlife is just wishful thinking.

"There is no reliable evidence for it, and it flies in the face of everything we know in science. I think that when we die we return to dust. But there is a sense we live on, in our influence, and in the genes we pass to our children."

Instead, he placed his hopes in humanity itself.

"Remember to look up at the stars and not down at your feet. Try to make sense of what you see, and wonder about what makes the universe exist."

LOOMING CHALLENGES

Hawking was confident: "We will transcend the Earth and learn to exist in space".

But not without challenges.

His final words reiterate his fear that artificial intelligence will outmatch human intelligence in less than 100 years. And once its ambitions begin to diverge from our own, that could create a crisis that threatens humanity's very survival.

But he had another ominous warning.

Civilisation as we know it is on the brink of being overrun by super-rich superhumans.

We have the technology.

It just didn't arrive in time to save the 76-year-old theoretical physicist himself.

But the looming potential for humans to re-engineer themselves occupied much of Hawking's thoughts in his last days. He put them down amid a collection of articles addressing what he called "the big questions" facing our future.

They will be posthumously published later this week in Brief Answers to the Big Questions.

SUPERHUMAN

"I am sure that during this century people will discover how to modify both intelligence and instincts such as aggression," he writes.

And that poses a problem.

The technology is bounding ahead.

The DNA editing system CRISPR was only developed in 2012.

It allows defective strands of DNA to be cut out and replaced. It also allows sections of DNA that control characteristics, such as those involving our eyes, to be replaced with enhanced versions.

It already has Australia's defence force thinking of ways to enhance the concentration, awareness, strength, endurance and health of its soldiers.

And the race towards providing immortality - at a price - is gathering pace.

Human nature, Hawking says, makes it inevitable that those seeking an 'edge' for themselves, or their children, will abuse this technology.

And those best placed to do this will have lots of money.


"Laws will probably be passed against genetic engineering with humans," he writes. "But some people won't be able to resist the temptation to improve human characteristics, such as memory, resistance to disease and length of life."

Western society is built around the concept of progress, individual rights and competition.

So what happens if one already advantaged group in society benefits from a playing field tilted even further in their favour?

"Once such superhumans appear, there will be significant political problems with unimproved humans, who won't be able to compete," Hawking writes.

He also fears a return to the fascist world of Nazi Europe, where selective breeding - known as eugenics - was used in a bid to create a 'master race'.

 

We have the technology. But do we have the wisdom? Professor Stephen Hawking feared the implications of AI and artificially-enhanced humans. Picture: Getty

WISDOM, NOT INTELLIGENCE

It was not the first time Hawking had warned humanity about the potentially dark paths that lie ahead.

He was a loud voice in the debate about the implications of artificial intelligence.

Hawking made it clear he was afraid that cold, calculating machine logic could lead machines to take charge 'for our own good'.

"The development of full artificial intelligence could spell the end of the human race," he warned in 2014.

Hawking did not deny AI's usefulness. He did, after all, use one himself to help translate his thoughts into words.

But that usefulness was itself the most significant threat, he warned. In the ever-increasing race to produce better, faster, more capable self-learning intelligence than one's competitors, we could create a monster.

"It would take off on its own, and redesign itself at an ever-increasing rate," Hawking said.

"Humans, who are limited by slow biological evolution, couldn't compete, and would be superseded."