The co-founder of ChatGPT maker OpenAI proposed building a doomsday bunker to house the company’s top researchers in case of a “rapture” triggered by the release of a new form of artificial intelligence capable of surpassing human cognitive abilities, according to a new book.

Ilya Sutskever, the man credited with being the brains behind ChatGPT, convened a meeting with key scientists at OpenAI in the summer of 2023 during which he said: “Once we all get into the bunker…”

A confused researcher interrupted him. “I’m sorry,” the researcher asked, “the bunker?”

“We’re definitely going to build a bunker before we release AGI,” Sutskever replied, according to an attendee.

The plan, he explained, would be to protect OpenAI’s core scientists from what he anticipated could be geopolitical chaos or violent competition between world powers once AGI — an artificial intelligence that exceeds human capabilities — is released.

“Of course,” he added, “it’s going to be optional whether you want to get into the bunker.”

The exchange was first reported by Karen Hao, author of the upcoming book “Empire of AI: Dreams and Nightmares in Sam Altman’s OpenAI.”

An essay adapted from the book was published by The Atlantic.

The bunker comment by Sutskever wasn’t a one-off. Two other sources told Hao that Sutskever had regularly referenced the bunker in internal discussions.

One OpenAI researcher went so far as to say that “there is a group of people — Ilya being one of them — who believe that building AGI will bring about a rapture. Literally, a rapture.”

Though Sutskever declined to comment on the matter, the idea of a secure refuge for scientists developing AGI underscores the extraordinary anxieties gripping some of the minds behind the most powerful technology in the world.

Sutskever has long been seen as a kind of mystic within OpenAI, known for discussing AI in moral and even metaphysical terms, according to the author.

At the same time, he’s also one of the most technically gifted minds behind ChatGPT and other large language models that have propelled the company into global prominence.

In recent years, Sutskever had begun splitting his time between accelerating AI capabilities and promoting AI safety, according to colleagues.

The idea of AGI triggering civilizational upheaval isn’t isolated to Sutskever.

In May 2023, OpenAI CEO Sam Altman co-signed a public letter warning that AI technologies could pose an “extinction risk” to humanity. But while the letter sought to shape regulatory discussions, the bunker talk suggests deeper, more personal fears among OpenAI’s leadership.

The tension between those fears and OpenAI’s aggressive commercial ambitions came to a head later in 2023 when Sutskever, along with then-Chief Technology Officer Mira Murati, helped orchestrate a brief boardroom coup that ousted Altman from the company.

Central to their concerns was the belief that Altman was sidestepping internal safety protocols and consolidating too much control over the company’s future, sources told Hao.

Sutskever, once a firm believer in OpenAI’s original mission to develop AGI for the benefit of humanity, had reportedly grown increasingly disillusioned.

He and Murati both told board members they no longer trusted Altman to responsibly guide the organization to its ultimate goal.

“I don’t think Sam is the guy who should have the finger on the button for AGI,” Sutskever said, according to notes reviewed by Hao.

The board’s decision to remove Altman was short-lived.

Within days, mounting pressure from investors, employees, and Microsoft led to his reinstatement. Both Sutskever and Murati ultimately left the company.

The proposed bunker — while never formally announced or planned — has come to symbolize the extremity of belief among AI insiders.

It captures the magnitude of what OpenAI’s own leaders fear their technology could unleash, and the lengths to which some were prepared to go in anticipation of what they saw as a transformative, possibly cataclysmic, new era.

The Post has sought comment from OpenAI and Sutskever.
