Open letter from 800 leaders urges a pause on superintelligent AI until safety is proven
FLI's open letter, signed by 800+ leaders, urges a pause on superintelligent AI until safety is verified, citing low public support and demand for regulation.
More than 800 prominent politicians, scientists, business leaders, and public figures have signed an open letter urging an immediate ban on the development of superintelligent artificial intelligence. The signatories include “the godfather of AI” Geoffrey Hinton, former Baidu president Zhang Yaqin, actor Meghan Markle, and writer Stephen Fry.
The letter, published by the nonprofit Future of Life Institute (FLI), calls for a halt to building systems that could surpass human intelligence until the scientific community can verify their safety and the public signals clear support for their deployment. FLI’s research indicates that only 5% of Americans back the current, largely unchecked trajectory of AI, while nearly 75% favor strict regulation.
FLI president Max Tegmark emphasized that the real risk comes not from competing companies or nations, but from what humanity itself is creating. He noted that the goal is not to shut down research, but to prevent a loss of control over systems capable of acting autonomously and potentially dangerously. The nuance matters: the appeal is for a pause with safeguards, not a retreat from innovation.
Among the signatories are Turing Award winners Yoshua Bengio and Yao Qizhi, former U.S. National Security Advisor Susan Rice, former Chairman of the Joint Chiefs of Staff Mike Mullen, as well as Steve Wozniak and Richard Branson. The call lands amid an accelerating race by tech giants — OpenAI, Google, and others — to build so-called artificial general intelligence, a push the signers frame as a possible turning point in human history. The timing, in that light, hardly looks coincidental.