Are some AGI systems too risky to release? Meta says so.

The company is pausing the development of AGI systems deemed too "high" or "critical" risk.
By Christianna Silva
Are Meta's AGI plans shifting? The Meta AI logo displayed on a mobile phone in Brussels, Belgium, on January 26, 2025. Credit: Jonathan Raa/NurPhoto via Getty Images

Since AI burst into the mainstream, its creators have kept a lead foot on the gas. But according to a new policy document, Meta, under CEO Mark Zuckerberg, might slow or stop the development of AGI systems it deems too "high risk" or "critical risk."

AGI is an AI system that can do anything a human can do, and Zuckerberg has promised to make it openly available one day. But in a document titled "Frontier AI Framework," Meta concedes that some highly capable AI systems won't be released publicly because they could be too risky.

The framework "focuses on the most critical risks in the areas of cybersecurity threats and risks from chemical and biological weapons."

"By prioritizing these areas, we can work to protect national security while promoting innovation. Our framework outlines a number of processes we follow to anticipate and mitigate risk when developing frontier AI systems," a press release about the document reads.

For example, the framework aims to identify "potential catastrophic outcomes related to cyber, chemical and biological risks that we strive to prevent." Meta says it also conducts "threat modeling exercises to anticipate how different actors might seek to misuse frontier AI to produce those catastrophic outcomes" and has "processes in place to keep risks within acceptable levels."

If the company determines the risks are too high, it will keep the system internal instead of allowing public access.

"While the focus of this Framework is on our efforts to anticipate and mitigate risks of catastrophic outcomes, it is important to emphasize that the reason to develop advanced AI systems in the first place is because of the tremendous potential for benefits to society from those technologies," the document reads.

Still, Meta isn't denying that the risks are there.


Christianna Silva
Senior Culture Reporter

Christianna Silva is a senior culture reporter covering social platforms and the creator economy, with a focus on the intersection of social media, politics, and the economic systems that govern us. Since joining Mashable in 2021, they have reported extensively on meme creators, content moderation, and the nature of online creation under capitalism.

Before joining Mashable, they worked as an editor at NPR and MTV News, a reporter at Teen Vogue and VICE News, and as a stablehand at a mini-horse farm. You can follow them on Bluesky @christiannaj.bsky.social and Instagram @christianna_j.
