Released in the waning days of the current administration, the interim final rule on AI diffusion eases licensing hurdles for AI chip sales to allied and partner nations while constraining "countries of concern" from acquiring advanced AI systems and the computing power used to train them, the White House said.
Commerce Secretary Gina Raimondo said the rules help to build a "trusted technology ecosystem" around the world by ensuring that the U.S. can "continue to broadly share the benefits" of AI technology with friendly countries while keeping these systems "out of the hands" of adversaries, who the Biden administration has said could use this tech to support cyberattacks and mass surveillance, among other risks.
"Managing these very real national security risks requires taking into account the evolution of AI technology, the capabilities of our adversaries, and the desire of our allies to share in the benefits of this technology," Raimondo said in a statement Monday. "We've done that with this rule."
The rule was quickly met with pushback from leading industry players such as Nvidia Corp., which called it "unprecedented and misguided."
"This sweeping overreach would impose bureaucratic control over how America's leading semiconductors, computers, systems and even software are designed and marketed globally," Ned Finkle, the company's vice president of government affairs, argued in a blog post.
"The new rules would control technology worldwide, including technology that is already widely available in mainstream gaming PCs and consumer hardware," Finkle said. "Rather than mitigate any threat, the new Biden rules would only weaken America's global competitiveness, undermining the innovation that has kept the U.S. ahead."
He said the company looked forward to a less active, more industry-friendly approach from the incoming Trump administration.
The new rule revises the Export Administration Regulations' controls on advanced computing integrated circuits and bars the transfer of AI "model weights" for certain advanced "closed-weight dual-use" AI models to nontrusted actors. The Commerce Department's Bureau of Industry and Security said it will host a virtual public briefing Wednesday to "discuss the details" of the rule.
The rule also creates new license exceptions and authorizations, including placing no restrictions on chip sales to 18 key allies and partners in Europe and elsewhere, which will enable "jurisdictions with robust technology protection regimes and technology ecosystems aligned with the national security and foreign policy interests of the United States to benefit from seamless large-scale purchases," the White House said.
Chip orders with collective computation power up to about 1,700 advanced graphics processing units, or GPUs — which are used in training AI models — will also not require a license under the new rule, which will help to rapidly accelerate "low-risk shipments" of U.S. technology around the world, including the "overwhelming majority" of chip orders that are being placed "by universities, medical institutions, and research organizations for clearly innocuous purposes."
Additionally, the rule allows entities in close-ally countries that meet high security and trust standards to obtain highly trusted "universal verified end user" status, while entities that meet those standards and are based in any destination that is not a country of concern can apply for "national verified end user" status to boost their purchasing power, the White House said.
While nonverified entities outside close allies can still purchase up to the equivalent of 50,000 advanced GPUs per country in order to ensure "that U.S. technology is available to service foreign governments, healthcare providers, and other local businesses," the rule restricts "countries of concern" from using advanced semiconductors sold abroad to train advanced AI systems, although access is still permitted "for general-purpose applications from telecommunications to banking," the White House said.
The administration said the limits on model weight transfers allow these models to be "stored and used securely around the world while helping prevent illicit adversary access."
The rule announced Monday is the latest move by the Biden administration over the past four years to cultivate a secure and trusted technology ecosystem for the responsible use and diffusion of AI.
These measures include a comprehensive executive order issued in October 2023 that set out a road map for protecting consumers and workers from privacy, discrimination and other potential harms presented by the widespread deployment of AI.
--Editing by Brian Baresch.