Biden administration on track to meet initial AI order actions
Federal agencies are making headway on the first actions required under President Joe Biden’s artificial intelligence executive order, meeting early deadlines to build up the federal AI workforce, fund government AI projects, and convene AI officials across the federal government.
The Office of Management and Budget, Office of Personnel Management, General Services Administration, National Science Foundation, and Department of Labor indicated they’re on track with requirements that were set to be completed within 30, 45, and 60 days of the Oct. 30 order (EO 14110). Keeping up with the early deadlines, researchers told FedScoop, could be important for actions down the line.
Most of the nine initial deadlines for agencies, according to a Bipartisan Policy Center timeline of the order, focused on actions involving federal government AI use and talent. Sabine Neschke, a senior policy analyst for BPC’s business and technology team, said “that’s a specific push to show that the government has enough AI talent in order to actually implement the rest of the executive order.”
Fulfilling the early deadlines adequately could also have a “domino effect” on later deadlines that were dependent on those actions, said David Evan Harris, a chancellor’s public scholar at UC Berkeley who researches AI. The deadlines aren’t “just picked randomly, they’re picked because they build upon one another,” Harris said.
AI talent surge
Among those early actions, the federal Chief AI Officers Council met Dec. 12 for its initial meeting where it “focused on the form and functions of the Council, including how to consistently and successfully implement EO 14110, OMB’s draft policy on AI governance, innovation, and risk management, and the National AI Talent Surge,” an OMB spokesperson said in an emailed statement.
That meeting, which was required to be held in the first 90 days after issuing the order, included representatives from all Chief Financial Officer Act agencies “as well as a range of representatives from small agencies,” the spokesperson said.
Eventually, agency chief AI officers — a position outlined by the order — will represent agencies on that council. While having a CAIO isn’t required until after OMB finalizes its corresponding guidance on the executive order, many agencies have already indicated they’ve designated their official.
Efforts to plan a talent surge and convene a task force on AI talent, which were set to be completed in the first 45 days, are also on track. An administration official told FedScoop in an email that “the AI and Tech Talent Task Force is very much underway and has been meeting regularly.”
An OMB spokesperson also said the “White House has completed planning for the AI Talent Surge, including identifying priority mission areas to which we plan to surge AI talent, types of talent needed for AI EO implementation and regulatory activities, and accelerated hiring pathways for this talent.”
OPM, which is also part of the talent surge efforts, indicated it was on track for its deliverables. Those include its recent authorization of direct-hire authority and excepted service appointments to support the AI order. The order required OPM, within 60 days, to conduct an evidence-based review of the need for “hiring and workplace flexibility,” including the need for direct-hire authority for AI positions.
An OPM spokesperson told FedScoop in an email that the new direct-hire authority gives agencies the ability to “perform work directly associated with implementing” the executive order and “allows the federal government to remain competitive with the private sector in recruiting AI talent.”
DOL, meanwhile, issued a request for information required under the order to help the agency identify AI, STEM-related, and other occupations “for which there is an insufficient number of ready, willing, able, and qualified United States workers.”
Funding and research
Other early actions focused on boosting government funding for AI and improving access to resources for AI research.
The board that oversees the GSA’s Technology Modernization Fund, for example, was required to consider prioritizing funding AI projects in the government within the first 30 days of the order. A GSA spokesperson said the body is exploring that idea and plans to share guidelines for AI proposals early this year.
“All AI proposals would require senior executive support and must include user testing, a risk mitigation plan, and clear metrics to evaluate success,” the GSA spokesperson said in an email. The spokesperson also encouraged agencies to reach out to the TMF’s program management office “as soon as they have ideas.”
NSF, for its part, indicated that it received proposals, as required under the order, from government counterparts identifying agency resources that could be used for the National AI Research Resource (NAIRR) pilot program, which is often described by the agency and others as “a shared national research infrastructure” for AI. The agency also plans to make that information public.
“We are working closely with a wide range of federal partners who submitted proposals for how their agencies can contribute to the pilot per the direction in the executive order,” an NSF spokesperson said in an email. “We expect to make the full breadth of those contributions public upon the launch of the pilot in January.”
While most agencies indicated they were on track, the Department of Transportation didn’t respond to FedScoop requests for comment on its 30-day deadline. That action required Secretary Pete Buttigieg to direct the department’s “Nontraditional and Emerging Transportation Technology (NETT) Council to assess the need for information, technical assistance, and guidance regarding the use of AI in transportation.”
What’s next?
The next deadlines, which are coming up at the end of January, are more diverse in terms of focus, including actions related to AI safety and security; promoting innovation and competition; advancing equity and civil rights; and protections, according to BPC’s timeline.
Those include launching the NAIRR pilot, issuing proposed regulations addressing foreign malicious cyber actor use of U.S. Infrastructure as a Service (IaaS) products, and assessing AI cybersecurity risks in critical infrastructure sectors.
Going forward, BPC’s Neschke said she’ll be watching for the National Institute of Standards and Technology’s work to develop a companion resource to its AI Risk Management Framework focused on generative AI, as well as information about how AI is impacting the workforce. She also said she’ll be watching Congress to see what actions it takes related to the order.
“While the executive order does set off a lot of great momentum, it only holds so much weight just because it can be overturned,” Neschke said. “And we’re really looking at what Congress will do next — whether they will put out legislation to provide more durability on these initiatives.”
Rebecca Heilweil and Caroline Nihill contributed to this article.