The government is struggling to track its AI. And that’s a problem.
Efforts to inventory artificial intelligence uses within major federal agencies have so far been inconsistent, creating a patchwork understanding of the government’s use of the budding technology.
Regulating AI is a cornerstone of the current administration’s agenda, but the push to figure out where the federal government was using the technology began before President Joe Biden took office. In the final weeks of the Trump administration, the White House published an executive order calling on federal agencies to report all current and planned uses of AI and publish those results. The goal, according to Executive Order 13960, was to document how the U.S. government is using AI and establish principles for the technology.
More than two years later, the process of actually developing these inventories hasn’t gone smoothly. Unlike other government AI initiatives such as the Blueprint for an AI Bill of Rights and the National Institute of Standards and Technology’s AI Risk Management Framework, the 2020 executive order carries the force of law and has terms that require compliance, argues Christie Lawrence, an affiliate at Stanford’s RegLab.
The way agencies are complying with the executive order points to potential lessons for federal agency implementation of future executive orders and statutes related to AI regulation, she told FedScoop.
In the absence of a U.S. national AI strategy, said Lawrence, “compliance with Executive Order 13960 is really important because it kind of functions — along with some other documents — as the sort of American government strategy towards AI.”
The issues raised by the executive order highlight some of the broader hurdles that could face the big push to regulate AI, including defining the technology and identifying where the technology is actually being deployed. Notably, the Biden administration expects to issue a new AI-focused executive order soon.
Researchers at Stanford Law School, including Lawrence, who examined implementation challenges for America’s AI strategy, and the Electronic Privacy Information Center (EPIC) have both previously raised concerns about widespread lagging agency compliance with the Trump administration’s executive order. The White House did not respond to a request for comment.
FedScoop reviewed how the more than 20 large Chief Financial Officers Act agencies covered by the executive order have inventoried their AI technology. The findings showed a lack of standardization across the government. While some agencies offer detailed inventories, others provide little information — and some appear to omit use cases disclosed publicly elsewhere.
There also isn’t a public deadline for agencies to update their inventories for the current fiscal year, making it difficult to track progress.
Among the findings: Several agencies — including the Transportation Security Administration and the Small Business Administration — didn’t include apparent use cases publicly disclosed elsewhere. Meanwhile, the Department of Transportation said it disclosed a ChatGPT use in “error,” which FedScoop previously reported.
“We need to know the full universe of AI use cases that are effective today. And if we don’t have that, we’re not getting the full picture and we can’t really rest easy knowing that,” argues John Davisson, an attorney at EPIC. “The federal government’s having to play catch up with its own agencies by, now, asking them to disclose what AI systems they’re using. But things being as they are, step one is: Disclose what you’re using right now.”
The December 2020 executive order required agencies — except the Defense Department and those in the intelligence community — to inventory their current and planned AI uses, ensure uses were consistent with the order, share inventories with each other, and make non-classified and non-sensitive uses public on an annual basis.
The order also directed the Federal Chief Information Officers Council (CIO Council) to create guidance for the inventories. The council initially set a March 22, 2022, deadline for agencies to share their first inventories with one another on the MAX Federal Community, a federal information-sharing website. Agencies began publishing inventories online in June 2022, according to the National Artificial Intelligence Initiative’s webpage for the order.
The public guidance from the CIO Council for 2023, however, doesn’t include a date by which inventories should be submitted to the MAX system. In response to detailed questions about a deadline, expectations for public inventories, and compliance for the current year, the council sent a brief summary of its responsibilities under the order.
Key Documents
- Executive Order 13960
- AI.gov Published Inventories List
- 2021 CIO Council Guidance
- 2023 CIO Council Guidance
Other requirements established by the order to streamline the government-wide AI strategy appear to be running behind, too.
The Office of Personnel Management was supposed to create an inventory of rotational programs focused on increasing the number of employees with AI experience at federal agencies, and to issue a report on boosting AI expertise — both within a year of the 2020 executive order. In response to a request for comment, the agency directed FedScoop to a memo on AI competencies meant to comply with the AI in Government Act, and said that once a data call it is conducting with the Chief Human Capital Officers Council is complete, it can start compiling a report.
Perhaps most notable is that several agencies seemed to exclude prominent examples of AI use cases — including those that do or could impact the public — from their inventories.
The inventory created for the Transportation Security Administration, for example, includes a single example of AI — a COVID-19 risk assessment algorithm called Airport Hotspot Throughput — but does not mention the agency’s facial recognition program, perhaps one of its most controversial deployments of machine learning-based technology. The Department of Homeland Security did not respond to a FedScoop request for comment.
The Department of Housing and Urban Development, meanwhile, maintains that it has no AI use cases — despite a report submitted to the Administrative Conference of the United States in February 2020 that identified a prototype chatbot at the agency. HUD also publicly identified the use of AI in a December 2020 report on its progress in implementing the 21st Century Integrated Digital Experience Act. In that report, HUD said the Federal Housing Administration would “expand communication channel offerings to include live chat, SMS/MMS, AI chatbot, and Intelligent IVR.” HUD didn’t respond to FedScoop requests for comment.
It is also unclear how agencies should distinguish between “planned” use cases, which agencies are supposed to include, and AI projects that are in the process of research and development, which are not supposed to be included. For example, several AI uses discussed in a July 2022 presentation for EPA’s homeland security office are not included in the EPA’s inventory because, a spokesperson explained, the “activities described in the presentation are still in development.”
The Small Business Administration’s inventory, which is dated May 2023, states that after investigating its Federal Information Security Modernization Act (FISMA) systems, the agency did not discover any use cases. Yet the inventory does not include an AI use case for vetting loan applications, which was discussed in an SBA announcement on the agency’s website and in an Inc. magazine article, both published in May.
“During phase one, our focus was on how SBA Program Offices were using AI (including ML + RPA) to support their own internal operational efficiencies,” an SBA spokesperson told FedScoop.
SBA’s response reflects a larger trend: agencies used different methodologies to develop their inventories. Of the agencies that responded to FedScoop’s request for comment, some appear to have identified their use cases by putting out a call within their agencies and asking various departments to share the ways they’re using AI.
The Department of Labor found that most of its AI use cases have been managed by its AI Center of Excellence, and the agency found other examples by reaching out to business units. Other agencies, including the EPA, the General Services Administration and Education Department, conducted “data calls” to collect information about AI uses.
Information that officials included about each disclosed use case across the agencies varies widely. Some agencies list specific contact information for different AI use cases, like the Department of Commerce, or include information like when the use began and whether it was contracted work, as USAID did. Others simply list the name of each use case, a summary, and the entity responsible for it — that approach was taken by both the Department of State and Social Security Administration.
Relatedly, there doesn’t appear to be a standardized procedure for removing mentions. The Department of Transportation deleted a reference to the Federal Aviation Administration’s Air Traffic Office using ChatGPT for code-writing assistance after FedScoop inquired about the technology, saying the example was included in “error.”
Agencies have also published their inventories on different timelines. Though the first inventories were expected to be shared with other agencies in March 2022 — per the initial CIO Council guidance — some agencies appear to have completed theirs later. For example, NASA’s fiscal year 2022 inventory is dated October 2022, the Department of Education said it completed its initial inventory in February 2023, and OPM appears to have only a 2023 inventory.
At the same time, while a deadline for the current year isn’t clear, some agencies, such as the General Services Administration and Social Security Administration, said they already completed updates to their inventories for 2023.
Several agencies, including HUD, the Justice Department, and the Department of the Interior, did not respond to FedScoop inquiries about updating their inventories and their overall process. And while NASA has public inventories for 2022 and 2023, the agency’s inventory is not included on an AI.gov list of inventories-to-date.
Finally, it’s difficult to tell whether the executive order actually helped agencies sort through whether their AI use cases lined up with established principles — which was a critical goal of the executive order.
Many agencies did not respond to a request for comment, but the Department of Labor, USAID, and USDA all said none of their use cases were inconsistent with the order. A State Department spokesperson said it was “employing a rigorous review process and making necessary adjustments or retirements as needed.” But it didn’t elaborate on what uses might need that adjustment or retirement.
Ultimately, the patchwork approach to Executive Order 13960 is a reminder that senior leadership within both the White House and the federal agencies needs the right staff, resources, and authority to implement AI-related legal requirements, argued Lawrence, from Stanford.
For Davisson, the attorney from EPIC, it’s critical for agencies to have clarity about their obligations.
“Follow-through is really important. That applies both to the White House and to the agencies that are trying to execute on an executive order,” he added. “You can’t just put it on paper and assume that the job is done.”
Editor’s note, 8/4/23 at 3:00 p.m.: This piece was updated to note NASA’s 2023 AI use case inventory, which a NASA employee referenced in response to a request for comment for a subsequent FedScoop piece on a related topic.