National Audit Office (NAO)

Use of artificial intelligence in government

2024
SCALE
  • - 74: number of AI use cases already deployed, as reported by government bodies responding to our survey
  • - £101mn: the Incubator for Artificial Intelligence’s estimate of its five-year funding requirement to 2028-29 (before inflation)
COMPLIANCE FOCUS
  • - 2018 AI Sector Deal to promote the use of AI (artificial intelligence) in the UK, including within the public sector
  • - 2021 National AI Strategy
  • - Technology Code of Practice
PERFORMANCE ASPECT
  • - strategy and governance
  • - AI usage and opportunities
  • - plans for supporting the testing, piloting and scaling of AI
  • - progress in addressing barriers

One aim of the National AI Strategy is for the public sector to become an exemplar of safe and ethical deployment of AI. The activities to deliver this aim sit across many bodies and have not been underpinned by supporting governance arrangements, clear accountabilities, an implementation plan or performance metrics to track progress. The National AI Strategy – AI Action Plan published in July 2022 summarised activity, but did not set out outcome measures or detailed implementation plans to support the aim for the public sector to become an exemplar. Initially a cross-government AI Strategy Delivery Group was established by the Office for Artificial Intelligence to oversee delivery, but this was disbanded in March 2022.

The Cabinet Office’s Central Digital and Data Office (CDDO), the Incubator for Artificial Intelligence (i.AI) (also within the Cabinet Office), and the Department for Science, Innovation & Technology (DSIT) all have roles in AI adoption in the public sector, and there is therefore potential for overlap. For example, CDDO is responsible for setting the strategic direction for government on digital, data and technology, while i.AI has a role in delivering shared data and AI infrastructure. DSIT is responsible for developing governance frameworks, guidance, and standards for AI and data in the wider economy, and is leading on public sector innovation.

While the strategy is intended to be public sector-wide, these governance structures do not include public sector representation beyond central government, such as schools, police and the wider health sector. The proposed governance of the strategy is also largely separate from the cross-government governance structure established to oversee wider AI policy delivery led by DSIT, potentially losing the benefits of a coordinated approach and increasing risks to delivery.

In 2023, CDDO carried out indicative analysis to identify potential productivity gains across the civil service and wider public sector. It identified that almost a third of tasks in the civil service (those that it defined as routine) could be automated. It did not examine the feasibility of delivering these productivity gains, or make an assessment of cost.

Appropriate digital and data foundations need to be in place to support the transformational benefits of AI. Large quantities of good-quality data are important to train, test and deploy AI models. Our survey found that limited access to good-quality data was a barrier to implementing AI and central government support was important to address this. The government recognises more action is needed to address legacy issues and to improve data access and quality to avoid limiting the adoption of AI in the public sector.

Government standards and guidance to support responsible and safe adoption of AI are still under development. The Algorithmic Transparency Recording Standard (ATRS), developed to improve transparency and provide information about the algorithmic tools used in government, is not widely used. In February 2024 DSIT announced it would make the ATRS a mandatory requirement for all government departments. DSIT, which also leads the government’s strategy and engagement in global digital technical standards (including on AI), told us that there are opportunities for it to work more collaboratively across government to ensure that government standards for AI take global standards into consideration as these develop. Some government bodies we interviewed described finding it difficult to navigate the range of guidance available and being unclear on where to go for a definitive view of what they need to consider.

CDDO recognises that lack of skills is a major challenge to the successful adoption of AI, noting that there is currently limited capacity within the system to fully exploit and scale the opportunities presented by AI. Government bodies can address skills shortages through the use of contractors, agency workers, and temporary staff, with an estimated one-third of digital and data professionals in the civil service made up of these groups.

CDDO is responsible for oversight and assurance of digital and technology spend across government. As part of these controls, departments must comply with the Technology Code of Practice, which includes privacy, security and data protection requirements, as well as requirements to comply with ethics guidance in cases of automated decision-making. In 2024, CDDO expects to roll out a new process across government to improve how it identifies digital and technology spend that has a substantive or high-risk AI component, to ensure these cases are given appropriate scrutiny.

The items above were selected and named by the e-Government Subgroup of the EUROSAI IT Working Group on the basis of the publicly available reports of the authoring Supreme Audit Institutions (SAIs). In the same way, the Subgroup prepared the analytical assumptions and headings. All readers are encouraged to consult the original texts by the authoring SAIs (linked).