Urgent action needed to secure UK AI research

17 April 2025

Urgent action is needed to secure the UK’s AI research ecosystem against hostile threats, according to a report.

Co-authored by academics at Cardiff University’s Security, Crime and Intelligence Innovation Institute (SCIII) and led by the Alan Turing Institute’s Centre for Emerging Technology and Security (CETaS), the report focuses on growing fears that UK AI research is a particularly high-priority target for state threat actors seeking technological advantage.

The report argues that awareness of security risks is not consistent across the academic sector and there is a lack of incentives for researchers to follow existing government guidance on research security.

Culture change is therefore urgently needed, including addressing the tension between research security and the pressure academics face to publish their research.

Thirteen recommendations for Government and Academia from 'Securing the UK’s AI Research Ecosystem'

Currently there is fragmentation between measures enforced by the Government and the clarity with which available support is communicated to academics. Understanding how the Government and academics can become more aligned is fundamental to ensuring the resilience of UK-led AI research.

Annie Benzie, Security, Crime and Intelligence Innovation Institute

The report also highlights difficulties for academics both in assessing the risks of their research – for example, future misuse – and in carrying out time-consuming due diligence on international research partners, without a clear view of current threats.

The report offers 13 recommendations to help government and academia to build the resilience of this growing research ecosystem.

The UK’s prioritisation of furthering AI research brings numerous benefits, but also presents risks to the UK’s AI research ecosystem. Difficulties in predicting the use of dual-use technologies and a lack of clarity in the existing guidance currently cause conflicts between securing such research and preserving academic freedom. Both the government and academia are urged to address these issues and strike a balance between the two by driving and supporting a culture shift.

Mr Sam Williams, Research Student

Recommendations for UK Government include a need for regular guidance from the Department for Science, Innovation and Technology (DSIT), with support from the National Protective Security Authority (NPSA), on the international institutions deemed high-risk for funding agreements and collaborations, and more dedicated funding to grow the Research Collaboration Advice Team to support academic due diligence.

The report also urges UKRI to provide grant funding opportunities for research security activities.

Among recommendations for academia, the report’s authors believe that all academic institutions should be required to deliver NPSA accredited research security training to new staff and postgraduate research students as a prerequisite for grant funding.