More

    British and US cyber agencies warn of new vulnerabilities in AI systems

    British and U.S. cybersecurity authorities have jointly released a 20-page guide on Monday. This document offers advice on building AI systems resilient against threats like attacks by malicious actors and state-sponsored hackers. The guidelines, agreed upon by 18 countries, including G7 members but excluding China and Russia, address AI vulnerabilities. These vulnerabilities fall into categories such as performance, unauthorized user actions, and sensitive model information extraction. The guide provides practical steps for designing, developing, deploying, and operating AI systems with minimal cybersecurity risk.

    Regarded as a crucial move, the release of these guidelines aims to ensure the secure development and deployment of AI capabilities. It specifically targets cybersecurity vulnerabilities that arise from integrating AI tools with other systems, not just their misuse by bad actors.

    In August, the NCSC issued a specific warning about “prompt injection attacks”. These attacks are a major security flaw affecting large language models (LLMs), a type of machine learning used by ChatGPT. The new guidance aims to secure systems by tackling the cybersecurity risks unique to AI technologies and providing safeguards for the generated model outputs.

    Teams from NCSC and CISA’s sister agencies in 17 other countries developed the document. They also received input from organizations like Microsoft, Google, and OpenAI. Agencies from 17 countries endorsed the initiative, highlighting the UK’s leadership in AI safety.

    During a launch event at NCSC’s headquarters, Jonathan Berry, the Minister for AI and Intellectual Property, stressed that this guidance marks the beginning of the journey to secure AI. He did not outline immediate legislative plans. However, he mentioned the development of a voluntary code of practice for AI security. The ultimate goal is to establish an international standard.

    Latest articles

    Related articles