VENDORiQ: Microsoft Introduces MS Security Copilot

Microsoft has introduced Microsoft Security Copilot, an AI-powered assistant that combines OpenAI’s large language model with Microsoft’s security-specific model. The goal is to help security professionals detect and respond to threats more effectively. Is this just more ‘GPT’ hyperbole, or something significant?

The Latest

Microsoft released Microsoft Security Copilot – an AI assistant that combines OpenAI’s large language model with Microsoft’s security-specific model. Microsoft has significant global threat intelligence data and capabilities that it can bring to bear on customer-specific information. The challenge is translating that scale of information into a form that security professionals can digest and respond to. Microsoft announced that its platform will “…provide a safe and compliant experience that facilitates the use of data with more advanced generative AI models to achieve improved security outcome”.

Why It’s Important

The large volumes of security data generated in enterprise environments can overwhelm security professionals, making it difficult to detect potential threats in a timely manner. Microsoft’s Security Copilot, an integration of OpenAI’s model with Microsoft’s existing services, will up the ante for security tools capable of quickly processing and analysing large data sets, allowing teams to identify and, importantly, communicate potential threats more efficiently.

In summary, Copilot is an assistive technology that adds a conversational front end on top of very large amounts of security signal data.

One of the benefits of Security Copilot is that, unlike some of Microsoft’s other Copilot announcements (e.g. for Dynamics, Office 365, Power Platform), the data being fed into this model is already well-defined and protected. The risks of oversharing data (internal data leakage) are therefore small.

IBRS expects to see an explosion of existing security vendors adding GPT-like AI into their own products. We have long stated that most AI will be consumed through existing SaaS products. The true test of which vendors will dominate the market for AI-based big-data security analysis lies in their ability to produce AI results with fewer errors and false positives. AI models are not immune to errors, and Microsoft Security Copilot is no exception. Microsoft has implemented a closed-loop learning system with a built-in feedback feature: users can flag errors and concerns, and the AI model digests this feedback for future refinement.

IBRS expects such technologies to continue to improve their ability to detect and recommend responses to emerging threats, but this does not eliminate the need for human security professionals to maintain end-to-end security in their organisations. It is a productivity aid for skilled professionals, and it also has the potential to augment the capabilities of less-skilled security staff.

Based on Microsoft’s briefings, Copilot was deployed at a very fast pace: IBRS expected such solutions to be another six months away, so Microsoft beat expectations significantly. However, the rapid release carries the risk that the integrations and models are not as well battle-tested and hardened as they could be.

There is an old adage: never buy version 1 software. In this case, Security Copilot can be considered version 1, albeit a very capable version 1. However, IBRS expects this solution will improve rapidly.

Who’s Impacted?

  • CISOs
  • Security teams

What’s Next?

  • Enterprises should understand that adoption of AI products such as Copilot is not a replacement for vigilant security professionals armed with SIEM solutions. At this early stage of deployment, such solutions will be aids for investigating threats and providing recommendations. They do have the potential to make these activities faster and more thorough, enabling organisations to address the growing demands on security teams.

Related IBRS Advisory

1. Use Security Principles To Guide Security Strategy

2. Practical Security Guidance For Organisations
