Supplier Security & Privacy Assurance (SSPA) Program Guide: Understanding the new Section K requirements

The longer read
The new version 10 of the Data Protection Requirements (DPR) introduces Section K, with requirements focused on the concept of ‘AI Systems’.
What is an ‘AI System’?
Microsoft defines this as:
An engineered system that applies an optimized model so that the system can,
- for a given set of human-defined objectives,
- make predictions[1], recommendations, or decisions influencing the environments it interacts with.
Such a system may operate with varying levels of automation.
AI High Risk
Microsoft also identifies a higher-risk category of AI System use, which it describes as ‘Sensitive Use’. Sensitive Use automatically triggers a requirement for the supplier to achieve ISO/IEC 42001 certification (see below).
This occurs when the reasonably foreseeable use or misuse of an AI System could affect an individual in the following ways:
- Consequential impact on legal position or life opportunities
- Risk of physical or psychological injury
- Threat to human rights[2]
Microsoft gives the following non-exhaustive examples (see Figure 1).
Figure 1: Supplier Security & Privacy Assurance (SSPA) v 10
Are we subject to Section K?
The SSPA Program Guide advises that a supplier should select this profile option if the supplier will:
- ‘provide services to Microsoft involving AI Systems including using tools, systems, or platforms with AI Technology to train and build intelligent systems[3] to create entirely new content such as images, sounds, videos, insights, analysis, and/or text’.
The power of a single comma is highlighted here.
It could be read as:
- ‘provide services to Microsoft involving AI Systems [,] including using tools, systems, or platforms with AI Technology to train and build intelligent systems[4] to create entirely new content such as images, sounds, videos, insights, analysis, and/or text’.
This reading says you need to select the AI Supplier profile if you provide services to Microsoft that involve the [mere] use of AI Systems, which are defined as ‘an engineered system that applies an optimized model so that the system can, for a given set of human-defined objectives, make predictions*, recommendations, or decisions influencing the environments it interacts with. Such a system may operate with varying levels of automation. *Predictions can refer to various kinds of data analysis or production (including translating text, creating synthetic images, or diagnosing a previous power failure)’.
In 2025, this is likely to capture many suppliers who create content and images, write code, or utilise the inbuilt power of AI in most commercial off-the-shelf (COTS) software.
It could also be read as:
- ‘provide services to Microsoft involving AI Systems including tools, systems, or platforms with AI Technology to train and build intelligent systems[5] to create entirely new content such as images, sounds, videos, insights, analysis, and/or text’.
This reading says you need to select the AI Supplier profile ONLY if you ‘train and build intelligent systems’ that can be used to create entirely new content such as images, sounds, videos, insights, analysis, and/or text. This is unlikely to capture the majority of Microsoft suppliers, as most will not be building and training intelligent systems.
Consider a typical scenario.
Q. If a supplier uses a platform like Adobe Firefly to create new marketing images, or Claude.ai to create new long-form written content as part of a Partner engagement marketing program, will this trigger the Section K requirement?
Potential answers:
A1. No. These are examples of suppliers who merely use AI systems to fulfil contractual obligations. They are not suppliers who ‘train and build intelligent systems’ on behalf of Microsoft.
A2. Yes. These COTS AI tools meet the Microsoft definition of an ‘AI System’. They output ‘predictions’, the Microsoft definition of which includes synthetic images or text based on human-defined objectives.
Are there any clues as to which is the correct answer? Yes. The Program Guide states that ‘All suppliers providing AI Systems will be required to provide Independent Assurance options’. It therefore seems reasonable that A1 is the correct answer in this scenario, as the supplier is using, not providing, the AI System.
We are subject to Section K: now what?
If you are required to conform with Section K, the SSPA AI Systems approval will include documentation on the Processing of Personal and/or Microsoft Confidential Data, impacts on people, organisations and society, and acceptance of appropriate supplier certifications.
The supplier will need the required signed agreements and/or Microsoft internal reviews completed before purchasing can proceed.
Note that all suppliers requiring independent assessment against Section K must use a Microsoft preferred assessor.
What about the ISO/IEC 42001 alternative?
ISO/IEC 42001 certification can be offered to validate compliance against Section K of the DPR and is required for any AI Sensitive Use cases (see above).
ISO/IEC 42001 is an international management system standard (MSS) that specifies requirements for establishing, implementing, maintaining, and continually improving an Artificial Intelligence Management System (AIMS) within organisations. It is designed for entities providing or utilising AI-based products or services, ensuring responsible development and use of AI systems. Implementation involves putting in place policies and procedures for the sound governance of an organisation in relation to AI, using the Plan-Do-Check-Act methodology.
This supplier approval will only be granted once the Independent Assessment has been accepted by SSPA.
iCompli Ltd.
iCompli assessors are all Certified Information Privacy Professionals (CIPP/E) with the International Association of Privacy Professionals (IAPP).
We have many years of experience in successfully completing independent SSPA assessments and liaising with Microsoft to facilitate your recognition as an approved Supplier.