ISO 42001 - Control A.10.1 - Third-Party and Customer Relationships
In today's article by Kimova AI, we explore Annex A Control A.10 / B.10 – Third-Party and Customer Relationships, a critical control that ensures AI-related risks are managed effectively not only within the organization but also across external parties such as vendors, partners, and customers.
From an ISMS auditor’s perspective, third-party relationships are one of the most significant risk vectors in AI systems. Organizations often rely on external data providers, cloud platforms, pre-trained models, or AI vendors—making it essential to extend AI governance beyond internal boundaries.
Objective of Control A.10 / B.10
The primary objective of this control is to ensure that:
- AI-related risks arising from third parties are identified, assessed, and managed
- Responsibilities between the organization and external parties are clearly defined
- Customers are provided with appropriate information and assurances regarding AI systems
This control reinforces the idea that AI accountability cannot be outsourced, even when systems or components are externally sourced.
Scope of Third-Party and Customer Relationships
This control applies to:
- AI vendors and solution providers
- Cloud and infrastructure providers
- Data suppliers and aggregators
- External developers and consultants
- Customers who use or are impacted by AI systems
Each of these relationships introduces unique risks that must be governed systematically.
Why This Control Is Important
Weak governance over third-party and customer relationships can lead to:
- use of non-compliant or biased datasets
- lack of transparency in AI model behavior
- security vulnerabilities in third-party systems
- contractual and legal disputes
- inability to demonstrate compliance during audits
As auditors, we frequently see organizations with strong internal controls but weak third-party oversight—creating significant compliance gaps.
Key Requirements Under Annex A Control A.10 / B.10
To demonstrate conformity with ISO/IEC 42001, organizations should ensure:
- Due Diligence Is Performed
Third parties involved in AI systems are assessed for security, quality, ethical practices, and compliance.
- Roles and Responsibilities Are Defined
Contracts clearly specify responsibilities for data handling, model performance, risk management, and compliance.
- AI Requirements Are Included in Agreements
Contracts and SLAs include AI-specific clauses such as transparency, explainability, and incident handling.
- Monitoring and Review Mechanisms Exist
Third-party performance and compliance are regularly monitored and reviewed.
- Customer Communication Is Ensured
Customers are informed about the use of AI systems, including risks, limitations, and responsibilities where applicable.
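As an illustration only (ISO/IEC 42001 does not prescribe any tooling, and the clause names below are hypothetical), the requirement that agreements include AI-specific clauses can be sketched as a simple contract checklist:

```python
from dataclasses import dataclass, field

# Hypothetical AI-specific clauses an agreement might be expected to cover,
# based on the requirements listed above; not an official ISO/IEC 42001 list.
REQUIRED_AI_CLAUSES = {
    "data_handling",
    "model_performance",
    "transparency",
    "explainability",
    "incident_handling",
}

@dataclass
class SupplierAgreement:
    supplier: str
    clauses: set = field(default_factory=set)

    def missing_clauses(self) -> set:
        """Return AI-specific clauses the agreement does not yet cover."""
        return REQUIRED_AI_CLAUSES - self.clauses

# Example: a draft agreement that still lacks three of the expected clauses.
agreement = SupplierAgreement(
    supplier="ExampleModelVendor",
    clauses={"data_handling", "transparency"},
)
gaps = agreement.missing_clauses()
```

A checklist like this can feed gap reports during contract review, though in practice the clause set would come from the organization's own legal and AIMS requirements.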
Audit Perspective: What Evidence Is Expected
During ISO/IEC 42001 audits, auditors typically review:
- third-party risk assessments and due diligence records
- contracts and agreements with AI-specific clauses
- supplier evaluation and monitoring reports
- customer communication materials
- incident and escalation procedures involving third parties
At Kimova AI, we often see that organizations with structured third-party governance frameworks achieve higher audit success rates and stronger AI risk management.
Implementation Best Practices
Organizations can strengthen compliance with this control by:
- maintaining a third-party inventory for AI systems
- integrating AI risk criteria into vendor selection processes
- including AI governance clauses in contracts and SLAs
- conducting periodic third-party audits and assessments
- aligning third-party controls with ISMS and AIMS frameworks
These practices ensure that AI governance extends across the entire ecosystem.
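To make the first two practices concrete, here is a minimal sketch of a third-party inventory with AI risk criteria. The record fields and risk ratings are illustrative assumptions, not mandated by ISO/IEC 42001:

```python
from dataclasses import dataclass

# Hypothetical inventory record for a third party involved in AI systems;
# field names are illustrative only.
@dataclass
class ThirdParty:
    name: str
    category: str        # e.g. "cloud provider", "data supplier"
    ai_risk_rating: str  # outcome of due diligence: "low", "medium", "high"
    last_review: str     # ISO date of the most recent compliance review

inventory = [
    ThirdParty("ExampleCloudCo", "cloud provider", "medium", "2024-01-15"),
    ThirdParty("ExampleDataCo", "data supplier", "high", "2023-11-02"),
]

# Flag high-risk third parties for the next periodic assessment.
needs_review = [tp.name for tp in inventory if tp.ai_risk_rating == "high"]
```

In a real AIMS this inventory would typically live in a GRC or vendor-management tool, but even a structured register like this supports the evidence auditors expect.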
Conclusion
Annex A Control A.10 / B.10 emphasizes a critical principle of modern AI governance: AI risks do not stop at organizational boundaries; they extend across the entire value chain.
By effectively managing third-party and customer relationships, organizations can reduce risk, enhance transparency, and demonstrate compliance with ISO/IEC 42001.
At Kimova AI, we believe that strong external governance is essential for building trustworthy and scalable AI systems.
In tomorrow's article by Kimova AI, we'll explore Annex A.10.2 – Allocating Responsibilities, examining how organizations can clearly define and assign roles and responsibilities for AI systems to ensure accountability, effective governance, and compliance across all stakeholders.