
Third-Party AI Risk: Assessing Your Vendors' AI Usage

Michael Rodriguez
Compliance Director
August 5, 2025 · 11 min read

Your third-party risk program probably doesn't address AI. It should.

The Hidden Risk

Your vendors are adopting AI rapidly. Customer data you share with them may already be processed by AI systems you have never assessed, which creates compliance and security risk outside your existing controls.

Assessment Questions

Add these to your vendor assessments:

  • Does the vendor use AI to process our data?
  • Which AI services/providers are used?
  • Is our data used for AI training?
  • What controls protect our data in AI processing?

Contract Provisions

Update contracts to include:

  • AI usage disclosure requirements
  • Restrictions on AI training with your data
  • Security requirements for AI systems
  • Incident notification for AI-related events

Ongoing Monitoring

  • Periodic reassessment of AI usage
  • Monitoring for new AI implementations
  • Review of AI-related incidents

Third-party AI risk is real. Start assessing it now.


Michael oversees compliance strategy at ZeroShare, helping organizations navigate the complex regulatory landscape around AI. He previously led compliance programs at Fortune 500 financial services firms and holds CISA, CISM, and CRISC certifications.

Regulatory Compliance · Risk Management · Financial Services


This article reflects research and analysis by the ZeroShare editorial team. Statistics and regulatory information are sourced from publicly available reports and should be verified for your specific use case. For details about our content and editorial practices, see our Terms of Service.
