Axel Hoehnke
AI Governance Assessment

Find
Shadow AI

Your organisation is already using AI tools that no one approved. The question is not whether—it is how many, where, and what data they are touching.

// shadow_ai_scan.log ◉ LIVE
ChatGPT (OpenAI API): CRITICAL
Perplexity AI: HIGH
Notion AI (OAuth): HIGH
GitHub Copilot: MEDIUM
Grammarly (Extension): MEDIUM
Google Gemini (Personal): HIGH
AI grammar tool (ext.): LOW
7 Tools Found · 4 High / Critical · 0 Approved
01

Visibility Failure

Shadow AI is not a compliance failure. It is a visibility failure. Leadership believes AI usage is limited. Reality is different—by a wide margin.

02

Data Exposure

Employees paste confidential documents, source code, and customer datasets into external AI tools daily—often without realising the implications.

03

Regulatory Gap

ISO 42001 requires an AI system inventory. NIS2 mandates supply chain oversight. Unmanaged AI usage directly undermines both obligations.

04

OAuth Infiltration

AI tools authorised via OAuth silently access email, calendars, and cloud files. These integrations are invisible without identity log analysis.

05

Developer Channels

Engineering teams embed AI SDKs and APIs into production systems without security review, putting AI in the loop of live processes with no governance or oversight.

06

Audit Exposure

Auditors are beginning to ask for AI inventories. Organisations without one face findings. Organisations with unknown shadow AI face much worse.

Ten-Phase
Discovery Process

A structured, non-disruptive assessment combining network telemetry, identity logs, endpoint audit, developer monitoring, and employee interviews. No agents installed. No productivity interrupted.

PH 01

Define Scope

Establish a working definition of shadow AI for your organisation and document which AI platforms are currently approved. Everything else becomes a candidate for investigation.

Scoping · Baseline
PH 02

Network & DNS Telemetry

Analyse DNS requests and HTTPS traffic for connections to AI provider endpoints. Identify unexpected outbound flows from endpoints, microservices, and CI/CD pipelines.

CASB / SSE · Firewall Logs · DNS
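As an illustration of this phase, the sketch below flags DNS queries to a handful of known AI provider endpoints in a plain-text log export. The domain list and the assumed log format are illustrative only; in an engagement both are mapped to your actual resolver, CASB, and firewall exports.

```python
# Minimal sketch: flag DNS queries to known AI endpoints in an exported log.
# AI_DOMAINS is a starting point, not a complete catalogue.

AI_DOMAINS = {
    "api.openai.com",
    "api.anthropic.com",
    "generativelanguage.googleapis.com",
    "api.perplexity.ai",
}

def flag_ai_queries(log_lines):
    """Return (timestamp, client_ip, domain) for queries hitting AI endpoints."""
    hits = []
    for line in log_lines:
        # Assumed export format: "<timestamp> <client-ip> <queried-domain>"
        parts = line.split()
        if len(parts) != 3:
            continue
        ts, client, domain = parts
        if any(domain == d or domain.endswith("." + d) for d in AI_DOMAINS):
            hits.append((ts, client, domain))
    return hits

sample = [
    "2025-01-14T09:12:01 10.0.4.17 api.openai.com",
    "2025-01-14T09:12:05 10.0.4.22 intranet.example.local",
    "2025-01-14T09:13:40 10.0.7.3 api.anthropic.com",
]
```

Matching on exact domain or subdomain (rather than substring) avoids false positives such as `not-openai.com.attacker.example`.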
PH 03

SaaS & OAuth Discovery

Review identity provider authorisation logs (Okta, Entra ID) for third-party AI app permissions, unusual data scopes, and automated data export to AI services.

Okta · Entra ID · OAuth Audit
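A simplified version of the triage logic: given an exported list of OAuth grants, surface AI-looking apps that hold broad data scopes. The field names, scope names, and keyword list are assumptions to be mapped onto the actual export schema of your identity provider.

```python
# Sketch: triage an exported OAuth grant list for AI apps with broad scopes.
# BROAD_SCOPES and AI_KEYWORDS are illustrative assumptions.

BROAD_SCOPES = {"Mail.Read", "Files.Read.All", "Calendars.Read"}
AI_KEYWORDS = ("ai", "gpt", "copilot", "notion")

def triage_grants(grants):
    """grants: list of dicts with 'app' (name) and 'scopes' (list) keys."""
    flagged = []
    for g in grants:
        name = g["app"].lower()
        if any(k in name for k in AI_KEYWORDS):
            broad = BROAD_SCOPES & set(g["scopes"])
            if broad:
                flagged.append((g["app"], sorted(broad)))
    return flagged

sample = [
    {"app": "Notion AI", "scopes": ["Files.Read.All", "offline_access"]},
    {"app": "Expense Tool", "scopes": ["Mail.Read"]},
]
```

Keyword matching on app names is deliberately coarse: it over-flags, and a human reviewer then confirms or dismisses each hit.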
PH 04

Endpoint & Browser Extension Audit

Enumerate installed browser extensions, local AI assistants, and desktop copilots. Identify extensions that transmit corporate data to external AI APIs.

EDR · Extension Inventory
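The flagging logic for this phase can be sketched as follows: given extension inventory entries (name plus requested host permissions, as collected by an EDR or inventory tool), flag those that request access to AI API hosts. The host patterns are illustrative assumptions.

```python
# Sketch: flag browser extensions whose manifests request AI API hosts.
# Collecting each extension's manifest is left to the EDR / inventory tool;
# AI_HOST_PATTERNS is an illustrative starting list.

AI_HOST_PATTERNS = ("openai.com", "anthropic.com", "grammarly.com")

def flag_extensions(inventory):
    """inventory: list of (extension_name, host_permissions) pairs."""
    flagged = []
    for name, hosts in inventory:
        if any(p in h for h in hosts for p in AI_HOST_PATTERNS):
            flagged.append(name)
    return flagged

sample = [
    ("Grammar Helper", ["*://*.grammarly.com/*"]),
    ("Dark Theme", ["*://*.example.com/*"]),
]
```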
PH 05

Developer Ecosystem Monitoring

Review repository audit logs, dependency manifests, and cloud service logs for AI SDK additions, API keys issued to AI providers, and LLM calls in production code.

GitHub · SCA · CI/CD
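For the dependency-manifest part of this phase, a minimal sketch: scan a Python `requirements.txt` for known AI SDK packages. The package list is a starting point, not exhaustive, and equivalent checks apply to `package.json`, `go.mod`, and other manifests.

```python
import re

# Sketch: scan a requirements file for AI SDK dependencies.
# AI_SDKS is an illustrative, non-exhaustive list.
AI_SDKS = {"openai", "anthropic", "google-generativeai", "cohere"}

def find_ai_sdks(requirements_text):
    found = []
    for line in requirements_text.splitlines():
        line = line.split("#")[0].strip()  # drop comments
        if not line:
            continue
        # Package name ends at the first version-specifier character.
        name = re.split(r"[=<>!~\[;]", line, 1)[0].strip().lower()
        if name in AI_SDKS:
            found.append(name)
    return found

sample = "requests==2.32.0\nopenai>=1.30  # LLM client\nnumpy\n"
```

The same idea extends to repository audit logs: a new dependency line mentioning an AI SDK is a discovery event worth reviewing.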
PH 06

DLP Signal Analysis

Review Data Loss Prevention alerts for uploads of confidential files, prompts containing proprietary data, and unusual copy-paste activity to AI tool domains.

DLP · Data Classification
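The core filter for this phase, sketched against an assumed DLP alert export: keep only alerts where classified-confidential content went to an AI tool domain. Field names, classification labels, and the destination list are assumptions to be mapped onto your DLP vendor's schema.

```python
# Sketch: filter exported DLP alerts for confidential uploads to AI domains.
# Field names, labels, and AI_DESTINATIONS are illustrative assumptions.

AI_DESTINATIONS = ("chat.openai.com", "gemini.google.com", "perplexity.ai")

def confidential_ai_uploads(alerts):
    return [
        a for a in alerts
        if a["classification"] == "Confidential"
        and any(a["destination"].endswith(d) for d in AI_DESTINATIONS)
    ]

sample = [
    {"destination": "chat.openai.com", "classification": "Confidential",
     "file": "q3_forecast.xlsx"},
    {"destination": "drive.example.com", "classification": "Confidential",
     "file": "hr_notes.docx"},
]
```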
PH 07

Workflow Interviews

Short structured interviews with department representatives to surface unofficial AI workflows, automation scripts, and AI integrations invisible to technical monitoring.

Qualitative · Cross-department
PH 08

Risk Classification

Each discovered AI tool is classified by data sensitivity, system role (assistive vs. decision-making), provider risk, and operational impact. Low / Medium / High / Critical.

Risk Scoring · ISO 42001
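One way the four dimensions could combine into a single tier is sketched below. The 1-to-4 scale, the weighting, and the thresholds are illustrative assumptions; the actual scoring model is calibrated per engagement.

```python
# Sketch: combine four risk dimensions into one tier. The scale (1 = negligible,
# 4 = severe) and the combination rule are illustrative assumptions.

TIERS = ["Low", "Medium", "High", "Critical"]

def classify(data_sensitivity, system_role, provider_risk, operational_impact):
    """Each dimension scored 1 to 4; returns Low / Medium / High / Critical."""
    scores = (data_sensitivity, system_role, provider_risk, operational_impact)
    worst = max(scores)
    avg = sum(scores) / 4
    # A single severe dimension sets a floor; the average refines within it.
    score = max(worst - 1, round(avg))
    return TIERS[min(score, 4) - 1]
```

The floor rule reflects a common governance stance: a tool touching highly sensitive data should not be rated Low merely because the other dimensions are benign.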
PH 09

Third-Party AI Provider Review

Assess vendor data retention policies, training data usage, geographic data processing, and contractual controls for the most significant AI providers identified.

Vendor Risk · Data Residency
PH 10

Governance Integration

Transition from discovery to action: AI allow-list, prompt data policies, enterprise AI gateway recommendations, and an AI awareness training brief for staff.

Policy · Allow-List · Training

Assessment
Deliverables

Every engagement produces a structured, audit-ready set of documents. Evidence-based findings: no more, and no less, than what you need to act.

Shadow AI System Register

Complete inventory of discovered AI tools including source, data exposure level, authorisation status, and risk classification.

Risk Classification Report

Each tool scored across four dimensions: data sensitivity, system role, provider risk, and operational impact. Prioritised action list included.

Gap Analysis vs. ISO 42001 / NIS2

Mapping of identified shadow AI usage against ISO/IEC 42001 AI management system requirements and NIS2 supply chain obligations.

Governance Roadmap

Prioritised recommendations: AI allow-list, prompt data policy, enterprise gateway options, and an awareness training brief for employees.

Executive Summary

One-page board-ready summary of findings, risk exposure, and top three immediate actions. Suitable for presentation to leadership or supervisory boards.

30-Day Follow-Up Session

A structured review session four weeks after delivery to assess progress, answer questions from internal stakeholders, and refine the governance approach.

Governance,
Not Prohibition

Shadow AI is rarely malicious. In most cases it signals productivity demand exceeding governance frameworks. Organisations that treat it purely as a security problem struggle. Those that treat it as a governance and workflow transformation challenge succeed.
— Assessment methodology principle, Axel Hoehnke Consulting

The deeper question is not simply where shadow AI exists—but why employees feel the need to bypass official tools in the first place. Discovery reveals this. Governance addresses it.

Right-Sized
For Your Context

This assessment is calibrated for organisations under active regulatory pressure—not for organisations seeking to build an enterprise security operations centre.

CISOs & Security Leaders

  • Need an AI inventory before the next ISO 42001 audit
  • Suspect shadow AI but lack the telemetry to prove it
  • Board is asking about AI risk posture
  • NIS2 supply chain review due within 12 months

GRC & Compliance Teams

  • ISO 42001 implementation underway, AI inventory missing
  • DPO requesting AI data flow documentation for GDPR
  • Preparing for NIS2 essential entity notification
  • Need defensible evidence, not anecdotal estimates

SME Leadership

  • No dedicated CISO but facing CRA or NIS2 obligations
  • Employees using consumer AI tools with company data
  • Customer or partner due diligence requiring AI governance
  • Prefer practical guidance over theoretical frameworks

Mapped to the
Frameworks That Matter

Findings are structured to feed directly into your existing compliance obligations. One discovery exercise, multiple framework benefits.

ISO/IEC 42001
Satisfies AI system identification and inventory requirements of the AI management system standard.
NIS2 Directive
Addresses supply chain security obligations and third-party AI service risk under Article 21.
EU Cyber Resilience Act
Supports software dependency visibility including AI components embedded in products with digital elements.
ISO/IEC 27001
Feeds shadow AI findings into asset management (A.5.9), supplier relationships (A.5.19), and access control controls.
GDPR / BDSG
Identifies unrecorded processing activities and third-country data transfers triggered by external AI tool usage.

Know What
Is Already Running

The assessment is structured, time-bounded, and produces audit-ready output. A 30-minute scoping call is sufficient to determine whether your organisation is the right fit.

Book a Scoping Call: meet@axelhoehnke.com