Free Resource

The AI Compliance Checklist

12 plain-English questions. Answer honestly. Every "no" is a gap — and every gap is either a fine, a breach claim, or an ICO inquiry waiting to happen. Most founders get 4 or fewer "yes" answers first time through.

How to use this: Answer each question honestly. A "no" and an "I don't know" count the same — both mean exposure.
  01.

    Do you have a documented list of every AI tool your business uses that handles personal data?

    If you can't list them, you can't assess them. The ICO's first question in any investigation is: tell us what data you process and where. "We use a few AI tools" is not an answer.

  02.

    For each of those AI tools, have you identified a lawful basis under UK GDPR?

    "We're using it for our business" is not a lawful basis. Neither is "we assumed it was fine." Each process that touches personal data requires a documented, defensible basis — consent, contract, legitimate interest, or one of the others. The basis shapes everything else you do.

  03.

    Do your AI vendor contracts include a Data Processing Agreement (DPA)?

    If the vendor processes personal data on your behalf, a DPA is a legal requirement under UK GDPR — not optional. Most standard SaaS agreements don't include one. Clicking "agree" to a terms of service is not a DPA.

  04.

    Have you read your AI vendors' terms to check whether they use your inputs to train their models?

    Many AI platforms claim broad rights to use your data for model improvement. If your inputs contain client data, personal data, or commercially sensitive information, you may have already breached your own client agreements — and your clients don't know.

  05.

    Have you updated your privacy notice to disclose that you use AI in your data processing?

    Data subjects have the right to know how their data is used. If AI is in the processing chain and your privacy notice doesn't say so, you're in breach — regardless of whether anyone complains about it.

  06.

    Do you have a Record of Processing Activities (ROPA) that includes your AI tools?

    Formally required for organisations with 250 or more employees — and for smaller ones whose processing is regular rather than occasional — but strongly recommended for all. More importantly: you cannot demonstrate compliance without one. A ROPA is the document the ICO asks for first. Most businesses don't have one that reflects reality.

  07.

    Do your client contracts reference how you use AI with their data?

    If you pass client data through AI tools, your contracts need to say so — ideally with explicit permission. Silence is not consent. Clients who discover you've been running their data through third-party AI tools without telling them have a legitimate complaint, whether or not it causes harm.

  08.

    Have you assessed whether any of your AI processes require a Data Protection Impact Assessment (DPIA)?

    Automated decision-making, large-scale processing, use of novel technology, and processing of sensitive data categories can all trigger the DPIA requirement. Getting this wrong isn't a technicality — it's one of the ICO's stated enforcement priorities for AI.

  09.

    Are you ICO registered, and does your registration accurately describe your current processing activities?

    ICO registration is a legal requirement for most UK businesses that process personal data. But registration alone isn't enough — the description must match what you actually do. If you've added AI tools since you last updated your registration, you're technically non-compliant.

  10.

    Do the people in your business who use AI tools understand what they can and cannot put through them?

    The ICO can pursue individuals as well as organisations. If a team member puts personal data into an AI tool that isn't cleared for it, that's a breach — and your business is liable. "I didn't know" is not a defence for the organisation, and it's not always a defence for the individual either.

  11.

    If you have EU customers, partners, or data flows, have you assessed your exposure under the EU AI Act?

    The EU AI Act is in force and applies to any AI system affecting EU individuals — regardless of where the business is based. Post-Brexit, many UK founders assume this doesn't apply to them. In most cases with EU-facing operations, it does. The prohibited practices became enforceable in February 2025.

  12.

    If a client submitted a data subject access request today, could you identify everything your AI tools have processed about them?

    You have one calendar month to respond to a DSAR. Most businesses with AI tools in the processing chain have no way to answer this question with confidence. This is not a theoretical risk — DSAR volumes have been rising consistently since 2021, and AI-related DSARs are the fastest-growing category.

What your score means: 10–12 yes: genuinely strong position — book a session to verify there are no hidden gaps. 6–9 yes: real exposure in specific areas. 0–5 yes: significant risk across the board. Whatever your score, the next step is the same — understand exactly what your exposure looks like before the ICO does.

Get your answers reviewed — free

Tell us your score and the areas where you answered "no" or "I don't know." Naz will review and send back a plain-English summary of your actual exposure and what addressing it involves. No pitch. No obligation.

No marketing. Just a direct response from Naz within 24 hours.

Seen enough to know you have a problem?

Book a free 30-minute clarity session. We'll go through your actual exposure and tell you exactly what addressing it involves.