You’ve probably heard questions like these recently:
“Is ChatGPT GDPR compliant?”
“Can we upload supporter data to an AI tool?”
“What if someone’s already done it?”
If these questions are floating around your team — or worse, going unspoken — you’re not alone.
AI tools are already showing up inside charities. Sometimes officially, often quietly. And the real risk right now isn’t the tech. It’s the lack of clarity around how to use it safely.
This blog fixes that.
AI and GDPR: Yes, They Can Co-Exist
Let’s start here: using AI does not automatically breach GDPR.
But the moment you involve personal data — especially names, emails, donation history, or anything relating to a service user — you’ve crossed into compliance territory.
And if your team hasn’t been briefed or trained, there’s a real risk that someone’s already shared data they shouldn’t have.
So here’s what you need to know to stay safe, stay compliant, and start exploring AI with confidence.
GDPR and AI: Fast Facts
What Is GDPR?
UK GDPR protects how you collect, store, and use personal data. It applies to charities just as much as it applies to private companies — in fact, probably more, given the nature of the data many charities handle.
Key principles include:
- Transparency — tell people how their data is used
- Minimisation — only collect what’s needed
- Purpose limitation — don’t use data for something new without consent
- Security — protect against loss or unauthorised access
- Accountability — be able to prove you're following the rules
Where AI Fits
Using AI doesn’t exempt you from GDPR — and in some cases, it increases your exposure.
- If you feed personal data into a tool like ChatGPT or Gemini, you’re transferring that data to a third party, often outside the UK/EU.
- If the tool stores that data or uses it for model training, you’ve potentially breached GDPR.
But if you’re using AI to work with non-personal data (like drafting general content, internal planning docs, or rewriting a blog post), there’s no issue.
The One Line That Matters
If it can identify a person, it’s personal data.
That includes:
- Names
- Emails
- Postcodes (when tied to a name or donor record)
- Phone numbers
- Behavioural history (e.g. “supporter who gave £25 last year”)
- Case notes or service history
If in doubt, treat it as personal and keep it out of public AI tools.
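As a purely illustrative sketch (not legal advice, and these simplified patterns would miss names, case notes, and most indirect identifiers), a quick screen like this can flag some obvious identifiers before text reaches a public AI tool:

```python
import re

# Simplified, illustrative patterns only -- real personal-data detection needs
# far more than regexes, and a "no match" result never proves data is safe.
PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "uk_mobile": re.compile(r"(?:\+44\s?7|07)\d{3}\s?\d{6}"),
    "uk_postcode": re.compile(r"\b[A-Z]{1,2}\d[A-Z\d]?\s?\d[A-Z]{2}\b"),
}

def flag_possible_pii(text: str) -> list[str]:
    """Return the labels of every pattern that matches the text."""
    return [label for label, pattern in PII_PATTERNS.items()
            if pattern.search(text)]

prompt = "Draft a thank-you email to jane.doe@example.org, postcode SW1A 1AA"
print(flag_possible_pii(prompt))  # ['email', 'uk_postcode']
```

A check like this is best treated as a prompt for the human, not a gate: if anything is flagged, strip it out or switch to dummy data before continuing.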
1. Don’t paste personal or sensitive data into public AI tools
Never upload real donor lists, service user notes, or email addresses to tools like ChatGPT Free or Gemini. Treat prompt boxes like public forums.
2. Use GDPR-compliant tools where possible
Look for platforms with EU-based storage, logging controls, and clear Data Processing Agreements (DPAs). Best bets: Microsoft Copilot, ChatGPT Team, Claude for Teams.
3. Keep a human in the loop
AI is fast — not flawless. Never let it publish or send anything externally without human review. Especially when it touches public messaging or regulated services.
4. Add AI to your data policy
Your internal policies need to be updated to cover:
- What’s in scope
- What tools are approved
- What data types are off-limits
- Who to talk to before trying something new
Short on time? Use our plug-and-play AI & Data Policy Template for charities.
5. Train your team
The biggest risk isn’t the platform — it’s quiet, unregulated “DIY AI” use. Make it safe to ask questions. Give people clarity. Shut down the shadow tools before they become a breach.
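To make points 2 and 4 above concrete, here is a hypothetical sketch of how an approved-tools list and off-limits data categories might be encoded as a simple pre-flight check. The tool names come from the list above; the data categories and function are illustrative assumptions, not a recommended implementation:

```python
# Hypothetical policy data -- adapt the lists to your own approved tools
# and the data types your charity has ruled off-limits.
APPROVED_TOOLS = {"Microsoft Copilot", "ChatGPT Team", "Claude for Teams"}
OFF_LIMITS_DATA = {"donor records", "service user notes", "email addresses"}

def check_ai_request(tool: str, data_types: set[str]) -> tuple[bool, str]:
    """Very simple gate: is the tool approved, and is the data in scope?"""
    if tool not in APPROVED_TOOLS:
        return False, f"'{tool}' is not on the approved list -- ask first."
    blocked = data_types & OFF_LIMITS_DATA
    if blocked:
        return False, "Off-limits data involved: " + ", ".join(sorted(blocked))
    return True, "OK -- remember to keep a human in the loop."

ok, msg = check_ai_request("ChatGPT Free", {"blog draft"})
print(ok, msg)  # False 'ChatGPT Free' is not on the approved list -- ask first.
```

Even a lightweight check like this gives staff a clear answer to "can I use this tool for this data?" instead of leaving it to quiet, unregulated guesswork.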
Legal & Technical Deep Dive
What the Law Actually Says
This bit matters if you want to show due diligence — especially with trustees or legal teams:
- If your charity uses an AI tool that processes personal data, that tool is a data processor under Article 28 of UK GDPR.
- That means you need a formal Data Processing Agreement (DPA) in place with the vendor.
If the AI application involves:
- Special category data (e.g. health, ethnicity, vulnerable users)
- Automated decision-making (even partial)
- Profiling or behavioural analysis
Then you may need to complete a Data Protection Impact Assessment (DPIA).
Articles worth knowing:
- Article 5: Principles of processing
- Article 6: Lawful basis
- Article 25: Privacy by design
- Article 32: Security
- Article 35: DPIAs (when required)
- Article 28: Third-party processors (like AI tools)
Bottom line: if you wouldn’t hand your donor list to a third-party freelancer without a contract, don’t hand it to an AI model without safeguards.
AI Tools: Which Ones Are Safer?
Real-World Examples of GDPR-Safe AI Use
Final Words
You’re Closer Than You Think
Most charities already operate with high awareness of GDPR, safeguarding, and data ethics. If that’s you — you’re already 80% there.
What you need now is:
- A short AI policy
- A clear list of approved tools
- Training for your team
- A sense-check for any new AI project involving data
Don’t let uncertainty become paralysis.
Don’t assume “we can’t use AI” — assume you can, as long as you do it right.
AI Strategy Services
Our AI Strategy Pack is designed to help your business confidently navigate the AI landscape—identifying clear, actionable opportunities to integrate AI where it matters most. With 77% of businesses now exploring or investing in AI, those with a defined strategy are set to lead (McKinsey, 2023).
This service supports you in identifying impactful use cases, prioritising initiatives, and aligning AI adoption with your business goals. You'll walk away with a tailored AI roadmap, opportunity scorecard, and key recommendations across tools, workflows and governance—empowering your teams to take action.




