Your Staff's AI Notetaker Is Probably Listening to Things It Shouldn't
I discovered last Tuesday that my colleagues have been inviting robots to our meetings. Not the helpful kind that might fetch coffee or file expenses. The other kind. The kind that sits there, silently recording everything, then sends a transcript to servers in California.
The robot in question was called Fireflies. Or possibly Otter. Or one of the seventeen other AI notetakers that have apparently colonised British business meetings while I wasn't paying attention.
What I didn't realise—and I suspect many business owners share my blissful ignorance—is that these tools don't just record the person who installed them. They record everyone. Every participant. Every aside about the client being difficult. Every muttered observation about Dave from accounts. Every accidentally shared screen showing something you'd rather not have documented in perpetuity on a server farm in Oregon.
Why your staff keep inviting robots to meetings
The thing is, your staff aren't being malicious. They're being efficient. Someone read an article about productivity (possibly on LinkedIn, which should have been the first warning sign) and thought: wouldn't it be marvellous if I never had to take notes again?
And it would be marvellous. In the same way that it would be marvellous if we could all fly, or if hangovers didn't exist, or if the M25 occasionally moved at a reasonable pace.
But we live in a world with the ICO. And the UK GDPR. And the rather important principle that you need consent before recording someone's voice and shipping it overseas for processing.
Here's what actually happens in most small businesses. Sarah in marketing installs Fireflies because she saw a TikTok about it. Fireflies joins the next client call as "Fireflies Notetaker". Nobody questions this because everyone assumes someone else approved it. The client's confidential restructuring plans are now on servers governed by US data protection law. Nobody tells IT because IT is also Dave from accounts, and he's busy. This continues for six months until someone notices or complains.
I've spoken to sysadmins who describe this as "shadow IT". I prefer "accidental corporate espionage", but I accept that's less catchy for the LinkedIn posts.
The bit where I get serious about UK law
I'm going to do something I rarely do, which is be serious for a moment. Feel free to skip ahead if you prefer the jokes, but someone needs to say this.
Under UK GDPR, recording someone's voice is processing personal data. Full stop. Not "sometimes" or "it depends". Their voice can even count as biometric data. The words they say are personal data. The transcript that gets generated is personal data. This means you need a lawful basis to record them. You need to tell them you're recording—and actually get acknowledgment, not just hope they noticed the bot joining. You need to know where that data is going. You need to have a data processing agreement with the provider. And you need to consider international transfers if the data leaves the UK.
Most AI notetaker services are based in the US. Most process data in the US. Some offer EU hosting. Almost none offer UK-specific hosting. This matters because the UK has its own adequacy decisions and data transfer rules post-Brexit, and "we just assumed it was fine" is not a defence the ICO finds compelling.
The potential fine for getting this wrong? Up to £17.5 million or 4% of annual global turnover, whichever is higher. For a small business, that's not a fine. That's an extinction event.
What's probably running wild in your business right now
Based on conversations with people who audit these things for a living (a job I would not wish on anyone), here's what's commonly found roaming British businesses.
High risk: Otter.ai, Fireflies.ai, Fathom, tl;dv, Grain. All US-based, with limited admin controls.
Medium risk: Microsoft Copilot, Google Meet transcription, Zoom AI Companion. At least these live in your existing tenant.
Lower risk: your own approved, contracted solution with a proper DPA, or nothing at all, which has the added benefit of forcing people to actually pay attention in meetings.
I'm not saying the high-risk tools are bad. Some of them are genuinely excellent at what they do. I'm saying that letting them proliferate without any oversight is the corporate equivalent of leaving your front door open and being surprised when strangers wander in.
Five things you can actually do about this
So what do you actually do about this?
1. Find out what's already there
Before you can fix the problem, you need to know its size.
- Check browser extensions across company devices.
- Review OAuth connections to Microsoft 365 and Google Workspace.
- Look at calendar integrations. Notetaker bots often request calendar access.
- Search email for "joined as notetaker" or similar.
- Check the Zoom, Teams, and Meet admin panels for third-party app approvals.
You'll probably find more than you expected. Try not to panic. Panic is unproductive, and besides, you're British.
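If your business lives in Microsoft 365, the OAuth part of that audit can be part-scripted. The sketch below is a minimal example, not an official audit tool: it assumes you can obtain a Microsoft Graph access token with the Application.Read.All permission (how you get one is between you and your IT person), and the keyword list is purely illustrative. It pulls every app that has been granted access to your tenant and flags names that look like notetakers.

```python
# Minimal sketch: list the apps (service principals) consented in a Microsoft 365
# tenant via Microsoft Graph and flag anything that looks like an AI notetaker.
# Assumes you already have a Graph access token with Application.Read.All.
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
TOKEN = "<your-access-token>"  # assumption: obtained separately by your admin
SUSPECTS = ["otter", "fireflies", "fathom", "tl;dv", "tldv", "grain", "notetaker"]

def list_service_principals(token):
    """Page through every service principal registered in the tenant."""
    url = f"{GRAPH}/servicePrincipals?$select=displayName,appId,publisherName"
    headers = {"Authorization": f"Bearer {token}"}
    while url:
        resp = requests.get(url, headers=headers, timeout=30)
        resp.raise_for_status()
        data = resp.json()
        yield from data.get("value", [])
        url = data.get("@odata.nextLink")  # Graph paginates; follow the next link

for sp in list_service_principals(TOKEN):
    name = (sp.get("displayName") or "").lower()
    if any(word in name for word in SUSPECTS):
        print(f'Possible notetaker: {sp["displayName"]} (appId {sp["appId"]})')
```

Google Workspace admins can do the rough equivalent through the Admin console's third-party app access controls. Either way, a human still needs to read the list; the script just shortens it.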
2. Make a decision
You have three options. Option A: ban everything. Simple, defensible, and will make you unpopular with people who've grown attached to their transcripts. Probably necessary if you handle particularly sensitive data. Option B: approve specific tools. More work upfront, but more sustainable. Pick one or two tools, do proper due diligence, get a DPA in place, configure them properly, and make everything else forbidden. Option C: pretend this isn't happening. I don't recommend this, but I acknowledge it's the most popular choice.
3. Write a policy
It doesn't need to be long. Here's one you can steal—I won't tell anyone. "Approved tools for meeting recording and transcription: [list them]. All other AI recording, transcription, or notetaking tools are not approved for business use. Before using any approved tool: inform all meeting participants, obtain acknowledgment, do not record external parties without explicit consent, do not use for HR or legal discussions without additional approval. Want to try a new tool? Contact IT first." Fourteen lines. You can fit that on a single page and still have room for a corporate logo that nobody looks at.
4. Tell people
A policy that lives in a SharePoint folder nobody opens is not a policy. It's a liability with extra steps. Send an email. Mention it in the next all-hands. Put it in the onboarding pack. Make it clear this isn't about being difficult—it's about not accidentally sending client data to foreign servers without anyone's knowledge or consent. Most people, when they understand the actual risk, will comply. The ones who won't are the ones you need to watch anyway.
5. Check your built-in options
Before your staff go hunting for third-party tools, make sure they know what you already have. Microsoft 365 Business and E3 plans have Teams transcription and Stream, with Copilot available as a paid add-on. Google Workspace has Meet transcription and recording. Zoom has AI Companion and native recording. These aren't perfect, but they're in your control, in your tenant, and covered by your existing agreements. That's a significant improvement over "Sarah found something on Product Hunt".
The uncomfortable truth
The uncomfortable truth is that these tools are genuinely useful. I've used transcripts myself. They're particularly good at capturing the bit of the meeting where someone says something important and everyone else is thinking about lunch.
But "useful" and "appropriate for uncontrolled deployment across your organisation" are different things. My car is useful. I still wouldn't let the intern drive it to client meetings without checking they had a licence first.
If you do nothing else, send an email this week. Something like: "If you're using any AI tools that join meetings, record calls, or transcribe conversations, please let IT know. We need to make sure we're handling data properly. No one's in trouble—we just need to know what's out there."
You'll be amazed what surfaces. And horrified. But mostly amazed.
The notetaker revolution is here. It's just that nobody remembered to check whether we wanted to be revolutionised.