What to Ask Before Letting AI Touch Client Data

AI can save your brokerage hours a week. But if you hand client data to the wrong vendor, the downside is career-ending. Here's your due diligence checklist.


The Stakes Are Higher Than You Think

As a Canadian mortgage broker, you're sitting on some of the most sensitive data in consumer finance. Tax returns, bank statements, pay stubs, IDs, credit histories. When a vendor pitches an AI tool that processes this faster, the upside is obvious. What's less obvious is what happens to that data once it leaves your hands.

A data breach in mortgage broking isn't a PR headache. It's a compliance violation, a loss of client trust, and potentially the end of your licence. If you want a deeper dive into what AI compliance actually looks like in practice, read our broker's guide to AI that passes compliance. But first, here are the questions you should be asking.

The Checklist: 10 Questions Every Broker Should Ask

1. Where is my client data stored?

Get a specific answer. Which cloud provider? Which region? If they say "the cloud" and can't name an infrastructure provider or data centre location, walk away. You need to know whether data stays onshore or gets routed through overseas servers. Jurisdiction determines which privacy laws apply.

2. Is data encrypted at rest and in transit?

Both matter. Encryption in transit (TLS) protects data moving between systems. Encryption at rest protects it on their servers. Ask for the specific standard. AES-256 is the benchmark. If they can't name their encryption protocol, they either don't have one or don't understand why it matters. Neither is acceptable.

3. Is client data used to train AI models?

This is the big one. Many AI platforms feed user inputs back into training data by default. Your client's financial documents could be influencing outputs for other users or surfacing in unexpected ways. The answer you need is a firm no, backed by a contractual guarantee, not just a settings toggle.

4. Who has access to my data inside your organisation?

You want to hear "as few people as possible" with specifics. Role-based access controls, audit logs, and the principle of least privilege should be standard. If every engineer on their team can see your client files, that's a problem regardless of how good the product is.

5. What independent verification of your security do you have?

Ask vendors about their security practices, independent audits, and any certifications they hold, such as SOC 2 or ISO 27001. A vendor that has undergone third-party security assessments has demonstrated a commitment to protecting your data beyond what they claim on their website. Look for evidence of regular penetration testing, vulnerability scanning, and formal security policies. If a vendor has no independent verification of their security controls and no plan to pursue any, that tells you where security sits on their priority list.

6. What happens in the event of a data breach?

You want a documented incident response plan. How quickly will they notify you? What's their containment process? Who is responsible? Under PIPEDA, breaches that pose a real risk of significant harm carry mandatory reporting and notification obligations for you as the broker. You need a vendor whose response timeline supports your compliance requirements, not one that leaves you scrambling.

7. Is there human review before AI-generated outputs reach my clients?

The best AI tools for broking are human-in-the-loop: the AI drafts, extracts, or formats, and the broker reviews before anything goes out. Ask whether the platform requires your sign-off before output, or fires off communications autonomously. An unsupervised AI sending incorrect information to a client is a compliance event.

8. Can I delete my data, and will it actually be purged?

You should be able to request deletion at any time, and the vendor should confirm data is purged from all systems, including backups, within a defined timeframe. If data lingers indefinitely after you cancel, you've lost control of your clients' information.

9. How do you handle third-party subprocessors?

Your vendor might be airtight, but what about their vendors? If they use third parties to process or store data, you need to know who they are and what security standards they meet. The chain is only as strong as its weakest link.

10. Will you put all of this in writing?

Any vendor that answers these questions confidently should have no problem formalising them in a data processing agreement. If they hesitate when you ask for contractual commitments, the verbal assurances are worthless.

A good AI vendor won't be annoyed by these questions. They'll expect them. If your due diligence makes them uncomfortable, that's the only answer you need.

How We Handle This at LendFrame

We built LendFrame knowing that brokers cannot afford to get data security wrong. Client data is encrypted at rest and in transit. It is never used to train models. Access is role-based and logged. Human-in-the-loop is the default, so nothing reaches your clients without your sign-off. Every commitment is put in writing. You can review our full approach on the LendFrame security page.

These should be table stakes for any vendor in this space. If your current tools can't answer the ten questions above, it's worth asking why.

The Bottom Line

AI adoption in broking is accelerating and the productivity gains are real, especially with automation tools designed for mortgage workflows. But speed without security is reckless. Ten minutes with this checklist could save you from a breach, a compliance investigation, or a destroyed client relationship. Ask the hard questions. Only work with vendors who welcome them.

See Our Security Approach

LendFrame is built with privacy-first AI: no data training, no retention, full compliance with Canadian regulations.