As of today, any mortgage seller or servicer working with Freddie Mac must comply with new AI governance requirements. This isn't guidance or a best practice. It's a binding requirement with a hard deadline, and it changes how AI-enabled vendors and tools are overseen across the entire loan lifecycle, from origination to servicing.
Freddie Mac added these requirements to its Seller/Servicer Guide in March 2025, then formalized the effective date in Bulletins 2025-16 and 2025-17 last December. Any mortgage with an application received date on or after March 3, 2026 must comply.
What the Framework Requires
The new requirements center on three principles: transparency, accountability, and ethical stewardship.
In practice, sellers/servicers must demonstrate:
Risk management: How you identify, measure, and manage AI risks across your organization — not as a one-time assessment, but as an ongoing program.
Transparency: Clear understanding of how your AI systems work, what data they use, and how decisions are made. No black boxes.
Accountability: Documented oversight showing who's responsible for AI governance, how decisions get escalated, and where the buck stops.
Ethical standards: Processes ensuring AI is used responsibly, fairly, and in compliance with all applicable regulations.
Audit trail: The ability to prove all of the above when Freddie Mac reviews you.
The Vendor Problem
Here's where most organizations are exposed: their vendors.
AI is already embedded across the mortgage lifecycle in ways that aren't always obvious. Your loan origination system (LOS) may use machine learning for workflow optimization. Document automation tools use AI to extract and classify data. Income verification, fraud detection, chatbots, appraisal tools, pricing engines: all increasingly depend on predictive models and automated decisioning.
And critically: Freddie Mac hasn't defined "artificial intelligence" or "machine learning" in the Guide.
That ambiguity creates real governance risk. If your vendor calls something a "feature" rather than a "model," you're still on the hook for figuring out whether it qualifies as AI. And under Freddie Mac's framework, responsibility doesn't transfer to vendors. It stays with you.
The question isn't whether your vendors are using AI. It's whether you can demonstrate governance over it. Can you explain what data is being used? How decisions are made? How errors or bias are monitored? Who's accountable when outcomes go sideways?
For most sellers/servicers, the honest answer is some version of "no" or "we assume so."
The Real Cost of Getting This Wrong
The consequences are concrete:
Financial risk: If you can't demonstrate compliant AI governance, Freddie Mac can decline to purchase the loan or require it to be returned. When a loan is reviewed and AI-enabled tools played a role, Freddie Mac will look for evidence that governance was in place at the time of delivery. Missing, inconsistent, or incomplete documentation isn't a technical problem — it's a delivery requirement failure.
Operational risk: Teams end up reconstructing vendor approvals after the fact. Information is scattered across emails and spreadsheets. What should be a straightforward explanation turns into uncertainty — and uncertainty doesn't hold up under review.
Competitive disadvantage: Sellers/servicers with centralized governance can adapt to new requirements without slowing down. Others lose time and momentum, not because they lack expertise, but because their processes don't scale.
What Good Looks Like
Organizations positioned for this deadline share common traits:
Vendor visibility: They know exactly which vendors are using AI, what those systems do, and how they're governed.
Standardized processes: Every vendor goes through the same due diligence. Every assessment asks the same questions. Every gap gets flagged the same way.
Real-time status: They can tell you at any moment which vendors have been assessed, which are in progress, which are overdue.
Centralized documentation: When Freddie Mac asks for proof of AI governance, they pull a report — not a scavenger hunt through email.
Proactive monitoring: They're not waiting for annual reviews. They get alerts when certifications expire, when vendors report changes, when new risks emerge.
The Bigger Picture
Freddie Mac's requirements aren't really about AI. They're about whether your compliance and third-party risk programs are keeping pace with change.
If this update came as a surprise, that's a signal. It suggests your compliance management program isn't consistently surfacing regulatory changes that affect how you operate. Early visibility matters: it creates time to assess impact, assign ownership, and translate requirements into policies before deadlines drive decisions.
March 3rd isn't the finish line. It's a checkpoint. And regulators aren't waiting for comprehensive federal AI legislation — they're building AI governance into existing oversight frameworks. Mortgage lending is first because it already has heavy documentation requirements. Healthcare, financial services, and government contracting are all moving in the same direction.
For founders building B2B tools that touch regulated industries, this is the leading edge of a compliance wave.
Trish @ StackDrift
Found this useful? Forward it to a founder who's too busy to read TOS (so... every founder).
Got a vendor you want us to track? Reply to this email.