
Freddie Mac Updates Selling Guide to Require AI Governance Policies

The government-sponsored enterprise will require lenders that use artificial intelligence to implement oversight frameworks in loan origination and servicing.


Freddie Mac announced revisions to its Single-Family Seller/Servicer Guide that will require mortgage sellers and servicers to establish formal governance policies for the use of artificial intelligence and machine-learning technologies. The changes, outlined in Bulletin 2025-16, take effect March 3, 2026.

Under the updated guidance, lenders that use AI or machine learning in connection with mortgages sold to or serviced on behalf of Freddie Mac must demonstrate that they have comprehensive processes in place governing the deployment and oversight of these systems. The policies must address risk management, transparency, and compliance with applicable laws.

Freddie Mac’s bulletin specifies that seller/servicers must have clear procedures for “mapping, measuring, and managing” the risks associated with AI tools, and that these procedures must be transparent and effectively implemented across their organizations.

The revisions apply to AI and machine-learning systems used in loan origination, underwriting, servicing, and other areas that affect loans sold to Freddie Mac. Firms affected by the mandate will need to adjust internal practices and documentation to meet the new requirements before March 2026.

Freddie Mac did not provide specific enforcement details but indicated that compliance with the AI governance framework will be part of the eligibility criteria for selling loans to the government-sponsored enterprise.

The move by Freddie Mac follows broader industry and regulatory attention on the use of AI in the housing finance sector. In early December, the U.S. Government Accountability Office urged the Federal Housing Finance Agency, which oversees Freddie Mac and Fannie Mae, to issue more explicit guidance on fair-lending requirements related to AI use, citing potential risks tied to automated systems in homebuying and mortgage lending.

The FHFA’s 2025 Scorecard previously directed both enterprises to strengthen AI risk management frameworks and to assess the benefits and risks of expanded use of the technology in the mortgage market.

Industry analysts have noted that lenders increasingly rely on AI and machine-learning tools for tasks such as credit assessment, fraud detection, and automated underwriting, prompting regulators and market participants to focus on governance and accountability practices. The Freddie Mac update aligns with these developments by formalizing expectations for the responsible deployment of AI among its seller and servicer partners.

Ahead of the effective date, affected lenders and servicers are reviewing internal systems and updating policies and documentation to meet Freddie Mac’s new requirements.
