
Fannie Mae Mandates AI Governance Framework For Mortgage Lenders

The new requirements direct lenders to inventory, monitor, and manage risks associated with AI and machine learning systems used across mortgage operations.


Fannie Mae sent formal governance requirements to its lender base, setting expectations for how AI and machine learning (ML) models are used across the mortgage lifecycle, including underwriting, fraud detection, customer service, and operational decision-making.

Lenders must document every AI system they use, including what it does, what data it relies on, and how it influences decisions. They must also establish governance policies that define roles and responsibilities for oversight, as well as procedures for ongoing monitoring, testing, and validation of model performance.

Further, lenders must identify and manage AI risks, including poor data quality, performance changes over time, and unexpected outcomes. The guidance also calls for controls to ensure compliance with applicable laws and regulations, including those related to fair lending and consumer protection.

Lenders must be able to explain how their AI systems reach decisions and maintain records of how models are built, tested, and updated.

Freddie Mac introduced similar expectations in prior guidance, including requirements that lenders implement governance frameworks for AI and ML systems used in mortgage-related activities.

Fannie Mae did not specify an implementation deadline but indicated that lenders are expected to incorporate the framework into their existing risk and compliance systems. Fannie Mae said it will assess adherence to the requirements through its ongoing oversight and review processes.

