
GLBA Financial Privacy Guide for Banks

The foundational US financial privacy law — with 2023 Safeguards Rule amendments that explicitly address AI and machine learning security requirements.

What GLBA Is and Why It Matters for AI

The Gramm-Leach-Bliley Act (GLBA) has been the foundational privacy and security law for US banks and financial institutions for over 25 years. Enacted in 1999, GLBA created the first comprehensive federal framework for protecting customer financial information and has served as a model for state and international privacy laws. The Federal Trade Commission (FTC) and federal banking regulators have continuously updated GLBA's implementing rules, most significantly through the 2023 amendments to the Safeguards Rule, which modernized security requirements to address evolving threats like ransomware, data breaches, and the growing use of artificial intelligence in financial services. Compliance with GLBA is non-negotiable for any institution within its scope.

GLBA comprises three main components: the Financial Privacy Rule, the Safeguards Rule, and the Pretexting Protection Rule. These rules work together to create a baseline privacy and security standard for financial institutions and their service providers. The Privacy Rule governs how financial institutions collect, use, and disclose nonpublic personal information (NPI). The Safeguards Rule requires financial institutions to implement comprehensive security programs to protect customer information. The Pretexting Protection Rule prohibits fraudulent attempts to obtain customer information by impersonation or deception. GLBA applies to banks, credit unions, savings and loan associations, investment companies, insurance companies, and other entities that engage in financial activities. Most critically for modern financial services, GLBA applies whether information is stored digitally or in paper form, and whether it is processed by humans or by automated systems including artificial intelligence.

The 2023 amendments to the Safeguards Rule represent a watershed moment for GLBA compliance in the age of AI. The FTC recognized that traditional information security practices were insufficient against modern threats and against the risks posed by increasingly complex machine learning systems. The amended rule introduced new mandatory requirements for financial institutions that did not exist in prior versions: a designated qualified individual responsible for the security program, documented risk assessments, mandatory encryption, multi-factor authentication, comprehensive access controls, and regular monitoring and testing of security measures. Institutions using AI and machine learning to process customer data must now ensure that those systems are explicitly evaluated for security risks and that vendors implementing AI systems are appropriately scrutinized and monitored.

The Three Rules in Detail

The Financial Privacy Rule requires all financial institutions to provide customers with clear privacy notices explaining what information the institution collects, how it is used, and with whom it may be shared. The rule restricts the use and disclosure of NPI. Most significantly, the rule grants customers the right to opt out of the sharing of NPI with non-affiliated third parties, with limited exceptions for service providers, joint marketing partners, and regulatory purposes. For banks, this means you cannot broadly share customer financial data with unaffiliated companies for marketing or commercial purposes without allowing customers to opt out. The Privacy Rule applies even to information shared within an affiliated group of companies, though affiliated sharing is treated somewhat more permissively than third-party sharing. For banks using customer data to train AI models, develop new products, or engage in analytics, the Privacy Rule creates obligations to notify customers and, in some cases, obtain opt-out rights.

The Safeguards Rule requires financial institutions to maintain a comprehensive information security program designed to protect the security, confidentiality, and integrity of customer information. The 2023 amendments made this much more prescriptive. Institutions must now designate a qualified individual responsible for overseeing the entire information security program. This person must have sufficient authority and independence to oversee all aspects of security, including vendor management, incident response, and access controls. Institutions must conduct regular, documented risk assessments to identify threats and vulnerabilities. These risk assessments must include evaluation of both internal systems and third-party service providers. The rule mandates specific technical controls including encryption of customer information in transit and, in many cases, at rest. Multi-factor authentication is required for any individual accessing customer information. Access controls must limit access to information based on need. Institutions must implement monitoring and testing procedures to ensure controls remain effective. Institutions must also have procedures for detecting, investigating, and responding to security events. Vendors and service providers handling customer information must be contractually obligated to comply with equivalent security standards.

The Pretexting Protection Rule makes it illegal to obtain customer information by impersonating a customer, a financial institution, or a government authority. This rule has become increasingly relevant as social engineering attacks have become more sophisticated. Financial institutions must implement policies and training to prevent employees from disclosing customer information to unauthorized persons, and must investigate any suspected pretext attempts.

What Qualifies as Nonpublic Personal Information (NPI)

GLBA defines nonpublic personal information as any information concerning an individual's finances that is not publicly available. This is a broad definition that includes account numbers, transaction history, credit scores, income information, employment details, identification numbers (like Social Security numbers), and contact information. NPI also includes information derived from financial transactions, such as risk assessments, credit ratings, or behavioral profiles created through analysis of customer activity. For AI and machine learning purposes, this means that training data derived from customer transaction histories qualifies as NPI, and any model outputs that create inferences about customer financial status, creditworthiness, or risk also fall within the scope of NPI. Information combined or linked with NPI such that the collection could reasonably be used to identify an individual also qualifies as NPI.
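To make the scoping concrete, the logic above can be sketched as a simple data-catalog check. This is a minimal illustration under stated assumptions, not a legal classification tool; the category names and the `is_npi` heuristic are invented for demonstration:

```python
# Illustrative sketch of NPI scoping in a data catalog.
# NPI_CATEGORIES and the is_npi() heuristic are assumptions for
# demonstration only, not a legal determination of GLBA coverage.

NPI_CATEGORIES = {
    "account_number", "transaction_history", "credit_score",
    "income", "ssn", "risk_score",  # risk_score: a model-derived inference
}

def is_npi(field_category: str, publicly_available: bool,
           linked_to_npi: bool) -> bool:
    """Treat a field as NPI if it falls in a covered category and is not
    lawfully public, or if it is linked with NPI such that the combined
    data could reasonably identify an individual."""
    if publicly_available and not linked_to_npi:
        return False
    return field_category in NPI_CATEGORIES or linked_to_npi
```

Note that a model-derived `risk_score` is treated the same as raw transaction data, reflecting the point that inferences drawn from customer activity are themselves NPI.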

Public information (information that is lawfully publicly available and has not been derived from customer information) is not covered by GLBA, but the practical scope of "public" is narrow. A financial institution cannot simply publish customer data and then claim it is no longer NPI. Additionally, even aggregate or de-identified information may still be regulated if it can be re-linked to individuals.

Customer Notification and Opt-Out Rights

Financial institutions must provide privacy notices to customers clearly explaining their privacy practices. These notices must be provided when the customer relationship begins and annually thereafter, although under a 2015 FAST Act amendment an institution is exempt from the annual notice if it has not changed its practices and does not share information in ways that trigger opt-out rights. The notice must explain the categories of information collected, the uses of that information, the categories of third parties with whom information may be shared, and the customer's right to opt out of sharing with non-affiliated third parties. The notice must be clear and conspicuous, not buried in fine print.

Customers have the right to opt out of the disclosure of NPI to non-affiliated third parties. This opt-out right is fundamental. If a customer elects to opt out, the financial institution cannot share that customer's information with unaffiliated entities except in limited circumstances: to service providers acting on behalf of the institution, for joint marketing arrangements, and for regulatory and law enforcement purposes. The opt-out mechanism must be clear and easy to use. A financial institution cannot condition the provision of services on a customer waiving the right to opt out, except in limited cases where the service itself requires the disclosure.
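A minimal sketch of how an opt-out check might gate disclosure before NPI leaves the institution. The recipient-type labels below are hypothetical stand-ins for the rule's exception categories, not terms from the statute:

```python
# Sketch: gating third-party disclosure on a customer's opt-out status.
# Recipient-type labels are illustrative assumptions.

PERMITTED_DESPITE_OPTOUT = {"service_provider", "joint_marketing", "regulatory"}

def share_allowed(customer_opted_out: bool, recipient_type: str) -> bool:
    """Decide whether NPI may be disclosed to a given recipient type."""
    if recipient_type == "affiliate":
        return True  # affiliate sharing is treated more permissively under GLBA
    if not customer_opted_out:
        return True  # no opt-out on file; sharing permitted subject to notice
    return recipient_type in PERMITTED_DESPITE_OPTOUT
```

In practice a check like this would sit in the data-export pipeline so that an opt-out recorded anywhere in the institution blocks all non-excepted disclosures.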

For banks using AI to develop products or generate insights, the Privacy Rule creates ambiguity in certain respects. If a bank develops a predictive model from customer transaction history and then uses that model to target marketing based on inferred creditworthiness or investment sophistication, the FTC has suggested that purely internal analytics do not trigger the opt-out requirement, but this guidance is evolving. The safest approach is to disclose internal AI uses clearly in privacy notices and, where appropriate, extend opt-out rights to them.

The FTC Safeguards Rule Update (2023)

The FTC's amended Safeguards Rule (finalized in 2021, with its core requirements taking effect in June 2023 and a breach notification provision added in October 2023) represents the most significant update to GLBA compliance in the modern era. The amendments address three major gaps in the prior rule: lack of clarity around accountability for security failures, insufficient specificity regarding required security measures, and inadequate requirements for managing third-party security risks. Under the updated rule, financial institutions must appoint a qualified individual (not a committee or department, but a named, accountable person) to oversee the security program. This individual must be a senior manager with direct access to executive leadership and the board of directors. The qualified individual must have authority over decisions related to information security investments, policies, and personnel.

The amended rule requires documented risk assessments at least annually, with more frequent assessments when circumstances change significantly (such as implementing a new AI system, acquiring a company, or experiencing a security incident). Risk assessments must identify threats and vulnerabilities, evaluate the likelihood and potential impact of security failures, and document the institution's response. Crucially, risk assessments must include evaluation of third-party service providers, particularly those with access to customer information or those implementing new technologies like AI or cloud computing.
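The cadence described above (annual, or sooner on significant change) can be sketched as a simple scheduling check. The trigger-event names are illustrative; the rule itself speaks of material changes in circumstances rather than a fixed list:

```python
from datetime import date, timedelta

# Illustrative trigger events; assumptions for demonstration, not the
# rule's literal text.
TRIGGER_EVENTS = {"new_ai_system", "acquisition", "security_incident"}

def reassessment_due(last_assessment: date, today: date,
                     recent_events: set[str]) -> bool:
    """Annual cadence, or sooner when a significant change has occurred."""
    if today - last_assessment > timedelta(days=365):
        return True
    return bool(recent_events & TRIGGER_EVENTS)
```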

Technical controls are now more explicitly mandated. The rule requires encryption of customer information both in transit and at rest, with compensating controls permitted only where the qualified individual determines encryption is infeasible. Multi-factor authentication is mandatory for anyone accessing customer information. Access controls must be implemented to limit access based on job function and the principle of least privilege. Session timeouts, password complexity requirements, and logging of access are all part of the expected control environment. Institutions must maintain audit logs and the ability to detect unauthorized access or unusual activity, and must either implement continuous monitoring or conduct annual penetration testing together with vulnerability assessments at least every six months to verify that controls remain effective.
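A rough sketch of how several of these expectations (MFA, least-privilege access, audit logging) might combine at a single enforcement point. The role names and resource labels are hypothetical:

```python
import logging

logging.basicConfig(level=logging.INFO)
audit = logging.getLogger("access-audit")

# Hypothetical role-to-resource mapping illustrating least privilege.
ROLE_PERMISSIONS = {
    "teller": {"balances"},
    "fraud_analyst": {"balances", "transactions"},
}

def access_customer_info(user_role: str, mfa_verified: bool,
                         resource: str) -> bool:
    """Single enforcement point: require MFA, check need-to-know, log access."""
    if not mfa_verified:
        audit.warning("denied (no MFA): role=%s resource=%s", user_role, resource)
        return False
    allowed = resource in ROLE_PERMISSIONS.get(user_role, set())
    audit.info("%s: role=%s resource=%s",
               "granted" if allowed else "denied", user_role, resource)
    return allowed
```

The audit log produced here is the kind of record an examiner would expect when testing whether access controls are actually functioning.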

The rule also strengthened requirements for managing service providers and vendors. Financial institutions must conduct due diligence on vendors before engaging them, assess their security practices, and contractually require equivalent security standards. Institutions must monitor ongoing vendor security and have the right to audit vendors. This is particularly critical for banks using cloud computing, outsourced AI development, or third-party data analytics services.
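Vendor due diligence can be partially automated as a gate over attested controls. The control names below are illustrative assumptions, not the rule's literal requirements; real diligence also covers contracts, ongoing monitoring, and audit rights over time:

```python
# Sketch of a vendor due-diligence gate over attested security controls.
# REQUIRED_VENDOR_CONTROLS is an illustrative checklist, not the rule's text.

REQUIRED_VENDOR_CONTROLS = {
    "encryption_at_rest", "mfa", "incident_response_plan",
    "contractual_security_clause", "audit_rights",
}

def vendor_approved(attested_controls: set[str]) -> tuple[bool, set[str]]:
    """Return (approved, missing controls) for a vendor's attestation."""
    missing = REQUIRED_VENDOR_CONTROLS - attested_controls
    return (not missing, missing)
```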

Interaction with AI Data Usage

GLBA does not prohibit financial institutions from using AI, but it requires that AI systems be evaluated and managed as part of the comprehensive security program. When a bank implements an AI model to make credit decisions, assess fraud, predict customer behavior, or set pricing, that model processes NPI and must be subject to the same security and privacy protections as any other information system. This means the AI model itself must be evaluated for security vulnerabilities, the training data must be protected as NPI, the model outputs must be treated as NPI, and the vendors or internal teams implementing the model must comply with security requirements.

Additionally, GLBA's Privacy Rule requires transparency about how customer information is used. If a bank uses customer data to develop an AI model that makes inferences about creditworthiness or risk, that use should be disclosed in privacy notices. Customers should have an opportunity to understand how their data feeds into AI systems and to opt out if those systems fall outside the scope of the original service the customer agreed to.

Penalties and Recent Enforcement

Violations of GLBA carry serious penalties. For financial institutions, the FTC can impose civil penalties of up to $100,000 per violation, and federal banking regulators can impose additional enforcement action including cease-and-desist orders and civil money penalties. For individuals (including executives, board members, and compliance officers), violations can result in fines of up to $10,000 and, for criminal violations involving intentional or reckless disregard for customer privacy or security, up to five years in prison. Class action lawsuits by customers are also possible when breaches occur. Additionally, state attorneys general have begun bringing GLBA enforcement actions, creating additional liability exposure.

The FTC and federal banking regulators have significantly increased GLBA enforcement in recent years. Several major banks have received orders requiring them to enhance their security programs, particularly around third-party vendor management and AI implementation. The FTC has signaled that institutions using AI to make decisions about customers must ensure that the AI systems are secure, that decision-making is transparent, and that customer data used in AI training is adequately protected. The FTC has also been active in enforcing data breach notification requirements under the amended Safeguards Rule, which now mandates that institutions notify the FTC of "notification events" affecting 500 or more consumers as soon as possible, and no later than 30 days after discovery.
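The notification-event threshold and 30-day outer bound can be sketched as a deadline helper; the function name and interface are illustrative assumptions:

```python
from datetime import date, timedelta

def ftc_notification_deadline(discovered: date, consumers_affected: int):
    """Latest FTC notification date for a reportable event (500 or more
    consumers), or None if below the threshold. Notification should happen
    as soon as possible; 30 days is the outer bound under the amended rule."""
    if consumers_affected < 500:
        return None
    return discovered + timedelta(days=30)
```

Incident-response runbooks typically compute this date at the moment of discovery so the regulatory clock is tracked alongside forensic work.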

How Corvair Helps

Corvair helps financial institutions comply with GLBA's Privacy and Safeguards requirements, particularly in the context of AI and machine learning. Our platform provides tools for cataloging nonpublic personal information across systems, tracking data flows and third-party recipients, automating privacy notice generation aligned with actual practices, and managing customer opt-out requests at scale. For the Safeguards Rule, Corvair supports security risk assessment workflows, tracks security control implementation and testing, monitors third-party vendor security, and maintains audit trails demonstrating compliance with the qualified individual requirements and documentation obligations.

Schedule a Briefing

Related Regulations

CCPA/CPRA

California privacy rules that supplement — and in some cases override — GLBA for financial institutions.

Read guide

FCRA

Fair Credit Reporting Act requirements governing credit information use, adverse action notices, and AI in lending.

Read guide

ECOA & Fair Lending

Equal credit opportunity requirements that apply to AI-driven lending decisions alongside GLBA privacy rules.

Read guide