CFPB Spotlights the FCRA’s Next Regulatory Frontier: Data Brokers

 

August 31, 2023

Privacy Plus+

Privacy, Technology and Perspective

This week, let’s look at the Consumer Financial Protection Bureau’s (CFPB’s) recent announcement that it intends to treat data brokers who sell certain types of personal information as “consumer reporting agencies” subject to the Fair Credit Reporting Act (FCRA).

Why This Matters: The FCRA is a federal law that governs how consumer information, especially credit data, is collected, shared, and used. Its goal is to ensure that this information is accurate, fair, and private, and it gives consumers the right to access and correct their own credit reports. If data brokers were to fall under the FCRA, they would face significantly higher costs and burdens to ensure the accuracy of the data they sell. They would also have to allow consumers to access, dispute, and correct information about themselves, and they would have to limit how third parties may use that data, potentially reducing its market value.

Context: The CFPB views its announcement as part of a broader effort to address the challenges posed by Artificial Intelligence (AI) in decision-making. The agency recognizes the risks of using AI, especially in sensitive areas like credit assessments and employment decisions: AI can cause harm, for example by targeting vulnerable people for financial scams or by exposing domestic violence survivors. While the CFPB expects other privacy-protection initiatives to arise at the state and federal levels, Director Rohit Chopra emphasizes the importance of leveraging existing laws, primarily the 1970 FCRA, to safeguard consumers.

Specifics: No specific rule has been proposed yet. The CFPB’s announcement came earlier this month through Director Chopra’s remarks at a White House “Roundtable on Protecting Americans from Harmful Data Broker Practices.” The Director announced a rulemaking process to develop rules “to prevent misuse and abuse by these data brokers.” Two of the proposals under consideration would (1) define a data broker’s sale of data concerning, “for example, a consumer’s payment history, income, and criminal records as a consumer report [triggering the FCRA]” (our emphasis), and (2) “clarify” the extent to which “credit header data” such as names, dates of birth, and Social Security numbers also constitutes a “consumer report.” (Because so much data is accessed through such “credit header data,” the CFPB believes that clarification would reduce the ability of credit reporting companies “to impermissibly disclose sensitive contact information that can be used to identify people who don’t wish to be contacted, such as domestic violence survivors.”)

Timing: The idea of regulating data brokers under the FCRA promises to be extremely controversial, and any formal rule is a long way from being finalized. The CFPB plans to release an initial outline of proposals in September, with a particular interest in feedback from small businesses. The agency aims to make the proposed rule available for public review and comment in 2024.

Our Thoughts: We think closer attention to privacy considerations in the data brokerage industry would be a good thing, especially in the “Age of AI.” The 53-year-old FCRA may be the right vehicle for this, or at least the best available, but we are less certain of that because the FCRA is so badly in need of modernization.

In our view, the role of predictive decision-making in various aspects of life ranges from mildly irritating to downright infuriating and dangerous, unless, of course, the automated decision turns out in our favor. We concur that human attention should be required before decisions are finalized in some sensitive areas. This principle is enshrined in Article 22 of the EU General Data Protection Regulation (GDPR), which gives individuals the right not to be subject to a decision based solely on automated processing where that decision produces legal effects concerning them or similarly significantly affects them. Stateside, New York City has even restricted the use of automated employment decision tools (AEDTs) in hiring.

Regardless, whether such a substantial regulation should be implemented through administrative rulemaking, rather than through legislative action, remains a subject of much debate, especially now, when the U.S. Supreme Court seems eager to confront questions about the limits of administrative agencies’ authority.

You can read CFPB Director Chopra’s remarks by clicking on the following link:

https://www.consumerfinance.gov/about-us/newsroom/remarks-of-cfpb-director-rohit-chopra-at-white-house-roundtable-on-protecting-americans-from-harmful-data-broker-practices/

---

Hosch & Morris, PLLC is a boutique law firm dedicated to data privacy and protection, cybersecurity, the Internet and technology. Open the Future℠.
