CFPB Circular Targets Workplace Monitoring Tools
On October 24, 2024, the Consumer Financial Protection Bureau (“CFPB”) issued a circular titled “Background Dossiers and Algorithmic Scores for Hiring, Promotion, and Other Employment Decisions.” In essence, the circular warns that the CFPB may view a host of workplace-related monitoring offerings — such as tools that track the number of messages an employee sends, employee driving habits, the time employees take to complete tasks, web browsing activity, or keystrokes, among others — as consumer reports (and their providers as consumer reporting agencies) subject to regulation under the Fair Credit Reporting Act (“FCRA”).
About the FCRA
The FCRA, and similar state statutes, regulate the furnishing and use of “consumer reports” provided by “consumer reporting agencies” for employment purposes and certain other permissible purposes. Traditionally, consumer reports obtained for employment purposes include reports such as credit reports, criminal history reports, motor vehicle reports, and reports verifying employment or educational history. Consumer reporting agencies have a host of contractual, accuracy, consumer access, and reinvestigation obligations under the FCRA. Similarly, employers using consumer reports have FCRA obligations of their own, including, for example, obligations to provide a disclosure that reports will be obtained, to obtain written authorization to procure the reports, and to provide pre-adverse action and adverse action notices before information in a consumer report may be used, or after it has been used, in whole or in part to take an adverse action against the individual who is the subject of the report. In addition to regulatory enforcement, the FCRA has a private right of action that can result in individual and class actions.
Offerings of Potential Concern for the CFPB
The circular references traditional background screening activities, such as criminal history and other reports derived from public records, but its primary focus is on types of workplace monitoring that traditionally have not necessarily been treated as consumer reports. Examples of the offerings of concern, drawn from the circular and from prepared remarks that CFPB Director Chopra delivered at an event on workplace surveillance with the Acting Secretary of Labor on the day the circular was released, include:
- Reports “that record current workers’ activities, personal habits and attributes, and even their biometric information. For example, some employers now use third parties to monitor workers’ sales interactions; track workers’ driving habits; measure the time that workers take to complete tasks; record the number of messages workers send and the quantity and duration of meetings they attend; and calculate workers’ time spent off-task through documenting their web browsing, taking screenshots of computers, and measuring keystroke frequency.”
- “Some companies may analyze worker data in order to provide reports containing assessments or scores of worker productivity or risk to employers. Today, such scores are used to make automated recommendations or determinations related to worker pay; predict worker behavior, including potential union organizing activity and likelihood that a worker will leave their job; schedule shifts or job responsibilities; or issue warnings or other disciplinary actions.”
- “If an employer purchases a report that details whether a worker was a steward in a union, utilized family leave, enrolled their spouse and children in benefits programs, was cited for poor performance, or was deemed to be productive, this can raise serious issues about privacy and fairness. And if this information is converted into some sort of score using an opaque algorithm, that makes it even more suspicious.”
- “In the healthcare context, consider a nurse required to wear a badge that tracks their movement throughout their shift. A hospital might hire a monitoring company using AI to track metrics like time spent on patient care, by noting each time a nurse enters and exits a patient room.”
The circular notes in a footnote that these types of offerings could be powered by artificial intelligence or more traditional algorithms.
Distinguishing Prior Authority Regarding the FCRA’s Applicability to Certain Software
The circular’s legal discussion calls into question the continued applicability of earlier FTC opinions and case law taking the position that software providers were not consumer reporting agencies under certain circumstances because they were not assembling or evaluating the information processed using their software. Assembling or evaluating is a key component of the definition of a “consumer reporting agency.” According to the CFPB, “significant changes in the software and general technological landscape” have made earlier guidance “inapplicable to many of the kinds of technology used today.” The circular notes that today’s software developers “often take a more active role in providing ongoing services to clients, such as performing ongoing maintenance of the software,” or license the software rather than selling it, activities that the CFPB opines may constitute assembling or evaluating the information processed through the software. The circular also posits that an organization could “assemble or evaluate” information by collecting it to train an algorithm that scores or otherwise evaluates individuals for employment purposes.
Exceptions
The FCRA includes several exceptions to the definition of a “consumer report.” One of these exceptions — related to certain workplace misconduct investigations and compliance with law — is not the “focus” of the circular. Another exception, for information relating solely to the report-maker’s own transactions or experiences with the consumer, is recognized in the circular, but the circular emphasizes that this exception applies only to information derived “solely” from those transactions or experiences. The circular states that the exception “by its own terms” does not apply “to a report containing information not about transactions or experiences between the report-maker and the consumer, such as when the report includes algorithmic scores.”
Takeaways
According to the CFPB, its circulars are intended to signal the CFPB’s approach to enforcement, including to other agencies that have enforcement authority, rather than to create new legal requirements. Future cases will be fact dependent, and it remains to be seen whether reviewing courts will accept some of the positions staked out by the CFPB in the circular in the event of enforcement actions. The circular nevertheless is a strong signal of the CFPB’s intent to increase its scrutiny, and potentially its enforcement efforts, in this area. If the CFPB successfully implements the position taken in the circular, it could significantly impact the use of various monitoring and other technologies in the workplace. Organizations that offer services such as those identified in the circular and in CFPB Director Chopra’s remarks, as well as employers that use them, should consider reviewing these products to (re)assess potential FCRA applicability.
If you have any questions about this issue, please contact AGG Privacy & Cybersecurity attorneys and Background Screening team members Kevin Coy or Erin Doyle.