The class action filed last month against Eightfold AI may prove a defining moment for employers that rely on algorithmic hiring. The lawsuit alleges that, although employers are legally responsible under the Fair Credit Reporting Act (FCRA) and the Investigative Consumer Reporting Agencies Act for providing disclosures, obtaining written authorization, and issuing pre-adverse-action notices, none of those steps occurred for the plaintiffs.
Why, then, is the lawsuit directed at Eightfold rather than at the employers who failed to provide those notices? The complaint alleges that Eightfold acted as a consumer reporting agency (CRA) by compiling applicant data (including information from public sources, online activity, and inferred traits) and generating ranking-style Match Scores for employers. Because of this alleged CRA role, Eightfold had independent legal duties of its own, such as obtaining certain certifications from employers before furnishing the reports. The plaintiffs argue that Eightfold violated those duties at the moment it created and transmitted its reports, meaning the alleged misconduct occurred upstream, before employers ever had the opportunity to meet their own obligations.
If courts agree these AI‑generated outputs qualify as consumer reports, FCRA obligations could extend to a wide array of AI‑driven hiring platforms across industries. For employers, this could shift AI hiring from “emerging technology” to “regulated technology” almost overnight.
Disclaimer: This communication is for general informational purposes only and does not constitute legal advice. The summary provided in this alert does not, and cannot, cover in detail everything employers need to know about the Eightfold AI litigation or about the FCRA's requirements for AI-driven hiring tools, or how to incorporate those requirements into their hiring processes. No recipient should act or refrain from acting based on any information provided here without advice from a qualified attorney licensed in the applicable jurisdiction.

