A recent lawsuit against Eightfold AI claims its hiring scores rely on opaque inferences from external data, raising Fair Credit Reporting Act concerns. The case highlights a turning point for AI recruiting, pushing the industry toward transparent, explainable, and candidate-visible scoring—an approach SquarePeg was built around from day one.
In 1970, Congress passed the Fair Credit Reporting Act to stop companies from compiling dossiers on people and selling them to employers. People were being denied jobs based on information they never saw, couldn't correct, and didn't know existed.
This week, Eightfold AI, which offers an applicant scoring tool, got sued under the same premise.
The lawsuit claims Eightfold builds candidate profiles using data applicants never provided, predicts their personality traits, scores their "likelihood of success," and delivers those predictions to employers as match scores. The legal argument: this makes Eightfold a Consumer Reporting Agency, subject to the same disclosure and dispute rules as background check companies.
Nine out of ten employers now use AI in hiring. This case matters because it forces a question most haven't considered: what separates a tool that helps evaluate candidates from one that unfairly profiles them? That line will shape how AI hiring tools are built and regulated for years.
The plaintiffs are two job applicants who applied for roles at PayPal and Microsoft. They allege that Eightfold assembled profiles on them from data they never provided, scored them, and delivered those scores to employers without the disclosures the FCRA requires.
The core legal argument: if a vendor assembles external data to evaluate a candidate's "character, general reputation, personal characteristics, or mode of living" for employment decisions, they may be producing a consumer report under FCRA. And consumer reports come with compliance requirements such as standalone disclosures, candidate authorization, dispute mechanisms, and the right to see and correct your file before adverse action.
Eightfold’s marketing provides the plaintiffs’ ammunition. Their materials tout “Personality Insights,” including whether a candidate is a “team player, introvert, [or] extrovert,” and predictions about “future career trajectory.” They claim a patent over a system that creates “enriched talent profiles,” including “Talent Insights (e.g., quality of education, career growth, progression, skills depth, industry experience)” and “Personality Insights (e.g., team player, introvert, extrovert, chess or equivalent player, high endurance athlete).”
This is where the plaintiffs’ case gains traction. When an algorithm is alleged to assign labels like “introvert” based on data a candidate never provided, and employers use those outputs to filter applicants, the gap between what the candidate submitted and what they’re being judged on becomes a problem. That’s the heart of the complaint: candidates were evaluated on criteria they couldn’t see, challenge, or correct.
The FCRA targets exactly this pattern: third parties assembling private files on people and selling them to employers, landlords, and lenders, who then deny opportunities based on information the subject never knew existed and couldn't dispute.
The FCRA’s definition of "consumer report" is broad. It covers any communication bearing on a consumer's "character, general reputation, personal characteristics, or mode of living."
In 2024, the Consumer Financial Protection Bureau issued guidance specifically addressing "Background Dossiers and Algorithmic Scores for Hiring." The CFPB clarified that an entity can become a Consumer Reporting Agency if it "collects consumer data in order to train an algorithm that produces scores or other assessments about workers for employers," especially when that data comes from "sources other than an employer receiving the report, including from other employer-customers or public data sources."
Regulators are telling the industry that the rules that have applied to background check companies for 50 years also apply to automated hiring tools.
Eightfold will likely argue that employers control the review process, that candidates consent through privacy policies, and that their system is a "tool" rather than a reporting agency. These arguments have worked in other contexts.
But in public materials, Eightfold emphasizes that its AI makes independent judgments using proprietary data the candidate never provided: predicted personality traits, inferred skills, likelihood of success based on historical patterns.
This isn't the first case of its kind. In Mobley v. Workday, the EEOC argued in an amicus brief that Workday could be held liable as an "employment agency" because its tools automatically reject or advance candidates. The focus there was anti-discrimination law, but the principle is the same: if software makes hiring decisions, it's subject to the same rules as humans who make hiring decisions.
The regulatory direction is clear. Even if this specific case faces procedural hurdles, the underlying guidance isn't going away. Hiring tools need to be designed with transparency, accountability, and fair hiring at the core, not bolted on after the fact. Legal scrutiny is a good thing for employers and candidates alike. It forces clarity on what data can and should be used to make hiring decisions.
This lawsuit isn't an indictment of AI in hiring. It's an indictment of a specific approach: assembling data candidates didn't provide, making predictions they can't see, and automating decisions they can't challenge.
SquarePeg is built differently.
SquarePeg exists because the signal-to-noise problem in recruiting has gotten out of control. Application volumes are surging. Resumes are keyword-optimized to match job descriptions. Mass-apply tools have removed friction and intent from applications. Fraud is rampant. Even the most diligent recruiter can't give thousands of applicants the thorough review they deserve: reading the full resume, researching past employers, understanding context.
AI is particularly well suited for this problem, and when done right can make hiring fairer. Every application can be reviewed in full, with relevant context and nuanced consideration of a candidate's full work history. It can remove heuristics that favor brand-name companies or universities, or whatever happens to catch a recruiter's eye in a five-second skim. From the start, SquarePeg took a glassbox approach to support employer decisions, not automate them based on hidden factors.
Candidates are scored on resume data they submitted and publicly available company information displayed alongside their work history. Employers define the requirements and approve every criterion before scoring begins. Scores reflect matches to those employer requirements for that specific job, not black-box predictions about future performance or personality.
Every score is explained with cited evidence from the candidate's application. Requirements that could introduce bias are disallowed, and we audit for this monthly. We don't infer personality or behavioral traits, only skills and experience grounded in work history.
When SquarePeg deduces something (this person likely knows Python based on listed experience with Python libraries), the reasoning is shown. Fraud detection surfaces factual anomalies for human review. No auto-rejections. No character judgments.
Everything that goes into a score is visible, traceable, and explainable.
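To make the contrast concrete, here is a purely illustrative sketch of what "glassbox" scoring looks like in principle. This is not SquarePeg's actual implementation; the criterion names, fields, and matching logic are hypothetical. The point it demonstrates: every judgment traces back to a verbatim line the candidate submitted, and a criterion with no evidence simply scores as unmet rather than triggering an inferred guess.

```python
from dataclasses import dataclass

@dataclass
class Criterion:
    name: str        # employer-approved requirement, e.g. "Python"
    keywords: list   # evidence terms to look for in submitted resume text

@dataclass
class CriterionResult:
    name: str
    met: bool
    evidence: list   # verbatim snippets cited from the application

def score(resume_lines, criteria):
    """Score only candidate-submitted text; cite evidence per criterion."""
    results = []
    for c in criteria:
        evidence = [line for line in resume_lines
                    if any(k.lower() in line.lower() for k in c.keywords)]
        results.append(CriterionResult(c.name, bool(evidence), evidence))
    met = sum(r.met for r in results)
    return met / len(criteria), results

resume = [
    "Built ETL pipelines with pandas and NumPy",
    "Led a team of four engineers",
]
criteria = [
    Criterion("Python", ["pandas", "numpy", "python"]),
    Criterion("Team leadership", ["led a team", "managed"]),
    Criterion("Kubernetes", ["kubernetes", "k8s"]),
]

total, results = score(resume, criteria)
for r in results:
    # every met/unmet decision is traceable to cited resume lines
    print(r.name, r.met, r.evidence)
```

Note what the sketch cannot do: there is no path from this input to a label like "introvert," because the only inputs are the employer's approved criteria and the candidate's own text.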
This lawsuit may be good for the industry. Recruiting isn't working well right now. The arms race between mass-apply tools and crude resume filters isn't serving the market. If legal pressure pushes candidates toward providing authentic, substantive information about their experience, and pushes vendors toward building tools with clear rationale and audit trails, that's progress.
Regulators and courts are starting to scrutinize how HR tech vendors source and use candidate data. The conversation around AI in hiring is maturing toward more transparency, more accountability, and more equity. The tools that survive will be the ones designed with those principles from the start.
Experience SquarePeg live and see how we streamline recruiting, rank top talent, and save you hours every hiring cycle.
