The ODDITY Tech Lawsuit Reveals What Algorithmic Disclosure Actually Requires

Earlier this month, Hagens Berman filed a securities fraud lawsuit against ODDITY Tech, alleging that the company made misleading disclosures about the efficiency of its AI-driven advertising platform. ODDITY's shares have fallen approximately 49% following revelations that the platform's revenue performance did not match what investors were led to believe. The lawsuit centers on a specific claim: that ODDITY represented its AI capabilities in ways that obscured those systems' structural limitations from the investors whose decisions depended on accurate information about them. This is not primarily a story about corporate fraud. It is a story about what happens when the gap between algorithmic awareness and structural understanding becomes a legally material fact.

Disclosure as a Coordination Problem

Securities disclosure requirements assume that accurate information, transmitted clearly, enables rational decision-making. This is a classical coordination assumption: competence is held ex ante by the receiver, and the sender's obligation is simply to provide accurate signals. The ODDITY case challenges that assumption directly. The plaintiffs are not arguing that investors lacked access to information. They are arguing that the framing of that information created false schemas about how the underlying system actually functioned. This is a coordination failure of a different kind, one where the problem is not information asymmetry in the conventional sense but schema asymmetry between the entity operating an algorithmic system and the stakeholders who must evaluate it.

Kellogg, Valentine, and Christin (2020) identify a comparable dynamic in workplace algorithm deployment: the people subject to algorithmic systems frequently develop what the authors call "folk theories" rather than accurate structural models of how those systems operate. Folk theories are impressionistic, individually constructed, and resistant to correction because they feel locally consistent even when structurally wrong. ODDITY's investors, based on the complaint's allegations, appear to have been operating on folk theories induced by the company's own disclosures rather than on accurate schemas of the platform's architecture and limitations.

The Awareness-Capability Gap in Investor Context

Algorithmic literacy research has consistently demonstrated what I have called the awareness-capability gap: knowing that an algorithm exists, and even knowing something about its general design, does not translate into the capacity to predict or evaluate its outputs (Gagrain, Naab, and Grub, 2024). This gap is well-documented among platform workers. The ODDITY case suggests it extends to capital market participants in ways that current disclosure frameworks do not adequately address.

The standard remedy proposed in securities litigation is better disclosure, meaning more accurate factual statements. But if the underlying problem is schema deficit rather than factual omission, additional disclosure may not resolve the coordination failure. Investors can be told that an AI advertising platform uses probabilistic targeting, that performance is subject to distributional variance, and that revenue projections carry model-dependent uncertainty, and still construct folk theories that dramatically overestimate reliability. The Gentner (1983) framework on structure-mapping suggests why: people map new systems onto familiar analogical structures, and when those source structures are inaccurate, additional surface-level facts do not correct the underlying mapping. The schema, not the individual data point, governs interpretation.

What Structural Disclosure Would Actually Require

The practical implication of this analysis is uncomfortable for securities law and corporate governance simultaneously. If folk-theory formation is the mechanism generating investor harm, then adequate disclosure requires something closer to schema induction than factual reporting. Companies would need to communicate not just what their AI systems do but how the structural constraints of those systems create variance in outcomes, where the power-law distributions in performance emerge, and what boundary conditions govern the relationship between platform inputs and revenue outputs. Sundar (2020) describes this as the challenge of communicating machine agency: human receivers systematically misattribute the source and character of algorithmic outputs, assigning stability and intentionality where there is variance and optimization.
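The variance point above can be made concrete with a small simulation. This is a hypothetical sketch, not ODDITY data: it assumes per-campaign advertising returns follow a Pareto (power-law) distribution, and shows why a single reported "average efficiency" figure conveys more stability than a heavy-tailed system actually has. All parameter values are illustrative.

```python
import random

random.seed(0)  # reproducible draws for this illustration

def period_means(alpha: float, n_campaigns: int, n_periods: int) -> list[float]:
    """Average return across n_campaigns, observed over n_periods.

    random.paretovariate(alpha) draws from a Pareto distribution with
    minimum value 1; a smaller alpha means a heavier tail (for alpha <= 2
    the variance is infinite, so sample means never settle down).
    """
    return [
        sum(random.paretovariate(alpha) for _ in range(n_campaigns)) / n_campaigns
        for _ in range(n_periods)
    ]

heavy = period_means(alpha=1.3, n_campaigns=200, n_periods=20)  # heavy tail
light = period_means(alpha=5.0, n_campaigns=200, n_periods=20)  # mild tail

spread = lambda xs: max(xs) - min(xs)
print(f"spread of period averages, heavy tail: {spread(heavy):.2f}")
print(f"spread of period averages, mild tail:  {spread(light):.2f}")
```

Under the heavy-tailed regime, the average observed in any given period is dominated by a handful of outlier campaigns, so period averages swing widely even with hundreds of campaigns; under the mild-tailed regime they barely move. A disclosure that reports only the average, without the tail behavior, induces exactly the overstable folk theory described above.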

ODDITY is not a unique case. It is an early, visible instance of a problem that will recur as more companies build revenue models on AI platform performance and then face the disclosure requirements designed for human-operated systems. Schor et al. (2020) note that platform dependence creates structural precarity precisely because the platforms' internal logic is not legible to the people whose outcomes it governs. That analysis was developed for gig workers. The ODDITY litigation suggests the same opacity problem scales to institutional investors and public markets.

The Governance Gap This Exposes

The more important question this lawsuit raises is not whether ODDITY's executives made materially false statements; courts will resolve that. It is whether the current disclosure architecture, built around factual accuracy rather than schema adequacy, can produce the coordination outcomes it was designed to achieve when the systems being disclosed are algorithmically complex. Based on what the ALC framework predicts about schema formation and transfer, the answer is that factual accuracy is a necessary but insufficient condition for adequate disclosure. Closing the awareness-capability gap in investor communication will require a fundamentally different conception of what disclosure is supposed to accomplish.