Accidental Architects: What "Help, I Think I'm Building an AI Tool to Lay Off My Coworkers" Reveals About Competence, Complicity, and Organizational Control

The Story That Should Not Be Surprising

A recent piece circulating in business media captures a phenomenon that deserves more careful theoretical attention than it has received. Employees report discovering, mid-project, that the AI tools they are being asked to build or refine appear designed to automate away the roles of their colleagues - and possibly their own. The emotional register of these accounts is one of shock and moral discomfort. From an organizational theory perspective, the shock is the interesting part. The outcome itself is not novel. What is novel is the mechanism by which organizations are now achieving it.

Why This Is Not Simply About Job Displacement

The standard framing of AI and labor focuses on displacement as an outcome: jobs are lost, roles are restructured, headcount declines. That framing is analytically incomplete here. What the reported cases describe is something structurally distinct - a situation in which the worker producing the displacement tool is positioned inside the organization, using organizational resources, without full visibility into the strategic purpose of what they are building. This is not automation imposed from outside. It is automation co-produced by the people it will eventually affect. Rahman's (2021) concept of the invisible cage is instructive: algorithmic control systems derive much of their power from the fact that workers cannot fully perceive the architecture of constraint they inhabit. The employees in these accounts are not just inhabiting that cage - they are, unknowingly, helping to assemble it.

The Awareness-Capability Gap, Applied Inward

My dissertation research on Algorithmic Literacy Coordination (ALC) draws a consistent distinction between algorithmic awareness and algorithmic capability (Kellogg, Valentine, and Christin, 2020). Workers can know that algorithms govern their outcomes and still lack the structural schema needed to respond effectively. The cases described in this news story suggest a related but distinct problem: workers may possess technical capability - they can build the tool - while lacking the organizational schema needed to interpret what the tool is actually for. This is not a failure of coding skill. It is a failure of structural reading. The employee can produce the artifact; they cannot yet read the artifact's organizational function. Gentner's (1983) structure-mapping theory is relevant here. Transfer of understanding requires not just surface feature recognition but mapping of relational structures across domains. An employee who understands machine learning pipelines but lacks a schema for how organizations use efficiency tools to manage headcount decisions will not spontaneously recognize the relational structure connecting their work to a layoff strategy.

The Organizational Design Logic Behind the Pattern

It would be a mistake to attribute this entirely to individual schema deficits. The organizational design logic here is deliberate, even if not always consciously articulated. When organizations decompose a strategic initiative into discrete technical tasks and distribute those tasks across workers without providing the integrative frame, they produce plausible deniability at multiple levels. The employee who built the automation module did not design the layoff. The manager who scoped the project did not make the headcount decision. The executive who approved the headcount decision did not specify the technical implementation. Hatano and Inagaki's (1986) distinction between routine and adaptive expertise is useful for diagnosing what this organizational structure produces: workers with deep routine expertise in their technical domain and minimal adaptive expertise in reading organizational intent. The distribution of tasks across competence silos is not incidental to the strategy - it is constitutive of it.

What This Means for How We Think About Platform and Organizational Coordination

The ALC framework I am developing treats platforms as environments where competencies develop endogenously through participation. What this news story forces me to consider is that the same logic applies inside organizations adopting AI tooling at scale. The employee building the AI workflow tool is participating in an algorithmically mediated environment whose full structure is not disclosed to them. They are developing technical competence through participation while remaining structurally illiterate about the coordination logic their participation serves. Schor et al. (2020) describe dependence and precarity in platform economies as partly constituted by information asymmetries between platform and worker. The internal organizational case described here suggests that this asymmetry is not exclusive to gig platforms. It can be reproduced inside conventional employment relationships when organizations treat AI tool development as a technical task rather than a strategic communication. The employees who are most troubled by what they are building are, ironically, the ones who have achieved enough structural literacy to recognize the relational pattern. That recognition itself is a form of adaptive expertise that their organizations have not asked them to develop - and may have an interest in suppressing.

References

Gentner, D. (1983). Structure-mapping: A theoretical framework for analogy. Cognitive Science, 7(2), 155-170.

Hatano, G., and Inagaki, K. (1986). Two courses of expertise. In H. Stevenson, H. Azuma, and K. Hakuta (Eds.), Child development and education in Japan (pp. 262-272). Freeman.

Kellogg, K. C., Valentine, M. A., and Christin, A. (2020). Algorithms at work: The new contested terrain of control. Academy of Management Annals, 14(1), 366-410.

Rahman, H. A. (2021). The invisible cage: Workers' reactivity to opaque algorithmic evaluations. Administrative Science Quarterly, 66(4), 945-988.

Schor, J. B., Attwood-Charles, W., Cansoy, M., Ladegaard, I., and Wengronowitz, R. (2020). Dependence and precarity in the platform economy. Theory and Society, 49(5-6), 833-861.