Algorithmacy and the Co-optation of the Subject
A 20-minute talk arguing that algorithmic coordination produces a new cognitive competency — algorithmacy — through co-optation, tracing its genealogy from the administrative and monetary ruptures of 1968 to the platform economy.
Introduction: Three Registers of Coordination
Coordination requires communication. That sounds trivial, but the form of communication determines the form of coordination.
Consider a simple coordination situation: you have a banana. I want it.
Physical register, oracy: you hand me the banana. The coordination object and the communication about it share the same space. Both parties are present. Both observe the transaction. Communication and coordination are coextensive.
Symbolic register, literacy: I write you a promissory note for the banana. The symbol separates from the referent. Coordination operates at distance. But both parties can inspect the mediating artifact. The note does not change between writing and reading.
Constitutive register, algorithmacy: I tap a button. An algorithm matches me with a provider, sets the price, routes the delivery, and evaluates the outcome. The banana still arrives. But the communicative act did not represent the transaction. It brought the transaction into existence.
When you tap an app, a hidden intermediary arranges the service, and neither party fully observes the transformation. The competency required to navigate this triadic structure is what I am calling Algorithmacy.
This is the decisive inversion: in oracy, communication and coordination are coextensive; in literacy, communication enables coordination at a distance; in algorithmacy, communication is coordination. The constitutive register restructures the characteristic operations of thought.
My thesis is threefold:
- Structural: The shift from the dyad to the triad, from the symbolic to the constitutive register, represents a fundamental rupture in social geometry.
- Historical: This rupture did not begin with ChatGPT or Silicon Valley; it is the technological industrialization of a legal and monetary shift, specifically the collapse of absolute property into administrative “status.”
- Mechanistic: The mechanism of this new competency is Co-optation. Unlike the disciplinary subject, the algorithmic subject is modulated by a feedback loop, learning to navigate the system by becoming the data it requires.
I. The Structural Deficit: Dyads and Triads
Graduating from the banana, consider how anyone ever got a job.
In an oral culture, hiring operated in the physical register. A dyad. You stood before a member of your community. The medium, speech, was transparent to both. The competency required was oracy.
In a literate culture, hiring operated in the symbolic register. A mediated dyad. You sent a resume. Writing separated you in space, but the artifact was accessible. The document you wrote was the document they read. The competency was literacy.
Today, you submit a file to a portal. An Applicant Tracking System parses it, extracts data, scores it against a hidden regression model, and presents a filtered representation to a manager who never sees your original text. The constitutive register. This is not a dyad. This is a triad.
The three registers thus map onto a structural progression: dyad, mediated dyad, triad.
The qualitative difference between dyads and triads is not a matter of adding one more person. Simmel made this point in 1950: in the dyad, each party feels confronted only by the other, not by a collectivity above him, and the withdrawal of one destroys the whole. A triad introduces a social geometry in which the third party can mediate, can profit from the conflict of the other two (Simmel's tertius gaudens), or can actively foment conflict to maintain control (divide et impera). The ATS occupies all three positions simultaneously and opaquely.
Our existing epistemic frameworks are failing because they are dyadic. They belong to the symbolic register. Digital Literacy assumes a subject using a tool. AI Ethics assumes a subject critiquing a model. Hermeneutics assumes a subject interpreting a text. None addresses the competency required to navigate a system that learns from you while you learn to navigate it.
Algorithmacy is the competency of navigating triadic coordination in the constitutive register.
II. The Genealogy: The Long 1968
Does the constitutive register begin with Artificial Intelligence? No. And the history of literacy tells us why.
Literacy did not begin with literature; it began with accounting. The proto-cuneiform tablets from Uruk, around 3200 BCE, were not for poetry. They were administrative instruments for grain inventories and labor rosters. Nissen, Damerow, and Englund showed in their 1993 study of archaic bookkeeping that the purpose of the earliest tablets was not to record language but to track economic administration through numerical notation. Schmandt-Besserat confirmed that for roughly five hundred years following its invention, writing served exclusively administrative functions. The cognitive revolution of literacy was a downstream consequence of an administrative revolution.
Algorithmacy follows the same trajectory. The triadic structure was not invented in Silicon Valley. It was established by legal theory and monetary policy in what I’ll call the Long 1968.
First: The Collapse of Property. Charles Reich argued in his 1964 Yale Law Journal article that the institution called property guards the troubled boundary between individual man and the state. That boundary collapsed in the twentieth century. Wealth was no longer “property,” a dyad between you and the land, the physical register. Wealth had become “status,” a triad between you, the administrator, and the license. Your professional license, your franchise, your benefits: these were not things you owned. They were forms of conditional access managed by an intermediary. Reich called the result a “new feudalism” where survival depended on maintaining eligibility within an administrative system whose criteria could shift without notice. The Supreme Court recognized the stakes within six years: Goldberg v. Kelly in 1970 relied directly on Reich’s framework to establish that welfare recipients possessed due process rights before benefit termination.
Second: The Collapse of Value. By 1968, the London Gold Pool collapsed under speculative pressure, initiating the chain of events that ended the Bretton Woods system by 1971. Money ceased to be a commodity. It became fiat, a triad between holder, issuing institution, and the credit structure. Simmel had already considered this logic in The Philosophy of Money in 1907: exchange operates not through direct comparison of objects but through the fact that each of them relates to a third quantity. Geoffrey Ingham in The Nature of Money in 2004 echoed that money is fundamentally a social relation rather than a commodity, and treating it as a physical object is a category error. Value became entirely dependent on the institutional framework of the intermediary.
In both cases, the Owner was replaced by the Claimant. The Owner protects what they have. The Claimant must constantly re-qualify for what they need. The physical register gave way to the constitutive register not through technology but through institutional restructuring.
AI did not create this structure. AI industrialized it.
Just as the printing press scaled Sumerian accounting logic to the masses, AI is scaling the administrative logic of 1968. What was once a relationship with the State, welfare and licenses, is now our relationship with Capital: hiring, credit, platforms. Citron and Calo documented in 2021 how automated systems now terminate Medicaid to cancer patients and deny food stamps through the same discretionary authority the New Property analysis sought to constrain. Rahman showed in 2018 that platforms function as new utilities controlling the terms of access to vital services.
III. The Mechanism: Co-optation
How does a subject inhabit the constitutive register? The answer is co-optation, a concept with its own history, newly revived by organizational theorists as a property of coordination.
Coordination theory traditionally recognizes three mechanisms: hierarchies, markets, and networks. Hierarchies coordinate through command, as Williamson formalized in 1975. Markets coordinate through contract: Coase argued in 1937 that the boundary of the firm exists where the costs of organizing an extra transaction internally become equal to the costs of carrying it out through the price mechanism. Networks coordinate through collaboration: Powell argued in 1990 that this constitutes a qualitatively distinct governance logic rooted in relational obligation, not a hybrid of market and hierarchy.
All three assume competence precedes participation. You know how to obey, buy, or trust before you enter. All three operate in the physical or symbolic register: actors bring existing capabilities to the coordination structure.
Stark and Vanden Broeck identified a fourth mechanism that violates this assumption in 2024: whereas actors in hierarchies command, in markets contract, and in networks collaborate, on platforms they are co-opted. Vallas and Schor reached the same conclusion independently in 2020: platforms enroll participants through the design of the system itself, generating coordination without requiring traditional prerequisites. Co-optation is not a new concept; it entered organizational theory through Selznick's 1949 study of the Tennessee Valley Authority, which showed that, to manage political opposition from local elites, the TVA strategically enrolled those actors into its formal governance structure, giving them advisory roles. Enrollment simultaneously diffused their resistance and aligned their interests with the federal program. The platform extension is that participation itself produces the competency required to participate.
No university offers a degree in “Algorithmic Resume Optimization.” The competency develops through the interface’s friction. The candidate learns, through rejection and silence, that column formatting breaks the parser, that keyword density affects ranking, that application timing affects position.
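To make that friction concrete, here is a deliberately toy sketch of the hiring triad. Everything in it is hypothetical: the `parse` and `score` functions and the `HIDDEN_PROFILE` are my illustration, and no real Applicant Tracking System works this way. The structural point is what matters: the candidate's document is flattened, reduced to a keyword vector, and scored against a profile the candidate never sees, while the manager receives only the score.

```python
import re

# Hypothetical toy model of triadic hiring coordination. No real ATS
# works this way; the sketch only illustrates the structure: the
# candidate writes text, the intermediary transforms it, and the
# manager sees only the transformed representation.

HIDDEN_PROFILE = {"python": 3.0, "stakeholder": 1.5, "agile": 1.0}  # opaque to the candidate


def parse(resume_text: str) -> list[str]:
    # Flatten the document to lowercase word tokens. All layout is
    # discarded at this step, which is why unusual formatting can
    # "break the parser".
    return re.findall(r"[a-z]+", resume_text.lower())


def score(tokens: list[str], profile: dict[str, float]) -> float:
    # Weighted keyword density against a profile the candidate never sees.
    n = len(tokens) or 1
    return sum(w * tokens.count(kw) for kw, w in profile.items()) / n


resume = "Built Python pipelines; led agile stakeholder reviews in Python."
print(round(score(parse(resume), HIDDEN_PROFILE), 3))  # prints 0.944
```

The candidate, receiving only rejection and silence, must reverse-engineer the hidden profile from the outside; that inferential labor is the tacit learning the paragraph above describes.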
The loop is recursive. The candidate trains the system by providing data, and the system trains the candidate by shaping behavior. Faraj, Pachidi, and Sayegh described this in 2018: algorithms are performative in that their use shapes and alters work and organizational realities. This recursion has no analog in hierarchy, market, or network coordination. And it distinguishes the constitutive register from the symbolic: the resume in the symbolic register remained stable between writing and reading. The resume in the constitutive register is transformed by an intermediary that is itself transformed by the act of processing it.
Eslami and colleagues demonstrated in 2015 that 62.5% of participants in their study were unaware of the News Feed curation algorithm's existence, yet their behavior had already adapted to its logic. They were navigating a territory they did not know existed. Tacit, co-opted learning.
The cognitive competency is genuine. Cameron’s 2022 ethnography of ridehailing drivers documents sophisticated strategic reasoning: drivers distinguishing “relational” from “efficiency” strategies, calculating acceptance rate thresholds against surge pricing patterns. Shapiro identified in 2018 a reasoning style he termed “qualculation,” blending intuition with strategic calculation, distinct from the rational-actor model platforms assume. DeVito documented in 2021 how users build working models of algorithmic behavior through continuous cycles of sense-making, theory formation, testing, and revision, producing folk theories arranged in a hierarchy from functional awareness to structural causal models.
Oracy developed through immersion. Nobody designed oral culture. Literacy developed through instruction. Schools taught you to write. But the book did not adapt to you. Algorithmacy develops through co-optation. The intermediary is active, adaptive, and optimizing for objectives that are not the participant’s.
IV. From Postmodernism to Organization
We used Foucault to expose the institution. We used Derrida to deconstruct the text.
Postmodernism was a clearing operation: the necessary demolition of the Enlightenment subject, the "Owner" of reason, to make way for what was coming next. But it cannot explain the reconstruction. It leaves us with fragments.
Organizational Theory can.
Deleuze diagnosed the shift three decades ago in his 1992 Postscript on the Societies of Control: enclosures are molds, distinct castings, but controls are a modulation. The individual was becoming the “dividual,” masses decomposed into samples, data, markets, or banks.
That was the metaphysics without the mechanism. The “dividual” is not a philosophical accident. It is an organizational necessity.
To manage the “New Property” at scale, the state had to turn the citizen into a file number. To manage fiat money at scale, the bank had to turn the borrower into a credit score. To manage platform coordination at scale, the algorithm must turn the worker into a behavioral vector. Each step moved coordination further from the symbolic register into the constitutive.
Stiegler named this process grammatization in his 2010 For a New Critique of Political Economy: the progressive discretization of continuous experience into elements that can be stored, reproduced, and manipulated by technical systems. Digital technology, he argued, represents the last stage of the grammatization process that started with writing. Each stage externalizes what was previously immanent to cognitive life. The ATS does not discover who is qualified. It defines qualification algorithmically. It produces the category of “employable” and sorts subjects into it.
Cheney-Lippold showed in 2011 how algorithms construct identity categories as “measurable types” through statistical classification, defining the meaning of categories while producing them. In his 2017 book We Are Data, he extended the analysis: these categories operate on dividuals, the decomposition of individuals into data clouds subject to automated integration and disintegration. Grammatization at computational speed.
V. The Co-optation of the Subject
If Literacy produced the Citizen, defined by rights and signatures, Algorithmacy produces the User, defined by access and behavioral vectors.
The User is not a stable identity. The User is a resource standing by for optimization. Heidegger called this Bestand, standing-reserve: a mode of revealing where beings show up not as things in themselves but as resources for extraction. Under Gestell, enframing, the Rhine ceases to be a river. It becomes a water-power supplier, set upon to yield energy that can be extracted and stored. The transformation is invisible to those inside it. The resource-character appears as the thing’s natural condition.
But unlike a river dammed for power, the User participates in their own extraction.
Jakesch and colleagues demonstrated this in a 2023 randomized experiment with 1,506 participants. Users who were given an opinionated AI writing assistant were twice as likely to write an essay agreeing with the AI, and significantly more likely to hold that opinion after the task concluded. They called it latent persuasion by language models. The participants were navigating a triadic structure, human writing through an algorithmic intermediary, and the intermediary reshaped their cognitive orientation through the practice of using it.
The literate subject used writing to express an interior self. The algorithmic subject uses the intermediary to produce a self the system will accept.
This co-optation is not uniform. Brian Street argued in Literacy in Theory and Practice in 1984 that competencies are always embedded in power relations. What counts as literacy is defined by the dominant group, and acquisition is inseparable from the social conditions producing it.
Sap and colleagues found in 2019 that content moderation algorithms flagged tweets written in African American Vernacular English as offensive at a rate 1.5 to 2 times higher than standard English. The competency required for a Black user to survive the triad involves a forced code-switching that a white user does not have to perform. Zhou and colleagues showed in 2025 that algorithmic competency depends on social support networks and cognitive job crafting, resources distributed unevenly across populations. Workers without access to peer networks cannot navigate the system and are penalized for it. Ravenelle’s 2019 ethnography of gig workers identified three typologies shaped by platform participation: “Success Stories” who leverage existing capital, “Strugglers” whose autonomy is absorbed by algorithmic requirements, and “Strivers” who maintain a foot in both worlds. These are cognitive orientations produced through differential co-optation, not personality types. Kellogg, Valentine, and Christin argued in 2020 that algorithmic management constitutes a new contested terrain of control, creating occupational identities and behavioral patterns absent prior to the platform.
The “User” is not a universal subject. The User is a stratified subject, constituted differently based on their distance from the training data’s norm.
VI. Closing
The history of literacy is the history of accounting becoming culture. The history of algorithmacy is the history of administration becoming cognition.
The Sumerian scribe’s grain ledger became, over millennia, the infrastructure of literate consciousness. Reich’s administrative claimant became, over decades, the infrastructure of algorithmic consciousness.
We are not learning to “use tools.” We are learning to inhabit the legal and monetary logic of 1968, now scaled to subjectivity. We are developing the folk-theories, the qualculations, and the strategies not to express our interiority, but to remain eligible for the New Property of digital life.
Physical. Symbolic. Constitutive.
Oracy. Literacy. Algorithmacy.
The task is not to mourn the death of the enlightened subject. That subject died with the Gold Standard. The task is to understand the survival strategies of the Co-opted Subject.
That competency is Algorithmacy.
Thank you.
Anticipated Questions
“How is algorithmacy different from algorithmic literacy?”
Algorithmic literacy applies the autonomous model: teach people how algorithms work and they will navigate them. Algorithmacy insists on the ideological model: the competency is always embedded in the power relations of the triadic structure producing it, following Street’s 1984 framework. Chung found in 2025 that users with higher algorithmic knowledge were less likely to correct misinformation, not more. Knowledge of the intermediary does not predict the ability to coordinate through it. Teaching someone how ATS parsing works is literacy. What happens to their cognition after 200 applications is algorithmacy.
“The 1968 genealogy is interesting but isn’t it a stretch? Reich was writing about welfare benefits, not algorithms.”
The parallel is structural, not technological. The Owner–Thing dyad was replaced by the Claimant–Administrator–Access triad. The administrative intermediary processing eligibility criteria that the claimant could not fully see or control is structurally identical to the algorithmic intermediary processing scoring criteria that the user cannot see or control. Rahman showed in 2018 that platforms now function as new utilities controlling the terms of access to vital services. The technology changed. The triadic structure did not. AI industrialized what administrative law institutionalized.
“Isn’t this technological determinism?”
The 1968 genealogy is the explicit answer. The triadic structure preceded the technology by decades. AI is the accelerant, not the cause. The printing press scaled Sumerian administrative logic but did not invent it. AI scales the administrative-state logic of 1968 but did not invent it. The structure is institutional. The technology makes it universal.
“Walk me through the three registers concretely.”
Physical register, oracy: you hand me the banana. The coordination object and the communication about it share the same space. Both parties are present. Both observe the transaction. Communication and coordination are coextensive.
Symbolic register, literacy: I write you a promissory note for the banana. The symbol separates from the referent. Coordination operates at distance. But both parties can inspect the mediating artifact. The note does not change between writing and reading.
Constitutive register, algorithmacy: I tap a button. An algorithm matches me with a provider, sets the price, routes the delivery, and evaluates the outcome. The banana still arrives. But the communicative act did not represent the transaction. It brought the transaction into existence. The intermediary transforms the communication between input and output. Neither party controls or fully observes the transformation. The communication is the coordination.
The progression is not concrete to abstract to immaterial. It is physical to symbolic to constitutive. The object persists. The relationship between communication and coordination inverts.
“What about Ulmer’s electracy?”
The same terminological move: a new -acy term after orality and literacy. Ulmer proposed electracy in 2003, and the debt is real. Five axes of differentiation:
- Scope: electracy covers a 300-year apparatus shift; algorithmacy targets the triadic structure specifically.
- Level: electracy is a civilizational condition; algorithmacy is a cognitive competency.
- Mechanism: electracy draws on Althusserian interpellation; algorithmacy specifies co-optation with empirical evidence.
- Orientation: electracy is grounded in aesthetics and affect; algorithmacy in opacity and recursive feedback.
- Testability: algorithmacy generates predictions measurable by instrument.
“What about Aneesh’s algocracy?”
Aneesh proposed in 2009 that algocratic governance coordinates through code rather than through rules or prices. Danaher extended it in 2016, identifying a legitimacy deficit in algocratic systems. Algocracy describes the governance structure of the triad from above. Algorithmacy names the cognitive competency the triad produces from below. The ATS is an algocratic system. Algorithmacy is what the candidate develops by navigating it.
“What about algorithmic governmentality?”
Rouvroy and Berns argued in 2013 that algorithmic governance bypasses the subject entirely, managing populations through data accumulation, automated knowledge production, and preemptive intervention without reference to individual meaning. That framework describes the conditions under which algorithmacy develops. It cannot name the cognitive response. And it cannot explain variance: why two candidates with identical qualifications produce different outcomes navigating the same ATS. Variance requires a concept of competency.
“What about the secondary orality objection?”
Logan argued in 2010 that digital media produce a secondary orality characterized by communal participation and formulaic expression. Mir pushed this further in 2023, arguing that platforms restore speech-like affective immediacy as the dominant communication mode. The surface resemblance is real. Algospeak parallels oral formulas. But in primary orality, formulaic structures emerged from memory constraints in a dyad. In algorithmacy, they emerge from detection avoidance in a triad. The recursive feedback loop, in which user behavior trains the system evaluating subsequent behavior, has no oral-culture parallel. Oral cultures did not rebuild their communicative environment through the act of communicating.
“What about folk theories of algorithms?”
Folk theorizing is algorithmacy's signature cognitive operation. DeVito documented in 2021 how users build theories arranged in a hierarchy from functional awareness to structural causal models, in continuous adaptive revision. That literature treats folk theorization as an object of study; algorithmacy positions the same process within a historical cognitive sequence, as the characteristic operation of a third cognitive mode, comparable to mnemonic techniques in oral cultures.
Butler scholars: “This is performativity.”
Both performativity and co-optation describe constitutive processes. The difference is in the conditions. Butler’s citational chains are socially diffuse. Nobody designed heteronormativity. Nobody owns gender’s grammar. Co-optation operates through engineered systems with proprietary grammars the platform can update without notification. The conditions of co-optation change the conditions of resistance.
“Where is agency?”
Agency in algorithmacy is reactive. The intermediary acts first; the subject responds. Cameron documented in 2022 how ridehailing drivers display sophisticated strategic agency, but that agency is constituted by the system it navigates. Co-optation produces capacity and capture simultaneously.
“Who is ‘the user’? This erases differential constitution.”
The triadic structure operates differently across social positions. ATS systems that penalize employment gaps affect caregivers disproportionately. Content moderation that flags Black vernacular restructures the conditions of participation before certain populations have even begun. Street argued in 1984 that what counts as literate practice is defined by the dominant group, and that acquisition is inseparable from social conditions. The framework requires continued specification along lines of race, gender, and coloniality.
Heideggerians: “Your account is Stieglerian, not Heideggerian.”
Grammatization is the mechanism: progressive discretization, tertiary retention as externalized memory. Standing-reserve is the deeper framing: a mode of revealing, Gestell as the regime under which everything appears as resource. Heidegger’s point was that technology does not merely use the Rhine; it reveals the river as energy supply. Whether that vocabulary accommodates engineered worlds is a question I raise rather than settle.
“What about Deleuze’s societies of control in more detail?”
Deleuze argued in 1992 that enclosed disciplinary institutions are being replaced by modulating mechanisms of continuous control. Individuals give way to dividuals, masses give way to samples, data, markets, or banks. The 1968 genealogy adds the institutional origin: the dividual is the necessary unit of exchange for the New Property. Algorithmacy adds the cognitive competency the dividual develops to navigate modulation. The Postscript described the condition. Algorithmacy names the response.
“Is algorithmacy testable?”
Yes. Users with greater algorithmacy should demonstrate abstract reasoning about algorithmic behavior, cross-platform transfer of coordination competence, and predictive accuracy about system responses. Dogruel, Masur, and Joeckel validated a psychometric scale for algorithmic literacy in 2022. My research program extends that approach to algorithmacy, measuring folk-theorizing capacity, qualculation, cross-platform transfer, and anticipatory self-quantification.
“What about LLMs specifically?”
The Jakesch latent persuasion finding is in the body of the talk. When a lawyer uses an LLM to draft a brief or a student writes a paper through one, each navigates a triadic structure. Nobody taught millions of people to prompt effectively. They learned through practice. Co-optation producing competency in real time. And increasingly, the ATS is an LLM. The hiring triad becomes more opaque, more adaptive, and more constitutive with every generation.
“What about the postmodernism claim? Isn’t that too strong?”
The claim is structural, not dismissive. Postmodernism performed a necessary demolition of the autonomous Enlightenment subject. But demolition is not construction. The “death of the author,” the “decentered subject,” the “end of grand narratives”: these describe what was lost when the dyadic Owner-Subject was dismantled. They do not describe what was built in its place. Coordination theory and platform studies provide the reconstruction: the triadic structure, the co-optation mechanism, and the cognitive competency that develops through navigating it. Törnberg’s 2023 analysis supports this periodization, theorizing a transition from postmodernity toward automated consumer culture in which algorithmic systems supersede postmodern conditions.
Quotations Referenced: Context and Full Citations
Every direct quotation used in the talk, with context and complete bibliographic reference.
- “restructure consciousness”
Ong’s central thesis in Orality and Literacy is that transitions between communication technologies do not merely add tools to existing cognitive repertoires. They reorganize the fundamental operations of thought. He documented nine specific characteristics of oral thought that writing displaced, including additive rather than subordinative reasoning and situational rather than abstract conceptualization.
Ong, W. J. (1982). Orality and Literacy: The Technologizing of the Word. Routledge. (p. 78)
- "the separation of the knower from the known"
Havelock argued that the Greek alphabet, by externalizing memory into a stable visual form, made possible a new cognitive relationship in which the thinker could stand apart from the thought. This separation was the prerequisite for Platonic philosophy and the analytic reasoning tradition that followed from it.
Havelock, E. A. (1963). Preface to Plato. Harvard University Press. (p. 215)
- "each of the two feels himself confronted only by the other, not by a collectivity above him"
Simmel’s analysis of the dyad emphasized its radical dependency on both members. The dyad has no supra-individual structure; it exists only in the mutual orientation of two individuals. This is what makes the transition to the triad qualitatively transformative rather than merely additive.
Simmel, G. (1950). The Sociology of Georg Simmel (K. H. Wolff, Ed. & Trans.). Free Press. (p. 122)
- "The purpose of the earliest tablets was not to record language"
Nissen, Damerow, and Englund’s study of the Uruk tablets demonstrated that proto-cuneiform was a system of numerical notation and administrative record-keeping, not a transcription of speech. Writing originated as an accounting technology, and the linguistic applications came centuries later.
Nissen, H. J., Damerow, P., & Englund, R. K. (1993). Archaic Bookkeeping: Early Writing and Techniques of Economic Administration in the Ancient Near East. University of Chicago Press. (p. 36)
- Writing served exclusively administrative functions for five hundred years
Schmandt-Besserat’s archaeological analysis traced the evolution from clay tokens to inscribed tablets, showing that the transition to writing was driven entirely by the demands of economic administration in increasingly complex Sumerian temple economies.
Schmandt-Besserat, D. (1996). How Writing Came About. University of Texas Press.
- "The institution called property guards the troubled boundary between individual man and the state"
Reich’s article, published in the Yale Law Journal, argued that twentieth-century wealth increasingly took the form of government largess: licenses, franchises, contracts, subsidies, and benefits. These forms of wealth were not property in the traditional sense but conditional privileges that could be revoked by administrative discretion, creating a dependency relationship between individual and state.
Reich, C. A. (1964). The new property. Yale Law Journal, 73(5), 733–787. (p. 733)
- "new feudalism"
Reich’s metaphor for the condition in which individuals depend on maintaining administrative eligibility rather than owning assets outright. The parallel to feudalism was that survival depended on maintaining a relationship with a more powerful entity whose terms one could not unilaterally set.
Reich, C. A. (1964). The new property. Yale Law Journal, 73(5), 733–787. (p. 769)

"each of them relates to a third quantity"
Simmel’s analysis of money in The Philosophy of Money identified exchange as a triadic rather than dyadic structure. Objects are not compared directly to each other but are each measured against money as a third term. This insight anticipated the triadic logic that the fiat transition made structurally explicit.
Simmel, G. (2004). The Philosophy of Money (D. Frisby, Ed.; T. Bottomore & D. Frisby, Trans.). Routledge. (Original work published 1907) (p. 121)

Money is "a social relation" rather than a commodity; treating it as a physical object is a "category error"
Ingham’s critique targeted both mainstream economics and commodity theories of money, arguing that money is constituted by social and institutional relationships, not by any intrinsic property of the medium. The category error consists in treating a social institution as if it were a natural object.
Ingham, G. (2004). The Nature of Money. Polity Press. (pp. 12, 70)

Automated systems "terminate Medicaid to cancer patients and deny food stamps"
Citron and Calo documented how automated decision-making systems in the administrative state replicate and accelerate the discretionary authority that Reich identified, now operating without meaningful human oversight or due process protections.
Citron, D. K., & Calo, R. (2021). The automated administrative state: A crisis of legitimacy. Emory Law Journal, 70(4), 797–845.

Platforms as "new utilities" controlling "the terms of access to vital services"
Rahman revived the Progressive Era concept of public utilities to argue that digital platforms exercise the kind of infrastructural power over economic life that historically triggered public utility regulation. His analysis connected platform power to the tradition of common carrier obligations.
Rahman, K. S. (2018). The new utilities: Private power, social infrastructure, and the revival of the public utility concept. Cardozo Law Review, 39, 1621–1689.

"the costs of organising an extra transaction within the firm become equal to the costs of carrying out the same transaction" through the price mechanism
Coase’s theory of the firm explained the existence of firms as a response to market transaction costs. The firm’s boundary sits where internal coordination costs equal the costs of using the price mechanism. This remains the foundational text for understanding why coordination takes different institutional forms.
Coase, R. H. (1937). The nature of the firm. Economica, 4(16), 386–405. (p. 395)

Networks as "a qualitatively distinct" governance logic
Powell argued against the prevailing view that networks were simply hybrids of markets and hierarchies. He identified a distinct logic of exchange rooted in reciprocity, trust, and relational obligation that could not be reduced to either price signals or administrative authority.
Powell, W. W. (1990). Neither market nor hierarchy: Network forms of organization. Research in Organizational Behavior, 12, 295–336.

"Whereas actors in hierarchies command, in markets they contract, and in networks collaborate, on platforms they are co-opted"
Stark and Vanden Broeck’s identification of co-optation as the fourth coordination mechanism is the theoretical foundation of my argument. They analyzed how platforms organize economic activity through a structurally distinct form that cannot be reduced to market, hierarchy, or network logics.
Stark, D., & Vanden Broeck, P. (2024). Principles of algorithmic management. Organization Theory, 5(2), 1–24.

Platforms generate coordination "without requiring traditional prerequisites"
Vallas and Schor reached the same structural conclusion as Stark and Vanden Broeck through independent analysis of the gig economy, arguing that platforms constitute a governance form distinct from the classical coordination mechanisms.
Vallas, S. P., & Schor, J. B. (2020). What do platforms do? Understanding the gig economy. Annual Review of Sociology, 46, 273–294.

Co-optation as absorption of opposition through enrollment
Selznick’s study of the TVA introduced co-optation into organizational theory. The TVA enrolled local elites into its governance structure, giving them formal advisory roles that simultaneously diffused their opposition and aligned their interests with the federal program.
Selznick, P. (1949). TVA and the Grass Roots: A Study of Politics and Organization. University of California Press.

Algorithms are "performative due to the extent to which their use can shape and alter work and organizational realities"
Faraj, Pachidi, and Sayegh identified the recursive relationship between algorithmic systems and the organizational environments they operate within, arguing that algorithms do not merely measure or describe work but actively reconstitute it through use.
Faraj, S., Pachidi, S., & Sayegh, K. (2018). Working and organizing in the age of the learning algorithm. Information and Organization, 28(1), 62–70. (p. 64)

62.5% of Facebook users "were not aware of the News Feed curation algorithm's existence at all"
Eslami and colleagues conducted a controlled study revealing that the majority of Facebook users did not know their News Feed was algorithmically curated. When informed, users expressed surprise and revised their interpretations of their social relationships, having attributed the visibility or invisibility of friends’ posts to the friends’ behavior rather than to algorithmic selection.
Eslami, M., Rickman, A., Vaccaro, K., Aleyasen, A., Vuong, A., Karahalios, K., Hamilton, K., & Sandvig, C. (2015). "I always assumed that I wasn't really that close to [her]": Reasoning about invisible algorithms in news feeds. CHI 2015, 153–162.

Ridehailing drivers distinguishing "relational" from "efficiency" strategies
Cameron’s ethnography documented how ridehailing drivers develop sophisticated strategic orientations through platform participation, with some drivers prioritizing relationship-building with passengers and others optimizing purely for algorithmic efficiency metrics.
Cameron, L. (2022). "Making out" while driving: Relational and efficiency games in the gig economy. Organization Science, 33(1), 231–252.

"Qualculation"
Shapiro used the term to describe a reasoning style among gig economy workers that blends affective intuition with strategic calculation. It is distinct from the rational-actor model that platform design assumes and represents a genuinely novel cognitive orientation produced through platform participation.
Shapiro, A. (2018). Between autonomy and control: Strategies of arbitrage in the "on-demand" economy. New Media & Society, 20(8), 2954–2971.

Folk theories arranged from functional awareness to structural causal models, in "continuous adaptive revision"
DeVito conducted a seven-week qualitative study of LGBTQ+ platform users, documenting how they developed increasingly sophisticated working models of algorithmic behavior through iterative cycles of observation, hypothesis formation, testing, and revision.
DeVito, M. A. (2021). Adaptive folk theorization as a path to algorithmic literacy. Proceedings of the ACM on Human-Computer Interaction, 5(CSCW2), Article 339, 1–35.

"Enclosures are molds, distinct castings, but controls are a modulation"
Deleuze’s Postscript argued that the disciplinary society Foucault described, organized around institutions of enclosure (school, factory, prison), was giving way to a society of continuous control operating through modulation rather than confinement. The essay has become a foundational reference for theorizing digital governance.
Deleuze, G. (1992). Postscript on the societies of control. October, 59, 3–7. (p. 4)

The "dividual": masses decomposed into "samples, data, markets, or 'banks'"
Deleuze’s concept of the dividual described the decomposition of the individual into data points that can be recombined, sorted, and managed independently. The term anticipated by decades the data-driven subject-formation that platform studies now document empirically.
Deleuze, G. (1992). Postscript on the societies of control. October, 59, 3–7. (p. 5)

Grammatization as discretization of experience into elements "stored, reproduced, and manipulated by technical systems"; digital technology as "the last stage of the grammatization process, which started with writing"
Stiegler extended Derrida’s concept of grammê into a historical theory of technical inscription. Grammatization names the process by which continuous human experience is progressively broken into discrete, reproducible units by technical systems, from alphabetic writing through industrial machinery to digital code.
Stiegler, B. (2010). For a New Critique of Political Economy. Polity Press. (pp. 34, 36)

Algorithms construct identity as "measurable types," "defining the meaning of categories while producing them"
Cheney-Lippold’s analysis showed how algorithmic classification systems do not discover pre-existing identity categories but constitute them through statistical operations. The algorithmic definition of “male” or “Hispanic” or “creditworthy” is produced by the classification process itself.
Cheney-Lippold, J. (2011). A new algorithmic identity: Soft biopolitics and the modulation of control. Theory, Culture & Society, 28(6), 164–181.

Dividuals as "the decomposition of individuals into data clouds subject to automated integration and disintegration"
Cheney-Lippold extended his 2011 framework into a book-length analysis of how algorithmic systems constitute identity through continuous data processing, arguing that the concept of the autonomous individual becomes incoherent under conditions of pervasive algorithmic classification.
Cheney-Lippold, J. (2017). We Are Data: Algorithms and the Making of Our Digital Selves. NYU Press. (p. 5)

The Rhine "set upon to yield energy that can be extracted and stored"
Heidegger’s Rhine example illustrates standing-reserve: under the regime of modern technology, the river is no longer encountered as a river but as a resource for hydroelectric power. The example demonstrates how Gestell, enframing, transforms the mode of revealing such that everything appears as available for optimization.
Heidegger, M. (1977). The question concerning technology. In The Question Concerning Technology and Other Essays (W. Lovitt, Trans.). Harper & Row. (Original work published 1954) (p. 16)

"latent persuasion by language models"
Jakesch and colleagues ran a randomized controlled experiment in which participants co-wrote short essays with AI assistants that had been given opinionated system prompts. Users exposed to opinionated models shifted both their writing and their subsequently reported opinions, demonstrating that triadic interaction with an algorithmic intermediary can reshape cognitive orientation through the practice of use.
Jakesch, M., Bhat, A., Buschek, D., Zalmanson, L., & Naaman, M. (2023). Co-writing with opinionated language models affects users' views. CHI 2023, 1–15.

Competencies "embedded in power relations"; literacy defined by dominant group, acquisition inseparable from social conditions
Street’s ideological model of literacy opposed the autonomous model, which treats literacy as a neutral, transferable, context-independent skill. Street demonstrated through ethnographic work in Iran that what counts as literacy is always determined by power relations, and that acquisition cannot be separated from the institutional and social conditions in which it occurs.
Street, B. V. (1984). Literacy in Theory and Practice. Cambridge University Press.

AAVE flagged as "offensive" at 1.5 to 2 times higher rates
Sap and colleagues demonstrated systematic racial bias in automated hate speech detection systems, finding that tweets written in African American Vernacular English were significantly more likely to be classified as offensive or hateful, even when the content was benign. The finding illustrates how algorithmic systems impose differential conditions on the development of algorithmacy across racial lines.
Sap, M., Card, D., Gabriel, S., Choi, Y., & Smith, N. A. (2019). The risk of racial bias in hate speech detection. Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics (ACL 2019), 1668–1678.

Algorithmic competency depends on social support networks and cognitive job crafting
Zhou and colleagues studied on-demand labor platform workers, finding that algorithmic competency develops not through individual effort alone but through access to peer networks and the practice of actively redefining one’s relationship to the algorithmic system. The uneven distribution of these resources means competency development is stratified by social position.
Zhou, R., et al. (2025). Algorithmic competency of on-demand labor platform workers. Asia Pacific Journal of Human Resources.

"Success Stories," "Strugglers," and "Strivers"
Ravenelle’s ethnography of gig workers in New York City identified three typologies that emerged through differential platform participation rather than pre-existing personality traits. Success Stories brought existing social and economic capital; Strugglers lacked alternatives and were progressively absorbed by platform logic; Strivers maintained dual participation in platform and traditional economies.
Ravenelle, A. J. (2019). Hustle and Gig: Struggling and Surviving in the Sharing Economy. University of California Press.

Algorithmic management as "a new contested terrain of control"
Kellogg, Valentine, and Christin reviewed the organizational literature on algorithmic management and identified it as a fundamentally new form of workplace control that does not merely direct existing workers but creates new occupational categories, skill definitions, and behavioral patterns that did not exist prior to the platform.
Kellogg, K. C., Valentine, M. A., & Christin, A. (2020). Algorithms at work: The new contested terrain of control. Academy of Management Annals, 14(1), 366–410.

Transition "from postmodernity toward automated consumer culture"
Törnberg theorized that algorithmic platforms are not extensions of postmodern conditions but represent a qualitatively new phase in which automated systems supersede the fragmentation and irony characteristic of postmodernity, producing instead a regime of behavioral prediction and modulation.
Törnberg, P. (2023). How platforms govern: Social regulation in digital capitalism. Big Data & Society, 10(1).

Higher algorithmic knowledge predicted less effective navigation
Chung’s study of young adults found a paradoxical relationship between algorithmic knowledge and corrective behavior. Users who scored higher on measures of algorithmic awareness were less, not more, likely to take action to correct misinformation. The finding undermines the knowledge-deficit model underlying algorithmic literacy approaches.
Chung, M. (2025). When knowing more means doing less: Algorithmic knowledge and digital (dis)engagement among young adults. Harvard Kennedy School Misinformation Review.

Algorithmic governmentality as "data accumulation, automated knowledge production, and preemptive intervention"
Rouvroy and Berns argued that algorithmic governance represents a new form of power that bypasses individual subjectivity entirely, operating through statistical patterns in aggregate data rather than through the disciplining or interpellation of individual subjects.
Rouvroy, A., & Berns, T. (2013). Algorithmic governmentality and prospects of emancipation. Réseaux, 177(1), 163–196.

Algocratic governance as coordination through code
Aneesh proposed algocracy as a third mode of governance alongside bureaucracy and market governance, in which coordination is achieved through the constraints embedded in programming code rather than through legal-rational rules or price signals.
Aneesh, A. (2009). Global labor: Algocratic modes of organization. Sociological Theory, 27(4), 347–370.

Algocratic legitimacy deficit
Danaher extended Aneesh’s framework to identify a democratic legitimacy problem: algocratic systems exercise authority over individuals without satisfying standard conditions of democratic legitimacy such as transparency, accountability, or consent.
Danaher, J. (2016). The threat of algocracy: Reality, resistance and accommodation. Philosophy & Technology, 29(3), 245–268.

Electracy as civilizational apparatus
Ulmer proposed electracy as the third term in a civilizational sequence: orality, literacy, electracy. He mapped each across institutional, epistemological, and aesthetic dimensions. Electracy is grounded in entertainment and aesthetics rather than in religion (orality) or science (literacy).
Ulmer, G. L. (2003). Internet Invention: From Literacy to Electracy. Longman.

"Secondary orality" and digital media
Logan extended McLuhan’s and Ong’s frameworks to argue that digital media are producing a return of oral-culture characteristics: communal participation, simultaneity, formulaic expression, and affective immediacy.
Logan, R. K. (2010). Understanding New Media: Extending Marshall McLuhan. Peter Lang.

"Digital orality" as the return of speech
Mir argued that digital platforms are restoring speech-like patterns as the dominant mode of communication, reversing the literate era’s privileging of written, analytical, individual discourse.
Mir, A. (2023). Digital orality: The return of speech. Substack.

Psychometric scale for algorithmic literacy
Dogruel, Masur, and Joeckel developed and validated a 22-item scale measuring algorithmic literacy as awareness and knowledge of algorithms. The scale provides the methodological precedent for the psychometric instrument my research program is developing for algorithmacy.
Dogruel, L., Masur, P., & Joeckel, S. (2022). Development and validation of an algorithm literacy scale. Communication Methods and Measures, 16(2), 115–133.
Roger Hunt