We're in the Matrix, But Not THAT Matrix
A Pop Culture Association 2026 talk arguing that Cypher, not Neo, best captures contemporary algorithmic coordination — from four futures to live triads to the emergent competency of algorithmacy.
I. Four Futures
The Matrix gives us (at least) four futures.
Life inside the simulation: brains in vats, false consciousness, Plato in sunglasses.
Humans as batteries: machine domination, thermodynamic complaints.
Humans as resistance fighters navigating a system they caused but did not build and cannot escape.
And whatever is happening in the fourth movie, which I have started three times and still not finished.
The resistance fighter version is the one that matters here, because it is there that the problem of mediation becomes visible.
Neo does not represent us. Cypher does. He sits at an operator console reading raw code drizzling down the screen. Neo asks if he always looks at it encoded. Cypher explains: “The image translators sort of work for the construct programs but there’s way too much information to decode the Matrix. You get used to it, though. Your brain does the translating. I don’t even see the code. All I see is blonde, brunette, redhead.”
Nobody taught Cypher to read the code. No one offered a course. Nine years inside the system, and his brain learned to translate because the system required translation. He pours Neo a drink and confesses: “I know what you’re thinking, ‘cause right now I’m thinking the same thing. Actually, I’ve been thinking it ever since I got here: Why oh why didn’t I take the blue pill?”
Neo escapes. Neo transcends. Neo becomes the One.
Nobody becomes the One driving for Uber.
Cypher is the figure for a world where the intermediary is not going away. He reads the system. The system reads him. He learned to navigate it by living inside it long enough that its logic became second nature.
Not simulation. Not domination. Not literacy. Triadic mediation, or what I’ll call “algorithmacy.”
II. The Argument
If triadic structures do not reduce to dyadic structures, then dyadic framings are insufficient wherever triadic structures govern coordination.
I will make the case for P (that triadic structures do not reduce to dyadic ones),
show you some live triads,
then talk about where this reality heads: just as we moved from oracy to literacy, we are now moving from literacy to algorithmacy.
I’ll briefly explore how Peirce, Royce, and Simmel thought about triads.
Then some live triads: Uber, job applications, cartels.
And finally, ground this in my dissertation, which builds the construct: algorithmacy.
III. The Case for P
Current attempts to understand interaction with AI are fundamentally dyadic: AI Literacy, Digital Literacy, Algorithmic Literacy. Each assumes a user facing a system. Two parties. The question is whether that geometry is sufficient.
Can the third be reduced to a relation between two terms, or does it introduce a distinct form?
Peirce showed that meaning is irreducibly triadic. Semiosis requires sign, object, and interpretant. Remove any one and the process that produces meaning collapses. This is not a philosophical preference. It is a structural claim: you cannot reconstruct the interpretive process from pairs. A sign related to an object without an interpretant is not a sign. An interpretant without a sign is not an interpretation. The triad is the minimum unit that produces meaning at all.
Simmel made the same point sociologically. The move from two to three is not arithmetic. It is a change in kind. Two people can only agree or disagree. Add a third and entirely new structural possibilities appear: mediation, coalition, opacity, leverage, the casting vote, the secret, the broker. None of these exist in the dyad. None can be derived from it. The triad is not a more complicated dyad. It is a different social form.
Royce extended the insight into community. Interpretation, for Royce, is not something one mind does alone. It requires a mediating third: someone or something that stands between two parties and makes their coordination possible. Community itself depends on interpretation, which means community is logically triadic. No dyad of individuals produces community without a mediating structure that connects them.
The opening scene of The Matrix puts all three thinkers on screen at once. Trinity and Cypher are on a phone call. “Is everything in place?” Trinity asks. Cypher taunts her: “You like him, don’t you? You like watching him.” Then: “We’re gonna kill him. You understand that?” Neo, still jacked in, has no idea he is being discussed. Trinity reads the code as surveillance. The Agents read the trace as a target. Neo experiences ordinary life. Three parties, and no pair has full visibility into what the other two are doing. That is Peirce’s irreducible triad. That is Simmel’s change in kind. That is Royce’s community making coordination possible while remaining opaque to the parties it connects.
Communication theory has produced a series of frameworks trying to describe this situation. Each one preserves the dyad rather than advancing the triad.
Computer-Mediated Communication treats technology as a channel between two humans.
Human-Machine Communication makes the machine an interlocutor rather than a channel.
Human-Computer Interaction studies how people use and are affected by computing technology, but frames the relationship as one user working with one system.
Hybrid Cognitive Alignment, the most recent entry from the management literature, theorizes mutual adaptation between one human and one AI during task collaboration (Lu & Yan, 2026).
Each framework captures something. None accommodates an intermediary with its own optimization objectives coordinating multiple parties whose interests diverge.
So what do triads look like?
IV. Live Triads
When you request a ride, an algorithm matches you with a driver, sets the price, routes the vehicle, and evaluates the outcome. Rider and driver deal through a third that shapes who gets matched, at what price, and who bears the risk. Experienced drivers learn to read the system: when to accept, where to position, what the algorithm rewards. Identical algorithms produce non-identical outcomes depending on who navigates them. There is, by the way, a really cool public rideshare dataset out of Chicago.
When you apply for a job, your resume passes through an applicant tracking system before a human sees it. The system parses, scores, ranks, and filters. The hiring manager sees the output, not the input. The candidate addresses the system in hopes of reaching the human. Two parties, an active intermediary, and neither side fully sees what the intermediary does.
Oswaldo Zavala made the same structural point about drug cartels. The standard narrative of the Mexican drug war is dyadic: cartels versus the state. Zavala argues the geometry is wrong. What organizes the drug trade is a triadic structure: politicians, military and security forces, dealers, assassins, drug lords, families, economic interests. The “cartel” is a label that obscures the mediating third. When you frame a triad as a dyad, you target a geometry that does not exist.
Cypher figured this out. When he sits across from Agent Smith at the restaurant, he is not opposing the system. He is negotiating with it. He cuts into a steak and says: “I know this steak doesn’t exist. I know that when I put it in my mouth, the Matrix is telling my brain that it is juicy and delicious. After nine years, you know what I realize? Ignorance is bliss.”
The standard reading is moral: Cypher is the traitor. The structural reading is more interesting. He has full knowledge of the mediation. He is not deceived. He is co-opted. The system trained him to prefer its outputs. Now he negotiates the terms of his own re-enrollment directly with the intermediary. Co-optation, recently formalized by David Stark as a fourth coordination mechanism alongside hierarchy, markets, and networks, creates the conditions of possibility for this transition.
The moral framing, traitor versus loyalist, is the dyadic narrative crowding out our view of the triadic structure.
V. The Transition
How oracy developed into literacy is well documented, and serves as our model for the transition from literacy to algorithmacy. Oral cultures coordinated through speech, memory, and presence. Literate cultures coordinated through written records, codified law, and institutional memory. That transition did not add a skill. It reorganized what kinds of institutions could exist.
Socrates argued in the Phaedrus that writing would destroy memory and produce the mere appearance of wisdom. He was not entirely wrong. But what replaced oral memory was not a degradation. It was a different form of coordination: bureaucracy, contract law, narrative that persists across generations.
Literacy, however, began much earlier, as accounting. Around 3200 BCE, Sumerians were tallying grain. It wasn’t until Gilgamesh, around 2100 BCE, that we got written narrative. And until the printing press made reading a requirement for navigating the world, literacy was limited to priests and specialists.
The transition from oracy to literacy offers a model, not an equivalence, for understanding how literate coordination transitions to algorithmically mediated coordination, or algorithmacy.
These are distinct competencies in distinct registers.
Oracy works in a physical register: I hand you the thing.
Literacy works in a symbolic register: I write you the note.
Algorithmacy works in a constitutive register: the communicative act helps bring the transaction into existence.
The intermediary transforms communication between input and output. That transformation is co-optation. The human competency, if literacy and oracy are competencies, is algorithmacy.
The institutional conditions for that third register were emerging well before AI. In 1964, Charles Reich argued that government largess had become a new form of property, with the state standing between citizens and their livelihoods as an active intermediary. In 1971, Nixon closed the gold window and the last dyadic anchor in the global monetary system collapsed. Fiat replaced it: value constituted entirely by institutional intermediaries with their own logic and opacity.
AI is not where algorithmacy begins. AI is where algorithmacy becomes impossible to ignore, as the printing press made literacy impossible to ignore.
When Cypher seizes the operator console and pulls plugs, Apoc and Switch die. Trinity screams at him. Cypher responds: “Don’t hate me, Trinity… I’m just the messenger.” It is a precise lie. A messenger is a channel: a dyadic figure who carries content without altering it. Cypher is the opposite. He is an agent exploiting a triadic structure. His claim to be “just the messenger” is the dyadic narrative applied to a triadic reality.
Trinity tells him the Matrix is not real. Cypher replies: “I disagree, Trinity. I think that the Matrix can be more real than this world.” He is not confused. He is stating the structural claim of this entire talk. The constitutive register produces experiences that are, for the participant, indistinguishable from unmediated ones. And once coordination moves into the intermediary, there is no safe unmediated fallback.
And when Tank rises wounded and kills Cypher, the film delivers the final structural point. Maximum individual fluency is not enough. Triadic structures generate coalition possibilities that no single participant can fully control. Cypher read the system better than anyone on that ship. He still lost.
If individual fluency is inherently limited, then the question becomes: can shared fluency produce coordination outcomes that individual fluency cannot?
Can shared algorithmic coordination among strangers match or exceed that achieved through direct communication?
Oracy. Literacy. Algorithmacy.
The task is not to unplug from the Matrix. The task is to understand the third: what Cypher learned, how triadic coordination teaches it, and who gets to learn it on unequal terms.
There is no spoon. But there is a third. And it changes everything.
Thank you.
Anticipated Questions
“How is algorithmacy different from algorithmic literacy?”
Algorithmic literacy assumes the autonomous model: teach people how algorithms work and they will navigate them better. Algorithmacy insists that the competency is always embedded in the power relations of the triadic structure producing it. You cannot separate the skill from the structure.
“What about Ulmer’s electracy?”
The debt is real. Ulmer proposed electracy in 2003 as a third apparatus addressing a broader civilizational shift across electronic media. Algorithmacy targets the specific competency produced by triadic coordination in a world where the intermediary learns from you while you learn to navigate it.
“Is this just prompt engineering?”
Reducing algorithmacy to prompt engineering is like reducing literacy to penmanship. Algorithmacy transfers across platforms, interfaces, and interaction types.
“Isn’t this technological determinism?”
The triadic structure predates digital technology. Selznick documented co-optation in 1949. Simmel theorized the triad in 1908. The Sumerians had scribes. Digital technology industrialized the structure. It did not invent it.
“Where is agency?”
Reactive. The intermediary acts first; the subject responds. Co-optation produces capacity and capture simultaneously. Agency exists. It is not autonomous.
“Is algorithmacy testable?”
Yes. My dissertation’s research question: can shared literacy in an algorithmically mediated communication system produce coordination among strangers that matches or exceeds coordination achieved through direct communication?
“What do you make of Zavala’s own framing? He doesn’t use Peirce or Simmel.”
Correct. Zavala frames it as ideology critique. My point is that the structural insight is deeper than the framing. What he describes is an irreducible triad being systematically misread as a dyad.
Full Citations
- Cameron, L. D. (2022). “Making out” while driving. Organization Science, 33(1), 231–252.
- Cameron, L. D. (2024). The making of the “good bad” job. Administrative Science Quarterly, 69(2).
- DeVito, M. A. (2021). Adaptive folk theorization. Proceedings of the ACM on Human-Computer Interaction, 5(CSCW2), Art. 339.
- Eslami, M., et al. (2015). Reasoning about invisible algorithms. CHI 2015, 153–162.
- Goody, J. (1986). The Logic of Writing and the Organization of Society. Cambridge University Press.
- Guzman, A. L., & Lewis, S. C. (2020). AI and communication. New Media & Society, 22(1), 70–86.
- Ong, W. J. (1982). Orality and Literacy. Routledge.
- Peirce, C. S. (1931–1958). Collected Papers of Charles Sanders Peirce (C. Hartshorne, P. Weiss, & A. Burks, Eds.). Harvard University Press.
- Plato. (c. 370 BCE). Phaedrus (A. Nehamas & P. Woodruff, Trans.). Hackett Publishing. (Translation published 1995)
- Reich, C. A. (1964). The new property. Yale Law Journal, 73(5), 733–787.
- Royce, J. (2001). The Problem of Christianity (F. M. Oppenheim, Introduction). Catholic University of America Press. (Original work published 1913)
- Ruggie, J. G. (1982). International regimes, transactions, and change: Embedded liberalism in the postwar economic order. International Organization, 36(2), 379–415.
- Schmandt-Besserat, D. (1992). Before Writing: Vol. I. From Counting to Cuneiform. University of Texas Press.
- Selznick, P. (1949). TVA and the Grass Roots. University of California Press.
- Short, T. L. (2007). Peirce’s Theory of Signs. Cambridge University Press.
- Simmel, G. (1950). The Sociology of Georg Simmel (K. H. Wolff, Trans. & Ed.). Free Press. (Original work published 1908)
- Stark, D., & Vanden Broeck, P. (2024). Principles of algorithmic management. Organization Theory, 5(2), 1–24.
- Walther, J. B. (1996). Computer-mediated communication. Communication Research, 23(1), 3–43.
- Zavala, O. (2022). Drug Cartels Do Not Exist: Narcotrafficking in US and Mexican Culture (W. Savinar, Trans.). Vanderbilt University Press.
Roger Hunt