Data Centers, Public Legitimacy, and the Organizational Cost of Infrastructural Opacity
A quiet crisis is building at the physical foundation of the AI economy. Industry reporting this week highlights that data center executives are increasingly alarmed by deteriorating public sentiment toward their operations. Communities are organizing against new developments, politicians are responding with regulatory pressure, and analysts estimate that the reputational damage now threatens trillions of dollars in planned infrastructure investment (Data center executives fret, 2025). What makes this moment theoretically interesting is not the environmental or land-use conflict itself, but what the conflict reveals about a specific organizational failure: the systematic inability of large technical institutions to communicate the structural logic of their operations to non-expert publics.
This is not simply a public relations problem. It is a coordination failure with a well-documented cognitive structure.
The Asymmetry Between Technical Complexity and Public Schema
Data center operators built their legitimacy on a tacit assumption: that the economic benefits of cloud infrastructure would be sufficiently visible to communities hosting these facilities that explicit justification would be unnecessary. That assumption has collapsed. What data center executives are now discovering is that communities do not hold accurate structural schemas of how digital infrastructure relates to local economic outcomes, energy grids, or water systems. They hold folk theories, meaning individually constructed impressions assembled from fragmentary media exposure and direct local experience of noise, heat, and water consumption (Gagrain, Naab, & Grub, 2024).
The distinction matters enormously. A folk theory about a data center might register the heat output and electricity draw without connecting these costs to the distributed economic activity the facility enables. A structural schema would represent the relationship between those inputs and the outputs they sustain. The public's current model is topographic: it maps visible surface features. It is not topological: it does not represent the underlying constraint relationships. The result is that communities are making reasonable inferences from an incomplete map, and those inferences are increasingly adversarial.
Why Better Messaging Will Not Solve This
The industry's instinct will be to respond with communications campaigns. Executives will hire sustainability consultants, publish impact reports, and arrange community information sessions. There is substantial reason to believe this approach will underperform. Hancock, Naaman, and Levy (2020) distinguish between awareness of a system's operation and the cognitive capacity to reason accurately about its outputs. Awareness campaigns increase the former without reliably producing the latter. Telling a community that a data center "powers the digital economy" raises awareness of a relationship while providing no schema for evaluating its local costs and benefits.
This distinction maps directly onto what Hatano and Inagaki (1986) call the difference between routine and adaptive expertise. Routine expertise supports performance in stable, anticipated contexts. Adaptive expertise supports reasoning in novel or ambiguous situations. Community members encountering data center development for the first time need adaptive expertise about infrastructural tradeoffs. What they receive, typically, are procedural facts: jobs created, taxes paid, safety standards met. These facts do not induce schemas. They produce, at best, temporary reassurance that decays as visible costs (grid strain, water use, noise) persist in everyday experience.
The Organizational Theory Connection
There is a deeper organizational problem here that the communications framing obscures. The data center industry developed in a period when infrastructural expansion was largely invisible to the populations it served. Hyperscale construction happened in remote areas, in industrial zones, under minimal public scrutiny. The operational logic of the industry was never stress-tested against a requirement for public legibility because no such requirement existed. The industry's internal competence is therefore genuinely high while its external communicative competence is structurally underdeveloped.
This is what Rahman (2021) identifies as the invisible cage dynamic in a different register: the constraining logic of a system becomes consequential precisely at the moment when those outside it must navigate its effects without access to its internal structure. Communities are now inside the effects of data center expansion without being inside the explanatory framework that makes those effects legible. The gap between those two positions is where the political opposition is organizing.
What an Accurate Diagnosis Implies
If the problem is a schema deficit rather than an information deficit, the intervention class changes substantially. The industry needs mechanisms for genuine schema induction in community contexts, meaning structured engagement that allows non-expert stakeholders to build accurate relational models of how these facilities operate and what tradeoffs they involve (Gentner, 1983). This is harder than issuing fact sheets. It requires organizational investment in external-facing communication that most technically oriented firms have never built and do not know how to evaluate.
The data center legitimacy crisis is, in this reading, a case study in what happens when an industry's internal coordination competence and its external coordination competence develop at radically different rates. The industry knows how to build at hyperscale. It does not yet know how to explain at human scale. That gap is now a material business risk.
References
Data center executives fret over the industry's increasingly toxic public image. (2025). Retrieved from current news sources.
Gagrain, A., Naab, T., & Grub, J. (2024). Algorithmic media use and algorithm literacy. New Media & Society.
Gentner, D. (1983). Structure-mapping: A theoretical framework for analogy. Cognitive Science, 7(2), 155-170.
Hancock, J. T., Naaman, M., & Levy, K. (2020). AI-mediated communication: Definition, research agenda, and ethical considerations. Journal of Computer-Mediated Communication, 25(1), 89-100.
Hatano, G., & Inagaki, K. (1986). Two courses of expertise. In H. Stevenson, H. Azuma, & K. Hakuta (Eds.), Child development and education in Japan (pp. 262-272). Freeman.
Rahman, H. A. (2021). The invisible cage: Workers' reactivity to opaque algorithmic evaluations. Administrative Science Quarterly, 66(4), 945-988.
Roger Hunt